Creates a Pathling context with the given configuration options.

pathling_connect(
  spark = NULL,
  max_nesting_level = 3,
  enable_extensions = FALSE,
  enabled_open_types = c("boolean", "code", "date", "dateTime", "decimal", "integer",
    "string", "Coding", "CodeableConcept", "Address", "Identifier", "Reference"),
  enable_terminology = TRUE,
  terminology_server_url = "https://tx.ontoserver.csiro.au/fhir",
  terminology_verbose_request_logging = FALSE,
  terminology_socket_timeout = 60000,
  max_connections_total = 32,
  max_connections_per_route = 16,
  terminology_retry_enabled = TRUE,
  terminology_retry_count = 2,
  enable_cache = TRUE,
  cache_max_entries = 2e+05,
  cache_storage_type = StorageType$MEMORY,
  cache_storage_path = NULL,
  cache_default_expiry = 600,
  cache_override_expiry = NULL,
  token_endpoint = NULL,
  enable_auth = FALSE,
  client_id = NULL,
  client_secret = NULL,
  scope = NULL,
  token_expiry_tolerance = 120,
  accept_language = NULL
)

Arguments

spark

A pre-configured SparkSession instance. Use this if you need to control the way that the session is set up.

max_nesting_level

Controls the maximum depth of nested element data that is encoded upon import. This affects certain elements within FHIR resources that contain recursive references, e.g., QuestionnaireResponse.item.

enable_extensions

Enables support for FHIR extensions.

enabled_open_types

The list of types that are encoded within open types, such as extensions.

enable_terminology

Enables the use of terminology functions.

terminology_server_url

The endpoint of a FHIR terminology service (R4) that the server can use to resolve terminology queries.

terminology_verbose_request_logging

Setting this option to TRUE will enable additional logging of the details of requests to the terminology service.

terminology_socket_timeout

The maximum period (in milliseconds) to wait for incoming data from the terminology service.

max_connections_total

The maximum total number of connections for the client.

max_connections_per_route

The maximum number of connections per route for the client.

terminology_retry_enabled

Controls whether terminology requests that fail for possibly transient reasons should be retried.

terminology_retry_count

The number of times to retry failed terminology requests.

enable_cache

Set this to FALSE to disable caching of terminology requests.

cache_max_entries

The maximum number of entries that will be held in the cache.

cache_storage_type

The type of storage to use for the terminology cache.

cache_storage_path

The path on disk to use for the cache, when cache_storage_type is StorageType$DISK.

cache_default_expiry

The default expiry time for cache entries (in seconds).

cache_override_expiry

If provided, this value (in seconds) overrides the expiry time provided by the terminology server.

token_endpoint

An OAuth2 token endpoint for use with the client credentials grant.

enable_auth

Enables authentication of requests to the terminology server.

client_id

A client ID for use with the client credentials grant.

client_secret

A client secret for use with the client credentials grant.

scope

A scope value for use with the client credentials grant.

token_expiry_tolerance

The minimum number of seconds of validity that a token must have remaining for it to be sent with a terminology request.

accept_language

The default value of the Accept-Language HTTP header passed to the terminology server.

Value

A Pathling context instance initialized with the specified configuration.

Details

If no SparkSession is provided, any session already running within the current process will be reused; otherwise, a new one will be created.

It is assumed that the Pathling library API JAR is already on the classpath. If you are running your own cluster, make sure it is on the list of packages, as in the sketch below.
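
A minimal sketch with sparklyr, assuming the Pathling runtime is published under the Maven coordinates au.csiro.pathling:library-runtime (the version shown is illustrative; check the Pathling documentation for the one matching your installed package version):

# Add the Pathling runtime to the Spark session's package list.
config <- sparklyr::spark_config()
config$sparklyr.shell.packages <- "au.csiro.pathling:library-runtime:7.0.0"
sc <- sparklyr::spark_connect(master = "local", config = config)
pc <- pathling_connect(spark = sc)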

See also

Other context lifecycle functions: pathling_disconnect(), pathling_disconnect_all(), pathling_spark()

Examples

# Create PathlingContext for an existing Spark connection.
sc <- sparklyr::spark_connect(master = "local")
pc <- pathling_connect(spark = sc)
pathling_disconnect(pc)
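
# Create PathlingContext with a custom terminology server.
# Illustrative only: the URL is a placeholder for your own endpoint, and the
# timeout and retry values are examples, not recommendations.
pc <- pathling_connect(
  terminology_server_url = "https://your-terminology-server/fhir",
  terminology_socket_timeout = 30000,
  terminology_retry_count = 3
)
pathling_disconnect(pc)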

# Create PathlingContext with a new Spark connection.
pc <- pathling_connect()
spark <- pathling_spark(pc)
pathling_disconnect_all()
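
# Illustrative sketch: persist the terminology cache to disk and authenticate
# to the terminology server using the client credentials grant.
# Assumes StorageType$DISK is available alongside StorageType$MEMORY; the
# endpoint, client ID, client secret, and scope are placeholders.
pc <- pathling_connect(
  cache_storage_type = StorageType$DISK,
  cache_storage_path = "/tmp/pathling-cache",
  enable_auth = TRUE,
  token_endpoint = "https://auth.example.com/oauth2/token",
  client_id = "my-client-id",
  client_secret = "my-client-secret",
  scope = "system/*.read"
)
pathling_disconnect(pc)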