Configure a simulator connector

Beta

The features described in this section are currently in beta testing and are subject to change.

Simulator connectors can be downloaded from Data management > Integrate > Extractors in Cognite Data Fusion (CDF). Then search for a connector by name, for example, PROSPER.

To configure a simulator connector, you must edit the configuration file. The file is in YAML format, and the sample configuration file contains all valid options with default values. You can exclude fields entirely to let the connector use default values. The configuration file separates settings by component, and you can remove an entire component to disable it or use the default values.

You'll find configuration file examples listed under the config folder in the directory where the connector is installed.

Tip

Don't base your configuration on the config.example.yml file. Instead, use config.minimal.yml as your base and copy the parts you need from config.example.yml.

Using values from environment variables

In the configuration file, values wrapped in ${} are replaced with environment variables with that name. For example, ${COGNITE_PROJECT} will be replaced with the value of the environment variable called COGNITE_PROJECT.

The configuration file also contains the global parameter version, which holds the version of the configuration schema used in the configuration file. This document describes version 1 of the configuration schema.
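For illustration, a minimal sketch combining the schema version with environment-variable substitution; the environment variable names other than COGNITE_PROJECT are hypothetical:

```yaml
# Version of the configuration schema this file follows.
version: 1

cognite:
  # Replaced with the value of the COGNITE_PROJECT environment variable.
  project: ${COGNITE_PROJECT}
  idp-authentication:
    client-id: ${COGNITE_CLIENT_ID}   # hypothetical variable name
    secret: ${COGNITE_CLIENT_SECRET}  # hypothetical variable name
```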

Tip

You can set up extraction pipelines to use versioned connector configuration files stored in the cloud.

Connector

This section contains information on how the current connector appears in Simulators.

| Parameter | Description |
| --- | --- |
| `connector.name-prefix` | Enter the connector name that is registered on the API. |
| `connector.add-machine-name-suffix` | Set to `true` to have the connector name appear as `name@vm-host-name`. |
| `connector.api-logger.enabled` | Set to `true` to push logs to CDF. The default value is `false`. |
| `connector.api-logger.level` | Enter the minimum level of logs to push to CDF. Valid values are `Information`, `Debug`, `Error`, and `Warning`. |
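The parameters above can be sketched as a YAML fragment (the values shown are illustrative, not defaults):

```yaml
connector:
  # Registered on the API; with add-machine-name-suffix enabled, the
  # connector appears as prosper-connector@vm-host-name.
  name-prefix: prosper-connector
  add-machine-name-suffix: true
  api-logger:
    enabled: true        # push logs to CDF (default: false)
    level: Information   # minimum level pushed to CDF
```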

Logger

Log entries are one of Fatal, Error, Warning, Information, Debug, or Verbose, in order of decreasing importance. Each level includes all levels of higher importance.

| Parameter | Description |
| --- | --- |
| `console` | Configuration for logging to the console. |
| `console.level` | Set the minimum level of log events to write to the console. Set this parameter to enable console logging. |
| `console.stderr-level` | Log events at this level or above are redirected to standard error. |
| `file` | Configuration for logging to a rotating log file. |
| `file.level` | Set the minimum level of log events to write to file. |
| `file.path` | Enter the path to the log files. For instance, if this is set to `logs/log.txt`, log files of the form `logs/log[date].txt` are created, depending on `rolling-interval`. |
| `file.retention-limit` | Set the maximum number of log files that are kept in the log folder. |
| `file.rolling-interval` | Enter a rolling interval for log files: `day` or `hour`. The default value is `day`. |
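A sketch of the logger section; the exact casing of level values may differ between connector versions, so verify against config.example.yml:

```yaml
logger:
  console:
    # Setting a level here enables console logging.
    level: information
    stderr-level: error   # error and fatal entries go to standard error
  file:
    level: debug
    # Produces files of the form logs/log[date].txt, per rolling-interval.
    path: logs/log.txt
    retention-limit: 10
    rolling-interval: day
```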

Cognite

This is the configuration of the connection to CDF.

| Parameter | Description |
| --- | --- |
| `host` | Insert the CDF service URL. The default value is `https://api.cognitedata.com`. |
| `project` | Enter the CDF project. This is a required parameter. |
| `idp-authentication` | The configuration for authenticating to CDF. |
| `idp-authentication.authority` | The authority used together with `tenant` to authenticate to Azure tenants. Use `token-url` if you're connecting with a non-Azure IdP. The default value is `https://login.microsoftonline.com`. |
| `idp-authentication.tenant` | Enter the Azure tenant used with `authority`. |
| `idp-authentication.token-url` | Insert the URL used to obtain service tokens for non-Azure IdPs. |
| `idp-authentication.client-id` | Enter the service principal client ID. |
| `idp-authentication.secret` | Enter the service principal client secret. |
| `idp-authentication.resource` | Insert an optional resource parameter to pass along with token requests. |
| `idp-authentication.scopes` | Enter a list of scopes to pass along with the request. This typically needs to contain `[host]/.default`. |
| `idp-authentication.audience` | Insert an optional audience parameter to pass along with token requests. |
| `idp-authentication.min-ttl` | Set the requested minimum time-to-live in seconds for the token. |
| `idp-authentication.certificate` | Configuration for authenticating using a client certificate. |
| `idp-authentication.certificate.authority-url` | Insert the certificate authority URL. |
| `idp-authentication.certificate.path` | Enter the path to the `.pem` or `.pfx` certificate to be used for authentication. |
| `idp-authentication.certificate.password` | Enter the certificate password. |
| `max-upload-interval` | Set the maximum time to cache data points before they are uploaded to CDF. The syntax is described in Timestamps and intervals. The default value is `1s`. |
| `max-data-points-upload-queue-size` | Set the maximum number of cached data points before they are uploaded to CDF. The default value is `1000000`. |
| `cdf-retries` | The configuration for automatic retries on requests to CDF. |
| `cdf-retries.timeout` | Set a timeout in milliseconds for each individual try. The default value is `80000`. |
| `cdf-retries.max-retries` | Set the maximum number of retries. A value less than 0 retries forever. |
| `cdf-retries.max-delay` | Set the maximum delay between each try, in milliseconds. The base delay is calculated as `125 * 2 ^ retry` milliseconds. If this is less than 0, there's no upper limit. The default value is `5000`. |
| `cdf-chunking` | The configuration for chunking on requests to CDF. Note that increasing these values may cause requests to fail due to limits in the API. Read the API documentation before setting them higher than their current values. |
| `cdf-chunking.time-series` | Set the maximum number of time series per get/create time series request. |
| `cdf-chunking.assets` | Set the maximum number of assets per get/create assets request. |
| `cdf-chunking.data-point-time-series` | Set the maximum number of time series per data point create request. |
| `cdf-chunking.data-points` | Set the maximum number of data points per data point create request. |
| `cdf-throttling` | Set how requests to CDF should be throttled. |
| `cdf-throttling.time-series` | Set the maximum number of parallel requests per time series operation. The default value is `20`. |
| `cdf-throttling.assets` | Set the maximum number of parallel requests per assets operation. The default value is `20`. |
| `cdf-throttling.data-points` | Set the maximum number of parallel requests per data points operation. The default value is `10`. |
| `sdk-logging` | The configuration for logging of requests from the SDK. |
| `sdk-logging.disable` | Set to `true` to disable logging of requests from the SDK. Logging is enabled by default. |
| `sdk-logging.level` | Set the log level for messages from the SDK. The default value is `debug`. |
| `sdk-logging.format` | Set the log message format. The default value is `CDF ({Message}): {HttpMethod} {Url} - {Elapsed} ms`. |
| `nan-replacement` | Replacement for NaN values when writing to CDF. The default value is `none`, meaning these values are removed. |
| `extraction-pipeline.external-id` | The external ID of the extraction pipeline to associate this connector with. This is used for monitoring and remote configuration. |
| `certificates` | The configuration for special handling of SSL certificates. This should not be considered a permanent solution to certificate problems. |
| `certificates.accept-all` | Accept all remote SSL certificates, even if verification fails. Note: this introduces a risk of man-in-the-middle attacks. |
| `certificates.allow-list` | List of certificate thumbprints to automatically accept. This is a smaller risk than accepting all certificates. |
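A minimal sketch of the cognite section for an Azure AD tenant; the environment variable names and the extraction pipeline external ID are hypothetical:

```yaml
cognite:
  host: https://api.cognitedata.com
  project: ${COGNITE_PROJECT}
  idp-authentication:
    # For a non-Azure IdP, use token-url instead of tenant/authority.
    tenant: ${AZURE_TENANT_ID}        # hypothetical variable name
    client-id: ${COGNITE_CLIENT_ID}   # hypothetical variable name
    secret: ${COGNITE_CLIENT_SECRET}  # hypothetical variable name
    scopes:
      - https://api.cognitedata.com/.default
  extraction-pipeline:
    external-id: prosper-connector-pipeline   # hypothetical external ID
```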

State store

This section includes the configuration parameters for storing state in a local database or in CDF RAW.

| Parameter | Description |
| --- | --- |
| `location` | Enter the path to the database file, or the name of the RAW database containing the state store. |
| `database` | Select which type of database to use. Valid options are `LiteDb` or `None`. The default value is `None`. |
| `interval` | Set the interval between each push of local states to the state store. The syntax is described in Timestamps and intervals. The default value is `1m`. |
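A sketch of the state store configuration, assuming the top-level key is state-store (check config.example.yml for the exact key name):

```yaml
state-store:
  database: LiteDb    # LiteDb for a local file, None to disable (default)
  location: state.db  # database file path, or a RAW database name
  interval: 1m        # push local states every minute
```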