Configuration settings
To configure the DB extractor, you must create a configuration file in YAML format.
You can set up extraction pipelines to use versioned extractor configuration files stored in the cloud.
Using values from environment variables
The configuration file allows substitutions with environment variables. For example:
```yaml
cognite:
  secret: ${COGNITE_CLIENT_SECRET}
```
will load the value from the `COGNITE_CLIENT_SECRET` environment variable into the `cognite/secret` parameter. You can also do string interpolation with environment variables, for example:
```yaml
url: http://my-host.com/api/endpoint?secret=${MY_SECRET_TOKEN}
```
Implicit substitutions only work for unquoted value strings. For quoted strings, use the `!env` tag to activate environment substitution:
```yaml
url: !env 'http://my-host.com/api/endpoint?secret=${MY_SECRET_TOKEN}'
```
Using values from Azure Key Vault
The DB extractor also supports loading values from Azure Key Vault. To load a configuration value from Azure Key Vault, use the `!keyvault` tag followed by the name of the secret you want to load. For example, to load the value of the `my-secret-name` secret in Key Vault into a `password` parameter, configure your extractor like this:
```yaml
password: !keyvault my-secret-name
```
To use Key Vault, you also need to include the `azure-keyvault` section in your configuration, with the following parameters:
| Parameter | Description |
|---|---|
| `keyvault-name` | Name of the Key Vault to load secrets from. |
| `authentication-method` | How to authenticate to Azure. Either `default` or `client-secret`. For `default`, the extractor will look at the user running the extractor and look for pre-configured Azure logins from tools like the Azure CLI. For `client-secret`, the extractor will authenticate with a configured client ID/secret pair. |
| `client-id` | Required when using the `client-secret` authentication method. The client ID to use when authenticating to Azure. |
| `secret` | Required when using the `client-secret` authentication method. The client secret to use when authenticating to Azure. |
| `tenant-id` | Required when using the `client-secret` authentication method. The tenant ID of the Key Vault in Azure. |
Example:
```yaml
azure-keyvault:
  keyvault-name: my-keyvault-name
  authentication-method: client-secret
  tenant-id: 6f3f324e-5bfc-4f12-9abe-22ac56e2e648
  client-id: 6b4cc73e-ee58-4b61-ba43-83c4ba639be6
  secret: 1234abcd
```
Base configuration object
| Parameter | Type | Description |
|---|---|---|
| `version` | either string or integer | Configuration file version. |
| `type` | either `local` or `remote` | Configuration file type. Either `local`, meaning the full config is loaded from this file, or `remote`, meaning that only the `cognite` section is loaded from this file and the rest is loaded from extraction pipelines. Default value is `local`. |
| `cognite` | object | The `cognite` section describes which CDF project the extractor will load data into and how to connect to the project. |
| `logger` | object | The optional `logger` section sets up logging to a console and files. |
| `metrics` | object | The `metrics` section describes where to send metrics on extractor performance for remote monitoring of the extractor. We recommend sending metrics to a Prometheus pushgateway, but you can also send metrics as time series in the CDF project. |
| `queries` | list | List of queries to execute. |
| `databases` | list | List of databases to connect to. |
| `extractor` | object | General extractor configuration. |
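For orientation, a minimal configuration skeleton might look like the following sketch. The project name is hypothetical, and the empty `queries` and `databases` lists are placeholders to be filled in as described in the sections below:
```yaml
version: 1
type: local            # the full config is loaded from this file
cognite:
  project: my-project  # hypothetical project name; see the cognite section
logger:
  console:
    level: INFO
queries: []            # query definitions, see the queries section
databases: []          # database connections, see the databases section
```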
cognite
Global parameter.
The cognite section describes which CDF project the extractor will load data into and how to connect to the project.
| Parameter | Type | Description |
|---|---|---|
| `project` | string | Insert the CDF project name. |
| `idp-authentication` | object | The `idp-authentication` section enables the extractor to authenticate to CDF using an external identity provider (IdP), such as Microsoft Entra ID (formerly Azure Active Directory). |
| `data-set` | object | Enter a data set the extractor should write data into. |
| `extraction-pipeline` | object | Enter the extraction pipeline used for remote config and reporting statuses. |
| `host` | string | Insert the base URL of the CDF project. Default value is `https://api.cognitedata.com`. |
| `timeout` | integer | Enter the timeout on requests to CDF, in seconds. Default value is `30`. |
| `external-id-prefix` | string | Prefix on external ID used when creating CDF resources. |
| `connection` | object | Configure network connection details. |
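As a sketch of the parameters above (the project name and prefix are hypothetical, and the values are illustrative rather than recommended):
```yaml
cognite:
  project: my-project                # hypothetical CDF project name
  host: https://api.cognitedata.com  # default base URL
  timeout: 60                        # raise the request timeout from the default 30 s
  external-id-prefix: db-extractor-  # hypothetical prefix for created resources
```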
idp-authentication
Part of `cognite` configuration.
The `idp-authentication` section enables the extractor to authenticate to CDF using an external identity provider (IdP), such as Microsoft Entra ID (formerly Azure Active Directory).
| Parameter | Type | Description |
|---|---|---|
| `authority` | string | Insert the authority together with `tenant` to authenticate against Azure tenants. Default value is `https://login.microsoftonline.com/`. |
| `client-id` | string | Required. Enter the service principal client ID from the IdP. |
| `tenant` | string | Enter the Azure tenant. |
| `token-url` | string | Insert the URL to fetch tokens from. |
| `secret` | string | Enter the service principal client secret from the IdP. |
| `resource` | string | Resource parameter passed along with token requests. |
| `audience` | string | Audience parameter passed along with token requests. |
| `scopes` | list | Enter a list of scopes requested for the token. |
| `min-ttl` | integer | Insert the minimum time in seconds a token will be valid. If the cached token expires in less than `min-ttl` seconds, it will be refreshed even if it is still valid. Default value is `30`. |
| `certificate` | object | Authenticate with a client certificate. |
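A typical client-credentials setup might look like the sketch below. The tenant, client ID, and scope are placeholders, and the secret is read from an environment variable as described earlier:
```yaml
cognite:
  project: my-project
  idp-authentication:
    tenant: my-azure-tenant-id        # hypothetical Azure tenant
    client-id: my-client-id           # hypothetical service principal client ID
    secret: ${COGNITE_CLIENT_SECRET}  # loaded from an environment variable
    scopes:
      - https://api.cognitedata.com/.default  # adjust to your CDF cluster
    min-ttl: 30
```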
scopes
Part of `idp-authentication` configuration.
Enter a list of scopes requested for the token. Each element of this list should be a string.
certificate
Part of `idp-authentication` configuration.
Authenticate with a client certificate.
| Parameter | Type | Description |
|---|---|---|
| `authority-url` | string | Authentication authority URL. |
| `path` | string | Required. Enter the path to the `.pem` or `.pfx` certificate to be used for authentication. |
| `password` | string | Enter the password for the key file, if it is encrypted. |
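If the IdP uses certificate credentials instead of a client secret, the `certificate` block might be used like this sketch (the path is a placeholder, and the password is only needed for encrypted key files):
```yaml
idp-authentication:
  tenant: my-azure-tenant-id
  client-id: my-client-id
  certificate:
    path: /path/to/my-certificate.pem  # hypothetical certificate path
    password: ${CERTIFICATE_PASSWORD}  # only if the key file is encrypted
```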
data-set
Part of `cognite` configuration.
Enter a data set the extractor should write data into.
| Parameter | Type | Description |
|---|---|---|
| `id` | integer | Resource internal ID. |
| `external-id` | string | Resource external ID. |
extraction-pipeline
Part of `cognite` configuration.
Enter the extraction pipeline used for remote config and reporting statuses.
| Parameter | Type | Description |
|---|---|---|
| `id` | integer | Resource internal ID. |
| `external-id` | string | Resource external ID. |
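Both `data-set` and `extraction-pipeline` accept either an internal or an external ID. A sketch with hypothetical external IDs:
```yaml
cognite:
  data-set:
    external-id: my-data-set             # hypothetical data set external ID
  extraction-pipeline:
    external-id: my-extraction-pipeline  # hypothetical pipeline external ID
```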
connection
Part of `cognite` configuration.
Configure network connection details.
| Parameter | Type | Description |
|---|---|---|
| `disable-gzip` | boolean | Whether or not to disable gzipping of JSON bodies. |
| `status-forcelist` | string | HTTP status codes to retry. Defaults to 429, 502, 503, and 504. |
| `max-retries` | integer | Maximum number of retries on a given HTTP request. Default value is `10`. |
| `max-retries-connect` | integer | Maximum number of retries on connection errors. Default value is `3`. |
| `max-retry-backoff` | integer | The retry strategy employs exponential backoff. This parameter sets a maximum on the amount of backoff, in seconds, after any request failure. Default value is `30`. |
| `max-connection-pool-size` | integer | The maximum number of connections that will be kept in the SDK's connection pool. Default value is `50`. |
| `disable-ssl` | boolean | Whether or not to disable SSL verification. |
| `proxies` | object | Dictionary mapping protocols to proxy URLs. |
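For illustration, a `connection` block tuning the retry behavior might look like this sketch (the values are arbitrary examples, not recommendations):
```yaml
cognite:
  connection:
    max-retries: 5         # fewer retries than the default 10
    max-retry-backoff: 60  # cap exponential backoff at 60 seconds
    disable-gzip: false
```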
proxies
Part of `connection` configuration.
Dictionary mapping protocols to proxy URLs.
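For example, routing both HTTP and HTTPS traffic through a single proxy could look like this sketch (the proxy address is hypothetical):
```yaml
connection:
  proxies:
    http: http://my-proxy.example.com:8080
    https: http://my-proxy.example.com:8080
```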
logger
Global parameter.
The optional `logger` section sets up logging to a console and files.
| Parameter | Type | Description |
|---|---|---|
| `console` | object | Include the `console` section to enable logging to a standard output, such as a terminal window. |
| `file` | object | Include the `file` section to enable logging to a file. The files are rotated daily. |
| `metrics` | boolean | Enables metrics on the number of log messages recorded per logger and level. This requires `metrics` to be configured as well. |
console
Part of `logger` configuration.
Include the console section to enable logging to a standard output, such as a terminal window.
| Parameter | Type | Description |
|---|---|---|
| `level` | either `DEBUG`, `INFO`, `WARNING`, `ERROR` or `CRITICAL` | Select the verbosity level for console logging. Valid options, in order of decreasing verbosity, are `DEBUG`, `INFO`, `WARNING`, `ERROR`, and `CRITICAL`. Default value is `INFO`. |
file
Part of `logger` configuration.
Include the file section to enable logging to a file. The files are rotated daily.
| Parameter | Type | Description |
|---|---|---|
| `level` | either `DEBUG`, `INFO`, `WARNING`, `ERROR` or `CRITICAL` | Select the verbosity level for file logging. Valid options, in order of decreasing verbosity, are `DEBUG`, `INFO`, `WARNING`, `ERROR`, and `CRITICAL`. Default value is `INFO`. |
| `path` | string | Required. Insert the path to the log file. |
| `retention` | integer | Specify the number of days to keep logs for. Default value is `7`. |
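Putting the logger pieces together, a setup that logs INFO to the console and DEBUG to a daily-rotated file might look like this sketch (the path is hypothetical):
```yaml
logger:
  console:
    level: INFO
  file:
    level: DEBUG
    path: logs/db-extractor.log  # hypothetical log file path
    retention: 14                # keep two weeks of logs instead of the default 7
```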
metrics
Global parameter.
The `metrics` section describes where to send metrics on extractor performance for remote monitoring of the extractor. We recommend sending metrics to a Prometheus pushgateway, but you can also send metrics as time series in the CDF project.
| Parameter | Type | Description |
|---|---|---|
| `push-gateways` | list | List of Prometheus pushgateway configurations. |
| `cognite` | object | Push metrics to CDF time series. Requires CDF credentials to be configured. |
| `server` | object | The extractor can also be configured to expose an HTTP server with Prometheus metrics for scraping. |
push-gateways
Part of `metrics` configuration.
List of Prometheus pushgateway configurations.
Each element of this list is a metric destination.
| Parameter | Type | Description |
|---|---|---|
| `host` | string | Enter the address of the host to push metrics to. |
| `job-name` | string | Enter the value of the `exported_job` label to associate metrics with. This separates several deployments on a single pushgateway and should be unique. |
| `username` | string | Enter the credentials for the pushgateway. |
| `password` | string | Enter the credentials for the pushgateway. |
| `clear-after` | either null or integer | Enter the number of seconds to wait before clearing the pushgateway. When this parameter is present, the extractor will stall after the run is complete before deleting all metrics from the pushgateway. The recommended value is at least twice that of the scrape interval on the pushgateway, to ensure that the last metrics are gathered before the deletion. Default is disabled. |
| `push-interval` | integer | Enter the interval in seconds between each push. Default value is `30`. |
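A single-pushgateway setup might look like the following sketch; the host and job name are placeholders:
```yaml
metrics:
  push-gateways:
    - host: https://my-pushgateway.example.com  # hypothetical pushgateway address
      job-name: db-extractor-prod               # must be unique per deployment
      push-interval: 30
      clear-after: 120                          # at least twice the scrape interval
```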
cognite
Part of `metrics` configuration.
Push metrics to CDF time series. Requires CDF credentials to be configured.
| Parameter | Type | Description |
|---|---|---|
| `external-id-prefix` | string | Required. Prefix on external ID used when creating CDF time series to store metrics. |
| `asset-name` | string | Enter the name for a CDF asset that will have all the metrics time series attached to it. |
| `asset-external-id` | string | Enter the external ID for a CDF asset that will have all the metrics time series attached to it. |
| `push-interval` | integer | Enter the interval in seconds between each push to CDF. Default value is `30`. |
| `data-set` | object | Data set the metrics will be created under. |
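For example, pushing metrics as CDF time series under a dedicated asset and data set might be configured like this sketch (all IDs and prefixes are hypothetical):
```yaml
metrics:
  cognite:
    external-id-prefix: db-extractor.metrics.  # hypothetical prefix
    asset-external-id: db-extractor-asset      # hypothetical asset
    push-interval: 30
    data-set:
      external-id: my-metrics-data-set         # hypothetical data set
```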
data-set
Part of `cognite` configuration.
Data set the metrics will be created under.
| Parameter | Type | Description |
|---|---|---|
| `id` | integer | Resource internal ID. |
| `external-id` | string | Resource external ID. |
server
Part of `metrics` configuration.
The extractor can also be configured to expose an HTTP server with Prometheus metrics for scraping.
| Parameter | Type | Description |
|---|---|---|
| `host` | string | Host to run the Prometheus server on. Default value is `0.0.0.0`. |
| `port` | integer | Local port to expose the Prometheus server on. Default value is `9000`. |
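Exposing metrics for scraping instead of pushing them could be as simple as this sketch (the port is an arbitrary choice):
```yaml
metrics:
  server:
    host: 0.0.0.0
    port: 9090  # overrides the default 9000
```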
queries
Global parameter.
List of queries to execute.
Each element of this list is a description of a SQL query against a database.
| Parameter | Type | Description |
|---|---|---|
| `database` | string | Required. Enter the name of the database to connect to. This must be one of the database names configured in the `databases` section. |
| `name` | string | Required. Enter a name for this query that will be used for logging and tagging metrics. The name must be unique for each query in the configuration file. |
| `query` | string | Required. SQL query to execute. Supports interpolation with `{incremental_field}` and `{start_at}`. |
| `destination` | configuration for either RAW, Events, Assets, Time series, Sequence, or Files | Required. The destination of the data in CDF. Examples: `{'destination': {'type': 'raw', 'database': 'my-database', 'table': 'my-table'}}`, `{'destination': {'type': 'events'}}` |
| `primary-key` | string | Insert the format of the row key in CDF RAW. This parameter supports case-sensitive substitutions with values from the table columns. For example, if there's a column called `index`, setting `primary-key: row_{index}` will result in rows with keys `row_0`, `row_1`, etc. This is a required value if the destination is a RAW type. Example: `row_{index}` |
| `incremental-field` | string | Insert the table column that holds the incremental field. Include to enable incremental loading; otherwise the extractor will default to a full run every time. To use incremental load, a state store is required. |
| `freshness-field` | string | Which column to use for the freshness metric. Must be specified along with `freshness-field-timezone`. |
| `freshness-field-timezone` | string | Timezone to use for the freshness metric. |
| `initial-start` | either string, number or integer | Enter the `{start_at}` value for the first run. This is only used on the initial run; subsequent runs use the value stored in the state store. Required when `incremental-field` is set. |
| `schedule` | configuration for either Fixed interval or CRON expression | Enter the schedule for when this query should run. Make sure not to schedule runs too often; leave room for the previous execution to finish. Required when running in continuous mode, ignored otherwise. Examples: `{'schedule': {'type': 'interval', 'expression': '1h'}}`, `{'schedule': {'type': 'cron', 'expression': '0 7-17 * * 1-5'}}` |
| `collection` | string | Specify the collection on which the query will be executed. This parameter is mandatory when connecting to `mongodb` databases. |
| `container` | string | Specify the container on which the query will be executed. This parameter is mandatory when connecting to `cosmosdb` databases. |
| `sheet` | string | Specify the sheet on which the query will be executed. This parameter is mandatory when connecting to spreadsheet files. |
| `skip_rows` | string | Specify the number of rows to be skipped when reading a spreadsheet. This parameter is optional when connecting to spreadsheet files. |
| `has_header` | string | Specify whether the extractor should skip the file header while reading a spreadsheet. This parameter is optional when connecting to spreadsheet files. |
| `parameters` | string | Specify the parameters to be used when querying AWS DynamoDB. This parameter is mandatory when connecting to `dynamodb` databases. |
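Tying several of these parameters together, an incremental query writing rows to CDF events might look like the sketch below. The table and column names are hypothetical, and the exact SQL dialect depends on the configured database:
```yaml
queries:
  - name: my-incremental-query            # unique within the configuration file
    database: my-database                 # must match a name in the databases section
    query: >
      SELECT * FROM my_table
      WHERE {incremental_field} > '{start_at}'
      ORDER BY {incremental_field} ASC
    incremental-field: updated_at         # hypothetical timestamp column
    initial-start: "1970-01-01 00:00:00"  # used as {start_at} on the first run only
    destination:
      type: events
    schedule:
      type: interval
      expression: 1h
```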
destination
Part of `queries` configuration.
The destination of the data in CDF.
One of the following options:
Examples:
```yaml
destination:
  type: raw
  database: my-database
  table: my-table
```
```yaml
destination:
  type: events
```
raw
Part of `destination` configuration.
The `raw` destination writes data to the CDF staging area (RAW). The `raw` destination requires the `primary-key` parameter in the query configuration.
| Parameter | Type | Description |
|---|---|---|
| `type` | always `raw` | Type of CDF destination; set to `raw` to write data to RAW. |
| `database` | string | Required. Enter the CDF RAW database to upload data into. This will be created if it doesn't exist. |
| `table` | string | Required. Enter the CDF RAW table to upload data into. This will be created if it doesn't exist. |
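Combining this with the `primary-key` requirement from the queries section, a RAW-bound query might look like this sketch (database, table, and column names are hypothetical):
```yaml
queries:
  - name: my-raw-query
    database: my-database
    query: SELECT * FROM my_table
    primary-key: "row_{id}"      # assumes my_table has an id column
    destination:
      type: raw
      database: my-raw-database  # created if it doesn't exist
      table: my-raw-table        # created if it doesn't exist
```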