# Use Azure Data Factory with PostgreSQL gateway

Use the PostgreSQL gateway to ingest data from your existing ETL solution into the CDF staging area, Cognite RAW, or directly into the CDF data model.

NOTE: Cognite supports ingestion using Microsoft Azure Data Factory (ADF).

ETL data flow

This article shows how to create a simple data pipeline in Azure Data Factory to copy data from a file in Azure Blob Storage and ingest it into CDF. Depending on your data source and your data processing requirements, you may need to create more advanced data pipelines in Azure Data Factory.

# Before you start

  1. Make sure you have an Azure subscription and have created Azure resource groups.

  2. Make sure you have created an Azure storage account.

  3. Make sure the file containing the data to be ingested into CDF is available in Azure Blob Storage.
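
A minimal sketch of this last prerequisite, assuming your source file is a small .csv that you upload with the azure-storage-blob Python package. The container name, file name, sample rows, and columns (name, description, chosen to map naturally to CDF assets later in this article) are hypothetical examples, not required names.

```python
# Minimal sketch: upload a small .csv to Azure Blob Storage with the
# azure-storage-blob package. Container, file, and column names are
# hypothetical; replace them with your own.
from azure.storage.blob import BlobServiceClient

# Hypothetical sample content; a real file would come from your source system.
csv_content = (
    "name,description\n"
    "Pump 001,Main feed pump\n"
    "Pump 002,Backup feed pump\n"
)

# Connection string for your storage account (from the Azure portal).
connection_string = "<storage-account-connection-string>"

service = BlobServiceClient.from_connection_string(connection_string)
blob = service.get_blob_client(container="adf-demo", blob="assets.csv")
blob.upload_blob(csv_content, overwrite=True)
```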

# Step 1: Create a data factory

  1. Sign in to the Azure portal as an admin.

  2. If you have access to multiple tenants, use the Directory + subscription filter in the top menu to select the tenant in which you want to register a data factory.

  3. Search for and select Data Factory.

  4. Click Create to open the Data Factory page.

Create Azure Data Factory

  5. On the Basics tab:

    • Subscription: Select the Azure subscription in which you want to create the data factory.
    • Resource group: Create or select the Azure resource group you want to use.
    • Region: Select a location where your ADF metadata will be stored. This list only shows locations that ADF supports.
    • Name: Enter a globally unique name for the data factory.
    • Version: Select V2.
  6. Click Next: Git configuration and fill in the required fields or select Configure Git later.

  7. Click Review + Create > Create.

  8. After the creation is complete, open the data factory and select the Author & Monitor tile to start the Azure Data Factory application in a separate tab.
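
If you prefer to script this step instead of using the portal, the sketch below creates the same data factory with the azure-identity and azure-mgmt-datafactory Python packages. This is a minimal sketch; the subscription ID, resource group, factory name, and region are placeholders for the values you would otherwise enter on the Basics tab.

```python
# Minimal sketch: create a data factory programmatically with the
# azure-mgmt-datafactory management SDK (targets ADF V2).
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import Factory

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<globally-unique-factory-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Use the region you would otherwise pick in the Region field.
factory = client.factories.create_or_update(
    resource_group, factory_name, Factory(location="westeurope")
)
print(factory.provisioning_state)
```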

# Step 2: Create an ADF pipeline and set the source system

Use the Copy data tool to create a pipeline that reads data from a file in your data storage and writes to CDF.

  1. On the Let's get started page, select the Create pipeline tile.

  2. Open Activities > Move and transform.

  3. Drag and drop the Copy data activity onto the canvas and enter a unique name in the Name field.

  4. Open the Source tab > New.

  5. Select the data store that references the source system you're extracting from. In this example, we'll extract data from a .csv file in Azure Blob Storage.

  6. Click Continue.

Azure Copy Data

  7. Create a new linked service that links your data store to the data factory.

    • Enter a unique name for the service.

    • Select your Azure subscription and storage account.

    • Click Test connection to make sure you are connected to the selected storage account.

    • Click Create.

    Azure Set Source

  8. Under Set properties, specify the file you want to read data from.

    • Enter a unique name for the source data set.

    • Browse to the file you want to use.

    • Select First row as header to reuse the source file's header columns.

    • Click OK.
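
The source linked service and data set you just configured can also be defined with the same management SDK as in the previous sketch. The code below is an equivalent, minimal definition; model and parameter names follow recent azure-mgmt-datafactory versions and may differ in older ones, and the resource names (SourceBlobStorage, SourceAssetsCsv) are placeholders.

```python
# Minimal sketch: define the Blob Storage linked service and the .csv source
# data set with azure-mgmt-datafactory models. Names and credentials below
# are placeholders; model names may vary between SDK versions.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLinkedService,
    AzureBlobStorageLocation,
    DatasetResource,
    DelimitedTextDataset,
    LinkedServiceReference,
    LinkedServiceResource,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
resource_group = "<resource-group>"
factory_name = "<globally-unique-factory-name>"

# Linked service pointing at the storage account that holds the source file.
blob_linked_service = LinkedServiceResource(
    properties=AzureBlobStorageLinkedService(
        connection_string="<storage-account-connection-string>"
    )
)
client.linked_services.create_or_update(
    resource_group, factory_name, "SourceBlobStorage", blob_linked_service
)

# Delimited-text data set for the .csv file, with the first row as header.
csv_dataset = DatasetResource(
    properties=DelimitedTextDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="SourceBlobStorage"
        ),
        location=AzureBlobStorageLocation(
            container="adf-demo", file_name="assets.csv"
        ),
        column_delimiter=",",
        first_row_as_header=True,
    )
)
client.datasets.create_or_update(
    resource_group, factory_name, "SourceAssetsCsv", csv_dataset
)
```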

# Step 3: Set PostgreSQL gateway as sink

In this step, you add the PostgreSQL gateway as the destination, called a sink in ADF.

  1. Open Copy data > Sink tab.

  2. In the Sink dataset field, select New.

  3. Select or search for Azure Database for PostgreSQL.

  4. Click Continue.

  5. Click Open to open the sink data set.

Open linked service for sink

  6. On the Connection tab, add a linked service for the sink data set.

    • Click New to add a linked service.

    • Under Account selection method, select Enter manually.

    • In the Fully qualified domain name field, enter skywarp.<cluster>.cognite.ai, where <cluster> is the cluster where your CDF instance is installed. If you don't know the cluster name, contact Cognite support.

    • In the Port field, enter 5432, which is the default port for the PostgreSQL gateway.

    • For Database name and User name, enter the username returned from the PostgreSQL gateway.

    • In the Password field, enter the password returned from the PostgreSQL gateway.

    • In the Encryption method field, select SSL.

    • Optionally, select Validate server certificate.

    • Click Test connection to make sure you are connected to the Azure database for PostgreSQL.

  7. Click Create.
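
The gateway speaks the standard PostgreSQL wire protocol, so you can check the same host, port, and credentials outside ADF before clicking Test connection. Below is a minimal sketch with the psycopg2 package; the placeholders match the fields described above, and the database name and username returned from the PostgreSQL gateway are identical.

```python
# Minimal sketch: verify the PostgreSQL gateway credentials with psycopg2.
# Replace <cluster>, <username>, and <password> with the values returned
# from the PostgreSQL gateway.
import psycopg2

conn = psycopg2.connect(
    host="skywarp.<cluster>.cognite.ai",
    port=5432,
    dbname="<username>",   # database name and username are the same
    user="<username>",
    password="<password>",
    sslmode="require",     # matches the SSL encryption method selected above
)
with conn.cursor() as cur:
    cur.execute("SELECT 1")
    print(cur.fetchone())  # (1,) confirms the connection works
conn.close()
```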

# Step 4: Set the destination database table

Back on the Connection tab, you need to select the destination table from the tables available in your PostgreSQL gateway database. You can choose between several resource types like assets, events, and data points.

  1. In the Table field, select the table matching the destination RAW Table or the CDF resource type public.<cdf_resource_type>, for instance, public.assets or public.timeseries.

ADF public tables

  2. Navigate back to the Sink tab on your pipeline.

    • Under Write method, select Bulk insert.
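
If you want to see which destination tables the gateway exposes before picking one in the Table field, the sketch below lists them over the same connection. This assumes the gateway answers standard information_schema metadata queries (which is how client tools typically enumerate its tables); if it doesn't in your setup, use the Table dropdown in ADF instead.

```python
# Minimal sketch: list the destination tables exposed by the PostgreSQL
# gateway. Connection placeholders are the same as in the earlier sketch;
# assumes the gateway serves standard information_schema metadata.
import psycopg2

conn = psycopg2.connect(
    host="skywarp.<cluster>.cognite.ai",
    port=5432,
    dbname="<username>",
    user="<username>",
    password="<password>",
    sslmode="require",
)
with conn.cursor() as cur:
    cur.execute(
        "SELECT table_name FROM information_schema.tables "
        "WHERE table_schema = 'public' ORDER BY table_name"
    )
    for (table_name,) in cur.fetchall():
        print(f"public.{table_name}")  # e.g. public.assets, public.events
conn.close()
```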

# Step 5: Map the source and destination tables

  1. Open the Mapping tab on your pipeline.

  2. Click Import schemas to import the table columns and remove the lines you won't use.

  3. Map the columns according to your requirements.

  4. Click Debug in the top bar to run the pipeline and write the data through the PostgreSQL gateway.

ADF mapping tables
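
The mapping decides which source columns are written to which columns of the destination table. To make the effect of Bulk insert concrete, the sketch below performs the equivalent insert directly against the gateway with psycopg2, using the hypothetical name and description columns from the sample .csv and the public.assets destination. The actual writable columns, and which of them are required, depend on the resource type, so take the real column names from the schema you imported on the Mapping tab.

```python
# Minimal sketch: the equivalent of the mapped bulk insert, written directly
# to the gateway's public.assets table with psycopg2. Column names follow the
# hypothetical sample .csv; check the imported schema for the real columns.
import csv
import io

import psycopg2

csv_content = (
    "name,description\n"
    "Pump 001,Main feed pump\n"
    "Pump 002,Backup feed pump\n"
)

conn = psycopg2.connect(
    host="skywarp.<cluster>.cognite.ai",
    port=5432,
    dbname="<username>",
    user="<username>",
    password="<password>",
    sslmode="require",
)
with conn.cursor() as cur:
    for row in csv.DictReader(io.StringIO(csv_content)):
        cur.execute(
            "INSERT INTO public.assets (name, description) VALUES (%s, %s)",
            (row["name"], row["description"]),
        )
conn.commit()
conn.close()
```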

# Step 6: Verify your data

Verify that the configuration is successful.

  1. Sign in to Cognite Data Fusion.
  2. In the top menu, select Explore & Build or Browse staged data and check that the data has been successfully ingested.
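
You can also verify the result programmatically with the Cognite Python SDK (cognite-sdk). The sketch below is a minimal example, assuming the data was ingested as assets and that you authenticate with an Azure AD app registration; class and parameter names follow recent SDK versions, and the project, cluster, and Azure AD values are placeholders.

```python
# Minimal sketch: list a few assets with the Cognite Python SDK to confirm
# the ingestion. Project, cluster, and Azure AD details are placeholders.
from cognite.client import ClientConfig, CogniteClient
from cognite.client.credentials import OAuthClientCredentials

cluster = "<cluster>"
project = "<cdf-project>"

credentials = OAuthClientCredentials(
    token_url="https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token",
    client_id="<client-id>",
    client_secret="<client-secret>",
    scopes=[f"https://{cluster}.cognitedata.com/.default"],
)
client = CogniteClient(
    ClientConfig(
        client_name="adf-ingestion-check",
        project=project,
        base_url=f"https://{cluster}.cognitedata.com",
        credentials=credentials,
    )
)

for asset in client.assets.list(limit=10):
    print(asset.name, asset.description)
```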