Jobs represent running hosted extractor processes that move data from sources to destinations. Each job links a source to a destination via a set of mappings. You can create, monitor, update, and delete jobs. The API also provides access to job logs and metrics for monitoring extraction health.

What jobs are

A job is an active extraction process. It connects a source (where data is read from) to a destination (where data is written in CDF) and applies mappings to transform the data in between. Jobs run continuously or on a schedule, depending on configuration.
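To make the source-to-destination link concrete, here is a sketch of what a job definition might look like. The field names (externalId, sourceId, destinationId, mappings) are illustrative assumptions, not the exact API schema:

```python
# Illustrative job definition: links one source to one destination
# and applies a mapping in between. Field names are assumptions,
# not the hosted extractors API schema.
job = {
    "externalId": "weather-station-job",
    "sourceId": "mqtt-broker-source",        # where data is read from
    "destinationId": "cdf-timeseries-dest",  # where data is written in CDF
    "mappings": [
        # Each mapping transforms a field of the incoming payload
        {"input": "payload.temperature", "output": "temperature_c"},
    ],
}
```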

Job lifecycle

  • Create: Define a job by specifying a source, destination, and mappings.
  • Start: Activate the job to begin extraction.
  • Monitor: Check job status, view logs, and inspect metrics.
  • Update: Modify configuration or pause the job as needed.
  • Delete: Stop and remove the job when extraction is no longer required.

Use job logs and metrics to detect failures, diagnose connectivity issues, and ensure data is flowing correctly.
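The lifecycle above can be sketched as a toy in-memory model. The class and method names are illustrative only; real jobs are managed through the hosted extractors API, not a local object:

```python
from dataclasses import dataclass

# Toy in-memory model of the create/start/pause/delete lifecycle.
# Names are illustrative; this is not the hosted extractors API.

@dataclass
class Job:
    external_id: str
    source_id: str
    destination_id: str
    status: str = "created"

class JobManager:
    def __init__(self):
        self._jobs: dict[str, Job] = {}

    def create(self, external_id, source_id, destination_id):
        # Create: define the job from a source and destination
        job = Job(external_id, source_id, destination_id)
        self._jobs[external_id] = job
        return job

    def start(self, external_id):
        # Start: activate extraction
        self._jobs[external_id].status = "running"

    def pause(self, external_id):
        # Update: pause the job as needed
        self._jobs[external_id].status = "paused"

    def status(self, external_id):
        # Monitor: check job status
        return self._jobs[external_id].status

    def delete(self, external_id):
        # Delete: stop and remove the job
        del self._jobs[external_id]

mgr = JobManager()
mgr.create("job-1", "src-1", "dest-1")
mgr.start("job-1")
```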

Monitoring extraction

The API provides endpoints to retrieve job logs and metrics. Use these to verify that data is being extracted on schedule, identify errors, and troubleshoot problems with sources or destinations.
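A typical use of the log endpoint is to scan returned entries for recent errors. The entry shape below ("level", "message", "createdTime" in epoch milliseconds) is an assumption for illustration:

```python
# Hypothetical log entries as a logs endpoint might return them;
# the field names and timestamp format are assumptions.
logs = [
    {"level": "info", "message": "connected to source", "createdTime": 1700000000000},
    {"level": "error", "message": "destination write timed out", "createdTime": 1700000060000},
]

def errors_since(entries, since_ms):
    """Return error-level entries newer than a millisecond timestamp."""
    return [
        e for e in entries
        if e["level"] == "error" and e["createdTime"] > since_ms
    ]

recent_errors = errors_since(logs, 1700000000000)
```

Alerting on the result of a filter like this is one way to verify that data is flowing and to catch source or destination failures early.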
Last modified on April 23, 2026