
Assign capabilities

To control access to data and features in Cognite Data Fusion (CDF), you define the capabilities users or applications have to work with different resource types in CDF, for example, whether they can read a time series (timeseries:read) or create a 3D model (3d:create).

Capabilities also determine which features you have access to. For example, you need the 3d:create capability to upload 3D models to CDF.

A capability is defined by a resource type, a scope, and actions. The resource type and scope define the data the capability applies to, while the actions define the operations you are allowed to perform.

Groups in CDF

Instead of assigning capabilities to individual users and applications, you use groups in CDF to define which capabilities the group members (users or applications) should have. You link and synchronize the CDF groups to user groups in your identity provider (IdP), for instance, Microsoft Entra ID or Amazon Cognito.

For example, if you want users or applications to read, but not write, time series data in CDF, you first create a group in your IdP and add the relevant users and applications to it. Next, you create a CDF group with the necessary capabilities (timeseries:read) and link the CDF group and the IdP group.
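To make the capability structure concrete, the snippet below shows what such a read-only time series capability looks like as a capability object. This is a minimal sketch following the JSON format used by the CDF groups API as commonly documented; verify the exact field names against the API reference for your project, and note that the data set scope mentioned in the comment is an assumed example.

```python
# A minimal sketch of a read-only time series capability, written as the kind of
# JSON-style object the CDF groups API accepts (verify field names against the
# API reference for your project).
read_only_timeseries = {
    "timeSeriesAcl": {
        "actions": ["READ"],   # no WRITE action, so group members cannot change data
        "scope": {"all": {}},  # applies to all time series in the project
        # To limit access to specific data sets instead, a scope such as
        # {"datasetScope": {"ids": [123]}} can be used (assumed format).
    }
}
```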

You can tag sensitive resources with additional security categories for even more fine-grained access control and protection.

This flexibility lets you manage and update your data governance policies quickly and securely. You can continue to manage users and applications in your organization's IdP service outside of CDF.

This article explains how to add capabilities to groups and how to create and assign security categories. You will also find overviews of the capabilities necessary to access and use different features in CDF.

info

For users to successfully sign in and use the Fusion UI, the following minimum set of capabilities is required: projects:list, groups:list, and groups:read.

Create a group and add capabilities

  1. Navigate to Admin > Groups > Create group.

  2. Enter a unique name for the group.

  3. Select Add capability.

    1. In the Capability type field, select a resource type, such as assets or time series, CDF groups, data sets, or specific functionality.

    2. In the Action field, select the actions to allow on the data, such as read, write, or list.

    3. In the Scope field, scope the access to all data or to a subset of the selected capability type. The available subsets differ according to the capability type, but all data is always an option.

  4. Select Save.

  5. In the Source ID field, enter the object ID (Entra ID) or group name (Cognito) exactly as it exists in your identity provider (IdP). This links the CDF group to a group in Microsoft Entra ID or Amazon Cognito.

[Image: Create a new group linked to a Microsoft Entra ID group object ID]
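If you manage access as code, the same group can be created programmatically instead of through the UI. The sketch below assumes the Cognite Python SDK and shows the capabilities as raw API-style dictionaries; depending on your SDK version you may need the typed capability classes instead, and the group name and source ID are placeholders.

```python
from cognite.client import CogniteClient
from cognite.client.data_classes import Group

client = CogniteClient()  # assumes project and credentials are configured elsewhere

group = Group(
    name="timeseries-readers",  # unique group name in CDF (placeholder)
    source_id="2fb2a5c4-0000-0000-0000-000000000000",  # Entra ID group object ID (placeholder)
    capabilities=[
        # Minimum capabilities for signing in, plus read-only time series access.
        {"projectsAcl": {"actions": ["LIST"], "scope": {"all": {}}}},
        {"groupsAcl": {"actions": ["LIST", "READ"], "scope": {"currentuserscope": {}}}},
        {"timeSeriesAcl": {"actions": ["READ"], "scope": {"all": {}}}},
    ],
)

client.iam.groups.create(group)
```

The capability list mirrors the minimum sign-in capabilities noted above; add further capabilities to the list in the same way.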

Create and assign security categories

You can add an extra access level for time series and files by tagging resources with security categories via the Cognite API. This is useful if you want to protect market-sensitive data. To access resources tagged with a security category, you must have both the standard capabilities for the resource type and capabilities for the security category.

To create, update, and delete security categories, you need these capabilities through a group membership:

  • securitycategories:create
  • securitycategories:update
  • securitycategories:delete

To assign security categories to groups:

  1. Open the group where you want to add security categories.
  2. In the Capability type field, select Security categories.
  3. In the Action field, select securitycategories:memberof.
  4. In the Scope field, select Security categories and associate specific security categories, or select All.

To perform actions, such as read or write, on time series and files tagged with security categories:

  • You must be a member of a group with actions that give access to the time series or files, for instance, timeseries:read.
  • You must be a member of a group with the securitycategories:memberof capability for the same time series or files.
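As a sketch of how the pieces fit together, the snippet below creates a security category, tags a time series with it, and creates a group whose members hold both the standard time series capability and membership in the category. It assumes the Cognite Python SDK; the securityCategoriesAcl payload and the idscope key follow the API convention as I understand it and should be verified against the API reference.

```python
from cognite.client import CogniteClient
from cognite.client.data_classes import Group, SecurityCategory, TimeSeriesUpdate

client = CogniteClient()  # assumes project and credentials are configured elsewhere

# 1. Create a security category for market-sensitive data.
category = client.iam.security_categories.create(SecurityCategory(name="market-sensitive"))

# 2. Tag an existing time series with the category (update field name as assumed here;
#    the external ID is a placeholder).
client.time_series.update(
    TimeSeriesUpdate(external_id="price-forecast").security_categories.set([category.id])
)

# 3. Create a group whose members can read the tagged time series: both the
#    timeseries:read capability and securitycategories:memberof are required.
client.iam.groups.create(
    Group(
        name="market-sensitive-readers",
        source_id="<IdP group object ID>",  # placeholder
        capabilities=[
            {"timeSeriesAcl": {"actions": ["READ"], "scope": {"all": {}}}},
            {
                "securityCategoriesAcl": {
                    "actions": ["MEMBEROF"],
                    "scope": {"idscope": {"ids": [category.id]}},  # assumed scope format
                }
            },
        ],
    )
)
```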

Share data and mention coworkers

User profiles let users share data and mention (@name) coworkers. By default, CDF will collect user information, such as name, email, and job title.

All users with any group membership in a CDF project get the userProfilesAcl:READ capability and can search for other users.

Feature capabilities

The tables below describe the necessary capabilities to access different CDF features.

note

In addition to the capabilities listed in the sections below, users and applications need these minimum capabilities to access any feature in CDF.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Groups | groups:list | Current user, All | Verifies user group access. |
| Projects | projects:list | All | Verifies that a user or application has access to a CDF project. To access the resources in the project, see the capabilities listed below. |

Extractors

PI extractor

Extract time series data from the OSIsoft PI Data Archive.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Time series | timeseries:read, timeseries:write | Data sets, All | Ingest time series. |
| RAW | raw:read, raw:write, raw:list | Tables, All | Ingest to CDF RAW and for a state store configured to use CDF RAW. |
| Events | events:read, events:write | Data sets, All | Log extractor incidents as events in CDF. |
| Extraction pipeline runs | extractionruns:write | Data sets, Extraction pipelines, All | Allow the extractor to report state and heartbeat back to CDF. |
| Remote configuration files | extractionconfigs:write | Data sets, Extraction pipelines, All | Use versioned extractor configuration files stored in the cloud. |

PI AF extractor

Extract data from the OSIsoft PI Asset Framework (PI AF).

| Capability type | Action | Scope | Description |
|---|---|---|---|
| RAW | raw:read, raw:write, raw:list | Tables, All | Ingest to CDF RAW and for a state store configured to use CDF RAW. |
| Extraction pipeline runs | extractionruns:write | Data sets, Extraction pipelines, All | Allow the extractor to report state and heartbeat back to CDF. |

PI Replace utility

Re-ingest time series into CDF by optionally deleting a range of data points and re-ingesting the data points from PI for that time range.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Time series | timeseries:read, timeseries:write | Data sets, All | Re-ingest time series into CDF. |
| RAW | raw:read, raw:write, raw:list | Tables, All | Ingest to CDF RAW and for a state store configured to use CDF RAW. |
| Events | events:read, events:write | Data sets, All | Log extractor incidents as events in CDF. |

DB extractor

Extract data from any database supporting Open Database Connectivity (ODBC) drivers.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| RAW | raw:read, raw:write, raw:list | Tables, All | Ingest to CDF RAW and for a state store configured to use CDF RAW. |
| Extraction pipeline runs | extractionruns:write | Data sets, Extraction pipelines, All | Allow the extractor to report state and heartbeat back to CDF. |
| Remote configuration files | extractionconfigs:write | Data sets, Extraction pipelines, All | Use versioned extractor configuration files stored in the cloud. |

OPC UA extractor

Extract time series, events, and asset data via the OPC UA protocol.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Time series | timeseries:read, timeseries:write | Data sets, All | Ingest time series. |
| Assets | assets:read, assets:write | Data sets, All | Use if the configuration parameters raw-metadata or skip-metadata aren't set. |
| Events | events:read, events:write | Data sets, All | Ingest events if enabled. |
| RAW | raw:read, raw:write, raw:list | Tables, All | Ingest metadata to CDF RAW, or if the state store is set to use CDF RAW. |
| Relationships | relationships:read, relationships:write | Data sets, All | Ingest relationships if enabled in the configuration. |
| Data sets | datasets:read | Data sets, All | Ingest the data set external ID if enabled in the configuration. |
| Extraction pipeline runs | extractionruns:write | Data sets, Extraction pipelines, All | Allow the extractor to report state and heartbeat back to CDF. |
| Remote configuration files | extractionconfigs:write | Data sets, Extraction pipelines, All | Use versioned extractor configuration files stored in the cloud. |

Studio for Petrel extractor

Connect to SLB Studio for Petrel through the Ocean SDK and stream Petrel object data to the CDF files service as protobuf objects.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Files | files:read, files:write, files:list | Data sets, All | Ingest SLB Studio for Petrel object data into CDF. |

WITSML extractor

Connect via the Simple Object Access Protocol (SOAP) and the Energistics Transfer Protocol (ETP) and extract Wellsite Information Transfer Standard Markup Language (WITSML) data into CDF.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Time series | timeseries:read, timeseries:write, timeseries:list | Data sets, All | Ingest WITSML growing objects, such as WITSML logs, into the CDF time series services. |
| Sequences | sequences:read, sequences:write, sequences:list | Data sets, All | Ingest WITSML growing objects, such as WITSML logs, into the CDF sequences services. |
| RAW | raw:read, raw:write, raw:list | Tables, All | Ingest WITSML non-growing objects, such as wellbore, into CDF RAW. |
| Extraction pipelines | extractionpipelines:write | Data sets, Extraction pipelines, All | Allow the extractor to report state and heartbeat back to CDF. |

EDM extractor

Connect to the Landmark Engineers Data Model server and extract data through the Open Data protocol (OData) from DecisionSpace Integration Server (DSIS) to CDF RAW.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| RAW | raw:read, raw:write, raw:list | Tables, All | Ingest data from the Landmark EDM model into CDF RAW. |

OSDU extractor

Connect to the Open Group OSDU™ Data Platform and ingest data into CDF.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| RAW | raw:read, raw:write, raw:list | Tables, All | Ingest configurable OSDU records, such as wells, wellbores, and well logs. |
| Files | files:read, files:write | Data sets, All | Ingest linked data files. |
| Extraction pipelines | extractionpipelines:read, extractionpipelines:write | Data sets, Extraction pipelines, All | Create and edit extraction pipelines. |
| Extraction pipeline runs | extractionruns:read, extractionruns:write | Data sets, Extraction pipelines, All | Allow the extractor to report state and heartbeat back to CDF. |
| Remote configuration files | extractionconfigs:write | Data sets, Extraction pipelines, All | Use versioned extractor configuration files stored in the cloud. |

File extractor

Connect to local file systems, SharePoint Online Document libraries, and network sharing protocols, like FTP, FTPS, and SFTP, and extract files into CDF.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Files | files:read, files:write | Data sets, All | Ingest files. |
| RAW | raw:read, raw:write, raw:list | Tables, All | For a state store configured to use CDF RAW. |

Documentum extractor

Extract documents from OpenText Documentum or OpenText D2 systems.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Files | files:read, files:write | Data sets, All | Ingest files. |
| RAW | raw:read, raw:write, raw:list | Tables, All | Ingest metadata to CDF RAW. |
| Extraction pipeline runs | extractionruns:write | Data sets, Extraction pipelines, All | Allow the extractor to report state and heartbeat back to CDF. |
| Remote configuration files | extractionconfigs:write | Data sets, Extraction pipelines, All | Use versioned extractor configuration files stored in the cloud. |

Simulator connectors

Integrate existing simulators with CDF to remotely run simulations.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Simulators | simulators:manage | Data sets, All | Connect to the simulators API. |
| Time series | timeseries:read, timeseries:write | Data sets, All | Use CDF time series as source/destination for simulation data. |
| Extraction pipelines | extractionpipelines:read, extractionpipelines:write | Data sets, All | Store the connector configuration remotely. |
| Files | files:read | Data sets, All | Download simulation model files. |
| Data sets | datasets:read | Data sets, All | View data sets. |

Simulators users

Set access to the user interface for integrating existing simulators with CDF to remotely run simulations.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Simulators | simulators:read, simulators:write, simulators:delete, simulators:run | Data sets, All | Connect to the simulators API. |
| Time series | timeseries:read | Data sets, All | Use CDF time series as source/destination for simulation data. |
| Files | files:read, files:write | Data sets, All | Upload and download simulation model files. |
| Data sets | datasets:read | Data sets, All | View data sets. |

Hosted extractors

Extract data from source systems using Cognite hosted extractors.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Hosted extractors | hostedextractors:read, hostedextractors:write | All | Monitor and manage hosted extractors. |
| Time series | timeseries:read, timeseries:write | Data sets, All | Use CDF time series as destination. |
| Events | events:read, events:write | Data sets, All | Use CDF events as destination. |
| RAW | raw:read, raw:write, raw:list | Data sets, All | Use CDF RAW as destination. |
| Assets | assets:read | Data sets, All | Contextualize time series or events with assets. |

PostgreSQL gateway

Ingest data into CDF using the Cognite PostgreSQL gateway.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Resource types | read, write | Data sets, All | Add read and write capabilities for the CDF resources you want to ingest data into. For instance, to ingest assets, add assets:read and assets:write. |

note

If you revoke the capabilities in the CDF group, you also revoke access for the PostgreSQL gateway.

Manage staged data

Work with tables and databases in CDF RAW.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| RAW | raw:read, raw:list | Tables, All | View tables and databases in CDF RAW. |
| RAW | raw:write | Tables, All | Create, update, and delete tables and databases in CDF RAW. |

Transform data

Transform data from RAW tables into the CDF data model.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Resource types | read and write actions for the CDF resources you want to read from and write to using transformations | Data sets, All | For instance, to transform data in CDF RAW and write data to assets, add raw:read and assets:write. |
| Transformations | transformations:read | All | View transformations. |
| Transformations | transformations:write | All | Create, update, and delete CDF transformations. |
| Sessions | sessions:create | All | Run scheduled transformations. You must also set the token_url. |
Legacy access to transformations via group name

To ensure backward compatibility, groups named transformations or jetfire are treated as having both transformations:read:All and transformations:write:All. We're deprecating this access control method and will remove it in a future release.

Upload 3D models

Upload and work with 3D models, 3D revisions and 3D files.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| 3D | 3d:read, files:read | Data sets, All | View 3D models. |
| 3D | 3d:create, files:read | Data sets, All | Upload 3D models to CDF. |
| 3D | 3d:update | Data sets, All | Update existing 3D models in CDF. |
| 3D | 3d:delete | Data sets, All | Delete 3D models. |

Extraction pipelines

Set up and monitor extraction pipelines and report the pipeline run history.

| User | Action | Capability | Description |
|---|---|---|---|
| End-user | Create and edit extraction pipelines | extractionpipelines:write | Gives access to create and edit individual pipelines and edit notification settings. Ensure that the pipeline has read access to the data set used by the extraction pipeline. |
| | View extraction pipelines | extractionpipelines:read | Gives access to list and view pipeline metadata. |
| | Create and edit extraction configurations | extractionconfigs:write | Gives access to create and edit an extractor configuration in an extraction pipeline. |
| | View extraction configurations | extractionconfigs:read | Gives access to view an extractor configuration in an extraction pipeline. |
| | View extraction logs | extractionruns:read | Gives access to view run history reported by the extraction pipeline runs. |
| Extractor | Read extraction configurations | extractionconfigs:read | Gives access to read an extractor configuration from an extraction pipeline. |
| | Post extraction logs | extractionruns:write | Gives access to post run history reported by the extraction pipeline runs. |
| Third-party actors | Create and edit extraction pipelines | extractionpipelines:write | Gives access to create and edit individual pipelines and edit notification settings. Ensure that the pipeline has read access to the data set used by the extraction pipeline. |
| | Create and edit extraction configurations | extractionconfigs:write | Gives access to create and edit the extractor configuration from an extraction pipeline. |

Match entities

Create and tune models to automatically contextualize resources.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Entity matching | entitymatchingAcl:read | All | List and view entity matching models. |
| Entity matching | entitymatchingAcl:write | All | Create, update, delete entity matching models. |
| Assets | assets:read | Data sets, All | Match entities to assets. |

Interactive engineering diagrams

Find, extract, and match tags on engineering diagrams and link them to an asset hierarchy or other resource types.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Files | files:read, files:write | Data sets, All | List and extract tags from engineering diagrams. |
| Assets | assets:read, assets:write | Data sets, All | Add tags to assets. |
| Events | events:read, events:write | Data sets, All | View and create annotations manually or automatically in the engineering diagrams. |
| Labels | labels:read, labels:write | Data sets, All | View, approve, and reject tags in the engineering diagrams. |

Diagram parsing

Analyze and detect symbols and tags on engineering diagrams and link them to an asset hierarchy or other resource type in CDF.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Diagram parsing | diagramParsing:READ | No scope | See the results of completed parsing jobs and read the outputs from parsed files. |
| Data model instances | dataModelInstances:READ | Space IDs, All | See the results of completed parsing jobs and read the outputs from parsed files. |
| Diagram parsing | diagramParsing:READ, diagramParsing:WRITE | No scope | Start a diagram parsing job. |
| Data model instances | dataModelInstances:READ, dataModelInstances:WRITE | Space IDs, All | Start a diagram parsing job. |
| Sessions | sessions:create | All | Start a diagram parsing job. |

Document parser

Extract data from documents, such as datasheets, equipment specifications, or process flow diagrams.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Data models | datamodels:read | Space IDs, All | Start a document parsing job or see the results of completed parsing jobs. |
| Data models | datamodels:write | Space IDs, All | Approve the results of completed parsing jobs. The data are saved in a data model instance. |
| Files | files:read | Data sets, All | Start a document parsing job or see the results of completed parsing jobs. |

Explore data and Image and video data

Find, validate, and learn about the data you need to build solutions in the Data explorer and Image and video management.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Resource types | files:read | Data sets, All | All resource types used in the CDF project. |
| Annotations | annotations:write | All | Create new or edit existing annotations. |

Data modeling

Create data models and ingest data into them.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Data models | datamodels:read | All | View data models. |
| Data models | datamodels:write | All | Create, update, delete data models. |
| Data model instances | datamodelinstances:read | All | View data in data models. |
| Data model instances | datamodelinstances:write | All | Create, update, delete data model instances. |

Data workflows

Use data workflows to coordinate interdependent processes.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Data workflows | workfloworchestration:read | Data sets, All | View data workflows, versions, executions, tasks, and triggers. |
| Data workflows | workfloworchestration:write | Data sets, All | Create, update, delete, and run data workflows, versions, executions, tasks, and triggers. |

Functions

Deploy Python code to CDF and call the code on-demand or schedule the code to run at regular intervals.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Functions | functions:write | All | Create, call, and schedule functions. |
| Functions | functions:read | All | Retrieve and list functions, retrieve function responses and function logs. |
| Files | files:read | All | View functions. |
| Files | files:write | All | Create functions. |
| Sessions | sessions:create | All | Call and schedule functions. |
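To see why these capabilities go together, the sketch below deploys and calls a small function, assuming the Cognite Python SDK: creating the function requires functions:write plus files:write (the table above notes that files capabilities are needed to create and view functions), and calling it requires functions:write and sessions:create. The function name and payload are placeholders.

```python
from cognite.client import CogniteClient

client = CogniteClient()  # assumes project and credentials are configured elsewhere


def handle(client, data):
    # Runs inside CDF when the function is called; echoes the payload back (illustrative only).
    return {"echo": data}


# Creating the function requires functions:write and files:write (see the table above).
fn = client.functions.create(
    name="echo-function",          # placeholder name
    external_id="echo-function",   # placeholder external ID
    function_handle=handle,
)

# Calling the function requires functions:write and sessions:create (see the table above).
call = fn.call(data={"hello": "world"})
print(call.get_response())
```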

Streamlit apps

Build custom web applications in CDF.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Files | files:read, files:write | Data sets, All | Edit Streamlit applications. |
| Files | files:read | Data sets, All | Use Streamlit applications. |

Canvas

Add assets, engineering diagrams, sensor data, images, and 3D models from your CDF project to a canvas.

The first time you set up a canvas, you must:

  • Grant datamodels:read with the scope set to All for admin users.

  • Create the IndustrialCanvasInstanceSpace and CommentInstanceSpace spaces (a programmatic sketch follows this list). If you don't create these spaces, CDF prompts you to add them when you open a canvas.
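A minimal way to create the two spaces programmatically, assuming the Cognite Python SDK's data modeling API, is sketched below; the space IDs match the names listed above, and the descriptions are placeholders.

```python
from cognite.client import CogniteClient
from cognite.client.data_classes.data_modeling import SpaceApply

client = CogniteClient()  # assumes project and credentials are configured elsewhere

# Create (or update) the instance spaces Canvas expects for canvas and comment data.
client.data_modeling.spaces.apply(
    [
        SpaceApply(space="IndustrialCanvasInstanceSpace", description="Industrial Canvas instances"),
        SpaceApply(space="CommentInstanceSpace", description="Canvas comment instances"),
    ]
)
```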

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Data models | datamodels:read | cdf_industrial_canvas, cdf_apps_shared, IndustrialCanvasInstanceSpace, CommentInstanceSpace | View data from data models within IndustrialCanvasInstanceSpace and CommentInstanceSpace. |
| Data model instances | datamodelinstances:read | cdf_industrial_canvas, cdf_apps_shared, IndustrialCanvasInstanceSpace, CommentInstanceSpace | View canvases. |
| Data model instances | datamodelinstances:write | cdf_industrial_canvas, cdf_apps_shared, IndustrialCanvasInstanceSpace, CommentInstanceSpace | Create, update, delete canvas data. |
| Resource types | read actions for the resource types you want to add to a canvas | Data sets, All | View assets, events, files, time series, and 3D models. |
| Files | files:write | Data sets, All | Add local files on a canvas. This is an optional capability. |
Mention coworkers and add comments

You must set up the CommentInstanceSpace space and enable user profiles to comment or mention coworkers on a canvas.

Private canvases

Private canvases aren't governed by CDF access management.

Charts

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Assets | assets:read | Data sets, All | Search for assets. |
| Time series | timeseries:read | Data sets, All | Search for time series. |
| Time series | timeseries:write | Data sets | Needed to schedule calculations. |
| Files | files:read | Data sets, All | Search for files. |
| Groups | groups:list | Current user, All | Use calculations in Charts. |
| Projects | projects:list | All | Use calculations in Charts. |
| Sessions | sessions:list | All | Needed for monitoring and to schedule calculations. |
| Sessions | sessions:create | All | Needed for monitoring and to schedule calculations. |
| Sessions | sessions:delete | All | Needed for monitoring and to schedule calculations. |

Data sets and Data catalog

Use the Data sets capability type to grant users and applications access to add or edit metadata for data sets.

To add or edit data within a data set, use the relevant resource type capability. For instance, to write time series to a data set, use the Time series capability type.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Data sets | datasets:read | Data sets, All | View data sets. |
| Data sets | datasets:write | Data sets, All | Create or edit data sets. |

Configure location filters

Configure location filters to help users find and select data related to a specific physical location.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Location filters | locationfilters:read | Location filters, All | View location filters. |
| Location filters | locationfilters:write | Location filters, All | Configure, edit, and delete location filters. |

Configure InField

Set up the InField application.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Assets | assets:read | Data sets, All | View asset data from the CDF project that InField runs on top of. |
| Groups | groups:read | Current user, All | For InField administrators to grant access to users. |
| 3D | 3d:read | Data sets, All | Upload 3D models to be displayed in InField. |
| Files | files:write | Data sets | Allow users to upload images. |
| Time series | timeseries:write | Data sets, Time series, Root assets, All | Allow users to upload measurement readings. |

Add the InField admin users to an access group named applications-configuration.

Configure InRobot

Configure access for users

Set up the InRobot application.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Assets | assets:read | Data sets, All | Find asset tags of equipment the robot works with and view asset data. |
| Data models | datamodels:read | Space IDs: APM_Config, cdf_core, cdf_apm, and cdf_apps_shared | View data models. |
| Data model instances | datamodelinstances:read | Space IDs: cognite_app_data, cdf_apm, <yourRootLocation_source_data>¹, <yourRootLocation_app_data>² | View data in data models. |
| Events | events:read | Data sets, All | View events in the canvas. |
| Files | files:read, files:write | Data sets, All | Allow users to upload images. |
| Groups | groups:read, groups:create, groups:update | Data sets, All | For InRobot administrators to grant access to users. |
| 3D | 3d:read | Data sets, All | Upload 3D models of maps. |
| Projects | projects:read, projects:list | Data sets, All | Extract the projects the user has access to. |
| Robotics | robotics:read, robotics:create, robotics:update, robotics:delete | Data sets, All | Control robots and access robotics data. |
| Time series | timeseries:read | Data sets, All | Allow users to view measurement readings as time series. |

Configure access for robots

Set up a robot's access for the InRobot application.

| Capability type | Action | Scope | Description |
|---|---|---|---|
| Assets | assets:read | Data sets, All | For the robot to find asset tags of equipment so that the data collected can be connected to the corresponding asset. |
| Data models | datamodels:read, datamodels:write | Select Space IDs: APM_Config, cdf_core, cdf_apm, and cdf_apps_shared. Otherwise, All. | For the robot to write the robotics data to the APM data model. |
| Data model instances | datamodelinstances:read | Select Space IDs: APM_Config. Otherwise, All. | For the robot to write the robotics data to the APM data model. |
| Data model instances | datamodelinstances:read, datamodelinstances:write | Select Space IDs: cognite_app_data, cdf_apm, <yourRootLocation_source_data>¹, <yourRootLocation_app_data>². Otherwise, All. | For the robot to write the robotics data to the APM data model. |
| Files | files:read | All | For the robot to download robot-specific files, such as maps, and to read or access uploaded robot data. |
| Files | files:write | Robot data set ID | For the robot to upload the data it collects to CDF. |
| Labels | labels:read, labels:write | All | For the robot to label data according to how the data should be processed in CDF. |
| Robotics | robotics:read, robotics:create, robotics:update, robotics:delete | Data sets (robot's data set ID) | For the robot to access robotics data. |

Footnotes

  1. The <yourRootLocation_source_data> space holds data coming from a customer source system for a particular location. It can be used for both InRobot and InField.

  2. The <yourRootLocation_app_data> space holds data coming from InRobot for a particular location. It can be used for both InRobot and InField.