curl --request POST \
  --url https://{cluster}.cognitedata.com/api/v1/projects/{project}/streams/{streamId}/records/upsert \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '
{
  "items": [
    {
      "space": "mySpace",
      "externalId": "some_id",
      "sources": [
        {
          "source": {
            "type": "container",
            "space": "mySpace",
            "externalId": "myContainer"
          },
          "properties": {
            "someStringProperty": "someStringValue",
            "someDirectRelation": {
              "space": "mySpace",
              "externalId": "someNode"
            },
            "someIntArrayProperty": [1, 2, 3, 4]
          }
        }
      ]
    }
  ]
}
'

Required capabilities:
StreamRecordsAcl:WRITE, DataModelsAcl:READ
Note: This endpoint is only available for mutable streams.
Before a user can upsert records, all referenced schema elements (Spaces, Containers, Properties) must be defined/created in Data Modeling.
To upsert records, the user must have capabilities to access (read) the referenced containers as well as capabilities to access (write) the referenced record space.
This endpoint ingests or updates batches of records in a stream.
On upsert, each record is created or updated, and the properties defined for each container in the sources array are populated.
If a record's identifier (space + externalId) already exists in the stream, the record is fully replaced; this endpoint does not support partial updates.
Otherwise, a new record is created.
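The full-replacement semantics can be illustrated with a small in-memory model (this is only a sketch of the behavior described above, not the Cognite implementation):

```python
# Illustrative model of upsert semantics: records are keyed by
# (space, externalId), and an upsert fully replaces an existing
# record's properties rather than merging into them.
from typing import Dict, Tuple

Stream = Dict[Tuple[str, str], dict]

def upsert_records(stream: Stream, items: list) -> None:
    """Create or fully replace each record in the stream."""
    for item in items:
        key = (item["space"], item["externalId"])
        # Full replacement: properties absent from this request do not
        # survive from any previous version of the record.
        stream[key] = {"sources": item["sources"]}

stream: Stream = {}
upsert_records(stream, [{
    "space": "mySpace",
    "externalId": "some_id",
    "sources": [{"source": {"type": "container", "space": "mySpace",
                            "externalId": "myContainer"},
                 "properties": {"someStringProperty": "someStringValue"}}],
}])
# Upserting the same identifier again replaces the record entirely:
# someStringProperty from the first request does not survive.
upsert_records(stream, [{
    "space": "mySpace",
    "externalId": "some_id",
    "sources": [{"source": {"type": "container", "space": "mySpace",
                            "externalId": "myContainer"},
                 "properties": {"someIntArrayProperty": [1, 2, 3, 4]}}],
}])
print(len(stream))  # 1 — both requests targeted the same (space, externalId)
```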
Note: The maximum total request size is 10MB.
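The curl request above can also be assembled programmatically. The following Python sketch builds the URL, headers, and body; the cluster, project, and token values are placeholders to be filled in by the caller:

```python
# Minimal helper that assembles the upsert request; the caller supplies
# real cluster/project/stream/token values (placeholders used below).
import json

def build_upsert_request(cluster: str, project: str, stream_id: str,
                         token: str, items: list) -> dict:
    """Assemble URL, headers, and JSON body for the records/upsert endpoint."""
    return {
        "url": (f"https://{cluster}.cognitedata.com/api/v1/projects/"
                f"{project}/streams/{stream_id}/records/upsert"),
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "data": json.dumps({"items": items}),
    }

items = [{
    "space": "mySpace",
    "externalId": "some_id",
    "sources": [{
        "source": {"type": "container", "space": "mySpace",
                   "externalId": "myContainer"},
        "properties": {"someStringProperty": "someStringValue"},
    }],
}]
req = build_upsert_request("mycluster", "myproject", "test1", "<token>", items)
# Send with e.g. requests.post(req["url"], headers=req["headers"], data=req["data"])
```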
Access token issued by the CDF project's configured identity provider. Access token must be an OpenID Connect token, and the project must be configured to accept OpenID Connect tokens. Use a header key of 'Authorization' with a value of 'Bearer $accesstoken'. The token can be obtained through any flow supported by the identity provider.
An identifier of the stream where the records are stored.
1 - 100 characters, matching ^[a-z]([a-z0-9_-]{0,98}[a-z0-9])?$. Example: "test1"
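A candidate stream identifier can be checked against the documented pattern before making a request. This is an illustrative helper, not part of any SDK:

```python
# Validate a stream identifier against the documented constraint:
# 1-100 characters, lowercase letter first, then [a-z0-9_-],
# ending in a lowercase letter or digit.
import re

STREAM_ID_PATTERN = re.compile(r"^[a-z]([a-z0-9_-]{0,98}[a-z0-9])?$")

def is_valid_stream_id(stream_id: str) -> bool:
    # The {0,98} middle segment plus the first and last characters
    # caps the total length at 100.
    return STREAM_ID_PATTERN.match(stream_id) is not None

print(is_valid_stream_id("test1"))  # True
print(is_valid_stream_id("Test1"))  # False: uppercase not allowed
print(is_valid_stream_id("a-"))     # False: must end in [a-z0-9]
```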
Records to upsert into a stream.
List of records to write.
1 - 1000 elements
An empty response is returned when all records in the request are successfully accepted for creation or update.
An empty JSON object ({}) indicates that all records were accepted.