For large point clouds, we don’t recommend uploading through the CDF UI. Those uploads can take a long time and may fail if your computer loses its internet connection or goes into sleep mode.
You don’t need to compress the files into a single zip file; instead, specify the folder containing your point cloud files.

Prerequisites

Install Python and the Cognite Python SDK in your environment. For more information, see Get started with Python SDK.
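If the SDK isn't installed in your environment yet, a typical installation with pip looks like this (a sketch, assuming pip is available on your path):

```shell
# Install the Cognite Python SDK from PyPI
pip install cognite-sdk
```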

Run the upload script

Use the Python script below to upload point clouds from a folder. It uploads the files, creates a 3D model and revision, and starts processing. When you run the script, sign in through the browser window that opens. To rename the model, change the name in the Create the 3D model section of the script.
The upload can take a significant amount of time. Don’t interrupt the script or shut down your computer until it completes.
from cognite.client import global_config, ClientConfig, CogniteClient
from cognite.client.data_classes import ThreeDModelRevisionWrite
from cognite.client.credentials import OAuthInteractive
import json

# Note: Input the details for your environment
# ******************************************************

TENANT_ID = '<TENANT_ID>'
CLIENT_ID = '<CLIENT_ID>'
CDF_CLUSTER = '<CDF_CLUSTER>'
COGNITE_PROJECT = '<PROJECT NAME>'
DIRECTORY_NAME = "<Specify the directory containing all point cloud files>"
AUTHORITY_URL = f"https://login.microsoftonline.com/{TENANT_ID}"

# ******************************************************

BASE_URL = f"https://{CDF_CLUSTER}.cognitedata.com"

# Authenticate interactively with OAuth against Microsoft Entra ID
token_provider = OAuthInteractive(
    authority_url=AUTHORITY_URL,
    client_id=CLIENT_ID,
    scopes=[f"{BASE_URL}/.default"],
)

client_config = ClientConfig(
    client_name="upload point cloud script",
    project=COGNITE_PROJECT,
    credentials=token_provider,
    base_url=BASE_URL,
    timeout=300,
)

global_config.default_client_config = client_config
client = CogniteClient()

# Upload every point cloud file in the directory as an individual file
print(f"Uploading files from {DIRECTORY_NAME} .... (Uploading is in progress; it can take hours if the files are large or there are many files)")
directory_upload_result = client.files.upload(DIRECTORY_NAME, recursive=True)
print(f"Number of files uploaded: {len(directory_upload_result)}")

# Create and upload file_id list json file
file_ids = [file.id for file in directory_upload_result]
file_content = {"dataType": "PointCloud", "fileIds": file_ids}
file_name = "FileIdsList.json"
print(f"Uploading {file_name} ....")
with open(file_name, 'w') as outfile:
    json.dump(file_content, outfile)

file_upload_result = client.files.upload(file_name)

# Create the 3D model
model_name = "PointCloud Model"
print(f"Creating 3D model {model_name} ....")
model = client.three_d.models.create(name=model_name)

# Create the 3D revision for the model.
# This will start the processing job.
# Note: The processing time depends on the total size of the files.
print(f"Creating revision for model {model_name} ....")
new_revision_config = ThreeDModelRevisionWrite(file_id=file_upload_result.id)
created_revision = client.three_d.revisions.create(model_id=model.id, revision=new_revision_config)

print("Summary:")
print(f"File id     : {file_upload_result.id}")
print(f"File name   : {file_name}")
print(f"Model name  : {model_name}")
print(f"Model id    : {model.id}")
print(f"Revision id : {created_revision.id}")
print("Check the processing status for the 3D model revision in Cognite Data Fusion UI.")
print("Once the processing is completed, you can select the model revision to visualize it.")
When the script finishes, go to CDF > Data management > Configure > 3D and verify that 3D revision processing has started. When processing is complete, select the revision to visualize the model.
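Instead of checking in the UI, you can also poll the revision's processing state from a script. The sketch below assumes the authenticated `client`, `model`, and `created_revision` objects from the script above, and assumes the SDK reports status strings such as "Queued", "Processing", "Done", and "Failed" (treat the exact values as an assumption):

```python
import time

# Terminal processing states for a 3D revision (assumed status values)
TERMINAL_STATES = {"Done", "Failed"}

def is_finished(status: str) -> bool:
    """Return True once revision processing has reached a terminal state."""
    return status in TERMINAL_STATES

# Example polling loop (requires the authenticated `client`, `model`, and
# `created_revision` objects from the upload script):
# while True:
#     revision = client.three_d.revisions.retrieve(model_id=model.id, id=created_revision.id)
#     print(f"Revision status: {revision.status}")
#     if is_finished(revision.status):
#         break
#     time.sleep(60)
```

If processing ends in the "Failed" state, check the revision's log in the CDF UI for details.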
Last modified on April 10, 2026