Triggers for data workflows
Triggers automate the execution of your data workflows.
See the Workflow triggers specification for more information about the API.
To control access to triggers, use the access capabilities workfloworchestration:read and workfloworchestration:write.
A data workflow can have one or more triggers associated with it. Each trigger is uniquely identified by an externalId.
Trigger rule
When you create a trigger, specify a triggerRule that defines the conditions that must be met to run the trigger. The triggerRule consists of a trigger type and its associated parameters.
Schedule
Use the schedule trigger type to run a data workflow at regular intervals. The interval is specified by a cron expression. For example, to run the trigger every day at 12:00 AM, use the cron expression "0 0 * * *". The time zone for the cron expression is UTC.
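To illustrate what this schedule means in practice, here is a minimal, stdlib-only sketch (not part of the SDK or the trigger service) that computes the next few UTC fire times for the expression "0 0 * * *", i.e., daily at midnight UTC:

```python
from datetime import datetime, timedelta, timezone


def next_daily_midnight_runs(now: datetime, count: int) -> list[datetime]:
    """Next `count` fire times for the cron expression "0 0 * * *" (daily at 00:00 UTC)."""
    first = (now + timedelta(days=1)).replace(hour=0, minute=0, second=0, microsecond=0)
    return [first + timedelta(days=i) for i in range(count)]


now = datetime(2024, 5, 1, 15, 30, tzinfo=timezone.utc)
for run in next_daily_midnight_runs(now, 3):
    print(run.isoformat())  # midnight UTC on May 2, 3, and 4
```

The actual trigger service evaluates the cron expression itself; this sketch only shows which moments a daily-midnight schedule selects.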
Data modeling events
This feature is currently in beta testing and is subject to change.
Use the dataModeling trigger type to run a data workflow based on changes to data modeling instances matching a filter. See Trigger types in the API specification for more information.
Specify the filter using the trigger's dataModelingQuery parameter. The query runs at regular intervals to poll for changes. See Data modeling queries for more information.
You can control the batching of changes using the following parameters:
- batchSize: The maximum number of items to pass as input to a single workflow execution.
- batchTimeout: The maximum time, in seconds, to wait for the batch to fill up before passing it to a workflow execution. After the batchTimeout has passed, a partial batch is passed to the workflow execution. A complete batch is passed to the workflow execution without further delay.
Example
This example shows how to create a trigger using the Python SDK. The trigger starts a workflow based on changes to instances of ExampleView with the property site equal to "Site A". With the batch configuration below, the trigger starts the workflow when it has collected 100 instance changes, or 60 seconds after the first change, whichever comes first.
from cognite.client import CogniteClient
from cognite.client.data_classes.data_modeling import ViewId
from cognite.client.data_classes.data_modeling.query import (
    NodeResultSetExpression,
    Select,
    SourceSelector,
)
from cognite.client.data_classes.filters import Equals
from cognite.client.data_classes.workflows import (
    WorkflowDataModelingTriggerRule,
    WorkflowTriggerDataModelingQuery,
    WorkflowTriggerUpsert,
)

client = CogniteClient()

# Specify the relevant data model view
example_view_id = ViewId(space="ExampleSpace", external_id="ExampleView", version="1")

# Define the trigger rule, including the data modeling query
trigger_rule = WorkflowDataModelingTriggerRule(
    data_modeling_query=WorkflowTriggerDataModelingQuery(
        # Query instances of ExampleView with property site="Site A"
        with_={
            "example_view": NodeResultSetExpression(
                filter=Equals(
                    property=example_view_id.as_property_ref("site"), value="Site A"
                )
            )
        },
        # Return properties "site" and "name" from the instances
        select={
            "example_view": Select(
                sources=[
                    SourceSelector(source=example_view_id, properties=["site", "name"])
                ]
            )
        },
    ),
    batch_size=100,  # Start the workflow when reaching 100 changes
    batch_timeout=60,  # Start the workflow 60 seconds after the first change
)

# Create the trigger
client.workflows.triggers.upsert(
    WorkflowTriggerUpsert(
        external_id="my-data-modeling-trigger",
        trigger_rule=trigger_rule,
        workflow_external_id="my-workflow-external-id",
        workflow_version="v1",
    )
)
Trigger target
The trigger targets a data workflow, identified by a workflowExternalId and a workflowVersion.
Input and authentication
You can define an input
data object as part of the trigger. When the trigger starts executing the target data workflow, this input object will be provided as input to the workflow.
For authentication, the trigger requires a nonce: a temporary token used to authenticate when the data workflow execution starts. You can retrieve a nonce from the Sessions API when creating a session.
Trigger run history
You can retrieve a trigger's run history. This gives you detailed information about each run, such as when it happened, which data workflow it successfully started, or why the trigger run failed.