
Workflow limits

henk_stobbe
DynaMight Leader

Good morning,

Does anybody know what the constraints are for workflows? In other words, what would happen if I trigger a workflow many times in a very short time?

I assume this is resolved by using a queue with a "full is full" principle, removing the ones that have been waiting too long?

So, in short: how many instances of my workflows run concurrently?

KR Henk

 

4 REPLIES

p_devulapalli
Champion

@henk_stobbe - It doesn't look like there is a hard limit as per the docs; it's mostly to do with system capacity:

A workflow execution is the instantiation of a workflow. There can be as many executions of a workflow running at any given time as requested (within system capacity). A workflow execution is triggered manually via Workflows, automatically via an API call, a trigger event, or according to schedule.

https://docs.dynatrace.com/docs/shortlink/workflows-running#workflow-execution
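For reference, triggering an execution programmatically is just an authenticated POST to the workflow's run endpoint. A minimal sketch of building that request (the endpoint path, payload shape, and token handling here are my assumptions based on the docs linked above, so verify them against your environment):

```python
import json
import urllib.request

def build_run_request(base_url, workflow_id, token, params=None):
    """Build the POST request that starts one execution of a workflow.

    Endpoint path and body shape are assumptions based on the
    Automation API docs -- check them against your environment.
    """
    url = f"{base_url}/platform/automation/v1/workflows/{workflow_id}/run"
    body = json.dumps({"params": params or {}}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Usage sketch:
# req = build_run_request("https://abc123.apps.dynatrace.com", "<workflow-id>", token)
# with urllib.request.urlopen(req) as resp:
#     execution = json.load(resp)
```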

 

Phani Devulapalli

ChristopherHejl
Dynatrace Advisor

Hi Henk,

Once a workflow is triggered it will end up in a queue, correct. However, if the triggering is successful, it will not be removed from the queue but will get executed once it reaches the front of the queue.
Queue wait times tend to be very low, though: a few seconds at most, even under high load.

Workflows do have a maximum execution limit per hour, which is currently only enforced for event triggers, but will soon apply to the whole workflow rather than per trigger type (both as a safeguard for us and for our customers in case of a misconfiguration).
This limit is 1,000 executions per workflow per hour. Once you reach that limit, the workflow is throttled, and any attempt to trigger it immediately gets a 429 error.
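If you are triggering workflows from your own code, it is worth handling that 429 with a backoff instead of hammering the queue. A rough sketch of generic retry logic (the status-code handling is not Dynatrace-specific; `trigger` stands in for whatever call you make):

```python
import random
import time

def backoff_delay(attempt, base=1.0, cap=60.0):
    """Exponential backoff with jitter: ~1s, ~2s, ~4s, ... capped at 60s."""
    return min(cap, base * (2 ** attempt)) * (0.5 + random.random() / 2)

def trigger_with_retry(trigger, max_attempts=5):
    """Call trigger() (any callable returning an HTTP status code)
    until it stops answering 429 or we give up."""
    for attempt in range(max_attempts):
        status = trigger()
        if status != 429:              # not throttled -> done
            return status
        time.sleep(backoff_delay(attempt))  # throttled -> wait and retry
    raise RuntimeError("workflow still throttled after retries")
```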

zero_ho
Dynatrace Participant

Hi @ChristopherHejl

May I ask if there is any tenant-wide workflow execution limit?

Say, can I have a total of 10,000 executions per hour (triggered from, say, 50 different workflows), where each execution runs JavaScript code to pull data from external sources and ingest the massaged data into the tenant (as metrics)?

ChristopherHejl
Dynatrace Advisor

There are no execution limits outside of the above-mentioned 1k / workflow / hour.
Yes, you can easily run 10k executions in an hour spread across multiple workflows.

I would advise against using workflows as a data poller for data ingest / metric ingest.
You will quickly run into limitations, sync issues, etc.

You can use workflows to pull data in a limited, non-growing capacity, e.g. pulling Jira statistics once a day and the like. Syncing or polling data that can grow will usually lead to issues quickly, as the automation engine is a process orchestrator, not a data pipeline.
Ideally you'd use the data ingest tools if available, like the agents, ActiveGates, extensions, metric extraction in the pipeline (e.g. from bizevents or logs), etc.
Also, if you want to sync data from a remote system that we don't support yet, ideally the remote system has some sort of data push method (e.g. via webhooks) to push data directly to the APIs or the data pipeline. Note: you can use long-lived platform tokens for these processes when OAuth authentication is not an option.
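For the push approach, the metric ingestion API accepts a simple line protocol (`metric.key,dim=value payload`). A small helper for building those lines, assuming the standard `/api/v2/metrics/ingest` endpoint (the metric key and dimensions below are made up for illustration):

```python
def metric_line(key, dimensions, value):
    """Render one line of the metric ingestion line protocol:
    metric.key,dim1=val1,dim2=val2 value
    (format assumed from the metric ingestion docs)."""
    dims = ",".join(f"{k}={v}" for k, v in sorted(dimensions.items()))
    return f"{key},{dims} {value}" if dims else f"{key} {value}"

# Usage sketch: join the lines with "\n" and POST them as text/plain to
# <environment>/api/v2/metrics/ingest, authenticating with your token.
# lines = [metric_line("custom.jira.open_issues", {"project": "AB"}, 17)]
```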

Best,

Christopher
