AWS Step Functions history event limitation

Backend · open · 2 answers · 1861 views
无人及你 · 2021-01-13 06:02

I use Step Functions for a big loop. So far no problem, but the day my loop exceeded 8000 executions I ran into the error "Maximum execution history size exceeded", which is a hard limit of 25,000 events per execution history.

2 answers
  •  长情又很酷
    2021-01-13 06:31

    To guarantee that all steps execute, and in the right order, Step Functions records the execution history after each state completes. This bookkeeping is the reason for the limit on execution history size.

    Having said that, one way to mitigate this limit is to follow @sunnyD's answer (chaining executions: the loop hands off to a fresh execution before the history fills up). However, it has the limitations below:

    1. The invoker of the step function (if there is one) will not get the output of the complete run; it only gets the output of the first execution in the chain.
    2. The execution history limit may well change in future versions, so hard-coding logic around this number means updating your code or configuration every time the limit is raised or lowered.
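    The chained-execution mitigation can be sketched as below. This is an illustrative sketch, not code from the answer: the function names, the safety margin, and the idea of tracking an event count in the loop state are all assumptions; the 25,000-event limit is the documented Step Functions quota at the time of writing.

    ```python
    import json

    HISTORY_EVENT_LIMIT = 25_000  # documented Step Functions limit on history events
    SAFETY_MARGIN = 1_000         # assumption: leave headroom for the handoff states

    def should_start_new_execution(events_so_far: int,
                                   limit: int = HISTORY_EVENT_LIMIT,
                                   margin: int = SAFETY_MARGIN) -> bool:
        """Decide, inside the loop, whether to hand off to a fresh
        execution before the current one hits the history limit."""
        return events_so_far >= limit - margin

    def continue_as_new(state_machine_arn: str, loop_state: dict) -> str:
        """Start a fresh execution of the same state machine, carrying the
        loop state forward. Needs AWS credentials; boto3 is imported lazily
        so the pure helper above stays testable offline."""
        import boto3
        sfn = boto3.client("stepfunctions")
        resp = sfn.start_execution(
            stateMachineArn=state_machine_arn,
            input=json.dumps(loop_state),
        )
        return resp["executionArn"]
    ```

    A Choice state (or a Lambda task) would call `should_start_new_execution` each iteration and route to the handoff task when it returns true; this is why the invoker only ever sees the first execution's output.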

    Another alternative is to arrange the step functions as parent and child. In this arrangement, the parent state machine contains a task that loops through the entire data set and starts a child execution for each record, or for each batch of records sized so that a child's history stays under the limit. A second step in the parent waits for a period of time, then checks CloudWatch metrics for the completion of all child executions and exits with the output.
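    The parent's fan-out task can be sketched as below. This is a sketch under assumptions: the function names, the batch size of 1000, and the input shape `{"records": [...]}` are illustrative, not from the answer; only `start_execution` is the real boto3 call.

    ```python
    import json

    def chunk(records, size):
        """Split records into batches small enough that a single child
        execution's history stays under the limit (size must be tuned
        to how many events each record generates)."""
        for i in range(0, len(records), size):
            yield records[i:i + size]

    def fan_out(child_state_machine_arn, records, batch_size=1000):
        """Parent task: start one child execution per batch and return the
        child ARNs. Needs AWS credentials; boto3 is imported lazily so the
        chunking logic stays testable offline."""
        import boto3
        sfn = boto3.client("stepfunctions")
        arns = []
        for batch in chunk(records, batch_size):
            resp = sfn.start_execution(
                stateMachineArn=child_state_machine_arn,
                input=json.dumps({"records": batch}),
            )
            arns.append(resp["executionArn"])
        return arns
    ```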

    A few things to keep in mind about this solution:

    1. The StartExecution API throttles with a token bucket of size 500, refilled at 25 per second (check the current quotas for your region).
    2. Make sure the wait time in the parent is long enough for the child executions to finish; otherwise, implement a loop that checks for child completion.
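    Both caveats can be sketched in code: a client-side token bucket that paces `StartExecution` calls to the quoted quota, and a polling loop (using the real `describe_execution` call) that replaces the fixed wait. The class and function names are illustrative assumptions, and the 500/25 numbers come from the answer above, so verify them against current quotas.

    ```python
    import time

    class TokenBucket:
        """Client-side pacing matching the quoted StartExecution quota
        (bucket size 500, refill 25/second)."""
        def __init__(self, capacity=500, refill_per_sec=25):
            self.capacity = capacity
            self.refill = refill_per_sec
            self.tokens = float(capacity)
            self.last = time.monotonic()

        def acquire(self):
            """Take one token, sleeping if the bucket is empty."""
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.refill)
            self.last = now
            if self.tokens < 1:
                time.sleep((1 - self.tokens) / self.refill)
                self.tokens = 1.0
            self.tokens -= 1

    def wait_for_children(sfn, execution_arns, poll_seconds=30):
        """Instead of a fixed wait, poll each child execution until none
        is RUNNING. `sfn` is a boto3 Step Functions client."""
        pending = set(execution_arns)
        while pending:
            for arn in list(pending):
                status = sfn.describe_execution(executionArn=arn)["status"]
                if status != "RUNNING":
                    pending.discard(arn)
            if pending:
                time.sleep(poll_seconds)
    ```

    Calling `bucket.acquire()` before each `start_execution` keeps the parent under the throttle; `wait_for_children` then bounds the "wait time" caveat by polling rather than guessing a duration.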
