Uber's Cadence documentation says that a single workflow cannot be expected to run more than 100k activities over its lifetime: https://cadenceworkflow.io/docs/concepts/workflows/#child-workflow
> A single workflow has a limited size. For example, it cannot execute 100k activities. Child workflows can be used to partition the problem into smaller chunks. One parent with 1000 children each executing 1000 activities is 1 million executed activities.
I was wondering whether Cadence imposes a similar limit on the number of workflow state changes caused by receiving many signals.
In my use case, I have long-running (months to years) workflows, each of which tracks a user's activity for a game system. The total number of signals each user workflow receives can exceed 100k every few days, which leads to more than 100k workflow state changes.
The thing is that each signal won't necessarily invoke an activity call in my business logic, so the total number of activities invoked per workflow stays low (e.g. around 100 activity calls total while more than 100k signals are received).
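To make the ratio concrete, here is a rough back-of-the-envelope sketch. The per-event costs are my own assumptions, not numbers from the Cadence docs; the point is only that in this scenario, signal events rather than activity events would dominate history growth:

```python
# Rough estimate of workflow history growth for the scenario above.
# ASSUMPTION (not from the Cadence docs): each received signal adds
# roughly 2 events to the workflow history (the signal event plus
# decision-task churn), and each activity call adds roughly 5
# (scheduled/started/completed plus decision-task events).
EVENTS_PER_SIGNAL = 2
EVENTS_PER_ACTIVITY = 5

signals_received = 100_000   # every few days, per the use case above
activities_called = 100      # total activity calls in the same window

signal_events = signals_received * EVENTS_PER_SIGNAL      # 200,000
activity_events = activities_called * EVENTS_PER_ACTIVITY # 500
total_events = signal_events + activity_events            # 200,500

print(f"signal-driven events:   {signal_events:,}")
print(f"activity-driven events: {activity_events:,}")
print(f"total history events:   {total_events:,}")
```

Even if my per-event assumptions are off by a factor of a few, the history length is driven almost entirely by signals, which is exactly why I'm unsure whether the documented 100k guideline applies.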
- In this scenario, will Cadence throw an error because the workflow's internal history becomes too long from tracking 100k+ received signals, even though there are only hundreds of activity calls, per the guideline in the doc?
It wasn't clear to me whether the doc's 100k-activity limit applies only to the total number of activity calls or to the total number of all state changes (activities, signals, queries, etc.).