I am working with a complex Google Cloud Dataprep process. To keep it manageable, I split it into separate modules and linked them together with referenced datasets.
When I run the modules separately they work fine, but when I run them together the job fails with the following error and never starts:
Creating the Dataflow job failed unexpectedly.
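In case it helps, this is roughly the check I run after triggering the combined flow, to see whether a Dataflow job gets created at all. It is only a sketch: the project ID and region are placeholders, and it assumes google-api-python-client is installed with Application Default Credentials configured.

from googleapiclient.discovery import build

PROJECT_ID = "my-project"   # placeholder, not my real project
REGION = "us-central1"      # placeholder, not my real region

# Dataflow REST API (v1b3) client via the discovery service.
dataflow = build("dataflow", "v1b3")

# List the most recent Dataflow jobs in the region and print their states,
# to see whether the combined Dataprep run ever reaches job creation.
response = (
    dataflow.projects()
    .locations()
    .jobs()
    .list(projectId=PROJECT_ID, location=REGION, pageSize=10)
    .execute()
)

for job in response.get("jobs", []):
    print(job["name"], job.get("currentState"), job.get("createTime"))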
Is there a limit on the number of recipes in Dataprep, or on the number of jobs/transformations in Dataflow, that could explain this?