As part of a POC I am trying to set up some data quality checks through Dataprep. The source is a BigQuery table, and the job should write its output to another BigQuery table. Unfortunately the job fails with this error:
java.lang.RuntimeException: Failed to create job with prefix beam_load_[thenameofthejob], reached max retries: 3, last failed job: null.
at org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers$PendingJob.runJob(BigQueryHelpers.java:196)
at org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers$PendingJobManager.waitForDone(BigQueryHelpers.java:149)
at org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn.finishBundle(WriteTables.java:255)
Do you have any hints on how to solve this, please?
I have edited the recipe and removed all transformations, just to see whether the job runs at all - it failed again. The job works when I output this Dataprep/Dataflow flow to CSV instead. Everything is running in the EU region.
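For context, Dataprep generates a Dataflow (Apache Beam) pipeline under the hood, and judging from the stack trace the beam_load prefix comes from the BigQuery load jobs that BigQueryIO issues when writing the output table. Below is a minimal Beam sketch of what I understand the failing write stage to be doing; the class name and the project, dataset and table names are placeholders, not my real setup, and it assumes the target table already exists.

import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class BigQueryToBigQuery {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Read the source table (placeholder names).
    PCollection<TableRow> rows = pipeline.apply("ReadSource",
        BigQueryIO.readTableRows().from("my-project:my_eu_dataset.source_table"));

    // Write to the target table via BigQuery load jobs (FILE_LOADS),
    // which is where the "beam_load_..." job names in the error come from.
    rows.apply("WriteTarget",
        BigQueryIO.writeTableRows()
            .to("my-project:my_eu_dataset.target_table")
            .withMethod(Method.FILE_LOADS)
            // Assumption: the target table already exists with a matching schema.
            .withCreateDisposition(CreateDisposition.CREATE_NEVER)
            .withWriteDisposition(WriteDisposition.WRITE_TRUNCATE));

    pipeline.run();
  }
}

Running an equivalent standalone pipeline like this might help reproduce the problem outside Dataprep, but the actual job is the one Dataprep generated for me.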