
I'm trying to trigger a batch Dataflow job via the API, so I'm experimenting with the API Explorer tool here. Unfortunately, the docs don't specify which parameters are required, so I took a few stabs, but I can't get anything other than this response:

{
  "error": {
    "code": 400,
    "message": "Request contains an invalid argument.",
    "status": "INVALID_ARGUMENT"
  }
}

The payload needs to be this Job, but that doc is an exhaustive list of the properties of an already-triggered job, not a description of what a create request needs. Nonetheless, I took a stab:

{
    "projectId": "my-project",
    "location": "us-central1",
    "name": "some-random-name",
    "type": "JOB_TYPE_BATCH"
}

But I can't get the API Explorer to return anything helpful.

Has anyone gotten this to work?

nomadic_squirrel

1 Answer


The jobs.create API isn't designed to be called this way. Instead, create a templated Dataflow pipeline using the Apache Beam SDK, stage it to Cloud Storage, and then run your job via the API with the templates.create API.
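For reference, a minimal sketch of what a templates.create request body might look like. The gcsPath, bucket names, and pipeline parameter names below are placeholders for illustration, not values from this question; the actual parameters depend on the options your templated pipeline declares:

```json
{
  "jobName": "some-random-name",
  "gcsPath": "gs://my-bucket/templates/my-template",
  "parameters": {
    "inputTable": "my-project:my_dataset.my_table"
  },
  "environment": {
    "tempLocation": "gs://my-bucket/temp",
    "zone": "us-central1-f"
  }
}
```

This would be POSTed to the projects.templates.create endpoint (or projects.locations.templates.create if you want to pin a specific regional endpoint rather than the default).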

Ryan M
    Darn. I'm using reads from BigQuery and when using a template I get a [NullPointerException](https://stackoverflow.com/questions/44718323/apache-beam-with-dataflow-nullpointer-when-reading-from-bigquery/45083493#45083493). I might need to go back to the drawing board. – nomadic_squirrel Jul 12 '18 at 16:21