
From my workstation I can fire templated Dataflow jobs with the gcloud dataflow jobs run command. The required authorization to insert a new job comes from my workstation, where I'm logged in.
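
Roughly the kind of command I run (the bucket and job names here are placeholders; the template shown is one of the Google-provided ones):

    gcloud dataflow jobs run my-wordcount-job \
        --gcs-location=gs://dataflow-templates/latest/Word_Count \
        --region=europe-west1 \
        --parameters=inputFile=gs://my-bucket/input.txt,output=gs://my-bucket/output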

On the Compute Engine instance I rely on its service account, the one ending in (number)-compute@. Within the IAM section I enabled the Dataflow Admin, Dataflow Developer, and Dataflow Worker roles for this service account, just to be safe. I even added Cloud Dataflow Service Agent when I came across that one.
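
For completeness, the gcloud equivalent of what I did in the console would be something like this (project ID and account number are placeholders):

    gcloud projects add-iam-policy-binding my-project \
        --member="serviceAccount:123456789-compute@developer.gserviceaccount.com" \
        --role="roles/dataflow.admin"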

Then I try to start a Dataflow job from the command line, but I get an error about insufficient authentication scopes:

    ERROR: (gcloud.dataflow.jobs.run) PERMISSION_DENIED: Request had insufficient authentication scopes.

If I run gcloud auth login and log in with my personal account it works, of course. Somehow I'm missing the proper permissions to grant to the service account being used.
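
To double-check which identity gcloud is actually using on the VM before each attempt, I compare the active account:

    gcloud auth list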

Is there a guideline I missed? Can somebody please point me in the right direction?

Martin van Dam

1 Answer


The error message indicates that the instance's access scopes are not set up properly. To launch a job from a GCE VM, the VM must have the compute.read-only, compute, or cloud-platform scope for the project.
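
Note that access scopes can only be changed while the instance is stopped. A sketch of how to switch an existing VM to the cloud-platform scope (instance name and zone are placeholders):

    gcloud compute instances stop my-instance --zone=us-central1-a
    gcloud compute instances set-service-account my-instance \
        --zone=us-central1-a \
        --scopes=cloud-platform
    gcloud compute instances start my-instance --zone=us-central1-a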

The way to verify this is to run gcloud compute instances describe [INSTANCE] --zone=[ZONE] and look for "scopes" in the output.
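
For example (instance name and zone are placeholders), the relevant part of the output looks roughly like this:

    gcloud compute instances describe my-instance --zone=us-central1-a

    serviceAccounts:
    - email: 123456789-compute@developer.gserviceaccount.com
      scopes:
      - https://www.googleapis.com/auth/cloud-platform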

This document and this existing question may provide useful guidance for you.

Andy Xu