
This is probably a very easy question for many, but it's driving me insane.

I've been looking at several pieces of Google's documentation on uploading files to Google Cloud Storage, and I can do it locally - but my problem is I can't get it to work in production in a GKE environment.

My upload code:


const path = require('path');
const { Storage } = require('@google-cloud/storage');

let keyFilenamePath = '';

// if development
keyFilenamePath = path.join(__dirname, '../my-path/keyfile.json');

// if production
keyFilenamePath = '';

const gc = new Storage({
  keyFilename: keyFilenamePath,
  projectId: 'my-project'
});

const gcBucket = gc.bucket('my-bucket'); // placeholder bucket name

This works fine in the local environment, but even after reading Google's documentation I'm not sure how to get it to work in production.

I've tried omitting the keyFilename altogether in production, but no luck. It's always the same error:

Error ApiError: Insufficient Permission at new ApiError (/app/node_modules/@google-cloud/common/build/src/util.js:59:15)

Now, on my pod the keyfile.json obviously doesn't exist, and I have not set GOOGLE_APPLICATION_CREDENTIALS. From what I gather it should still work, but I'm not sure how to get everything to cooperate with Google's ADC as stated in: https://cloud.google.com/docs/authentication/production#auth-cloud-implicit-nodejs

It states:

If the environment variable isn't set, ADC uses the default service account that Compute Engine, Google Kubernetes Engine, Cloud Run, App Engine, and Cloud Functions provide, for applications that run on those services.
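
If I understand that correctly, in production I could construct the client with no key file at all and let ADC resolve the credentials. A minimal sketch of what I mean (assuming @google-cloud/storage, and assuming the service account my GKE pod runs as actually has storage permissions; 'my-bucket' is a placeholder):

const { Storage } = require('@google-cloud/storage');

// No keyFilename or credentials are passed here, so the library falls back to
// Application Default Credentials: GOOGLE_APPLICATION_CREDENTIALS if it is set,
// otherwise the service account attached to the GKE node/pod.
const storage = new Storage({ projectId: 'my-project' });
const bucket = storage.bucket('my-bucket'); // placeholder bucket name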

Should I instead be providing the JSON keyfile with a Kubernetes secret? https://jamesdefabia.github.io/docs/user-guide/kubectl/kubectl_create_secret_generic/
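
If the secret route is the way to go, what I have in mind is roughly this (a sketch, not what I'm currently running: it assumes the keyfile secret is mounted into the pod and that GOOGLE_APPLICATION_CREDENTIALS is set in the pod spec to the mounted path; the /var/secrets/... path is made up):

const { Storage } = require('@google-cloud/storage');

// Assumed: the keyfile.json secret is mounted into the container and
// GOOGLE_APPLICATION_CREDENTIALS points at it, e.g. /var/secrets/google/keyfile.json.
const keyFilenamePath = process.env.GOOGLE_APPLICATION_CREDENTIALS;

const storage = keyFilenamePath
  ? new Storage({ keyFilename: keyFilenamePath, projectId: 'my-project' })
  : new Storage({ projectId: 'my-project' }); // no env var set: fall back to ADC

Though as far as I can tell, if GOOGLE_APPLICATION_CREDENTIALS is set, ADC already picks it up on its own, so passing keyFilename explicitly would be redundant.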

I've been scouring for hours and am still not sure how to handle this.

Other resources:

Providing keyFilename of Google Client Service Account from Google Cloud Storage

Google Cloud Storage - insufficient permission

google plus api: "insufficientPermissions" error

How to determine authentication method while using Google Cloud Platform client libraries locally

Comments:

  • Google authentication can seem mysterious (or crazy) sometimes, but it is not. First, the right way is to use Google Default Credentials, so you are on the right track! Does it work locally? Is the problem only on GKE? Another question: how did you set up your GKE cluster? Any special configuration/tools installed on it? – guillaume blaquiere Jul 12 '20 at 19:39
  • Check whether the default Compute Engine service account has the correct rights to access the bucket. That service account is used by GKE to access other Google services. – Tarun Khosla Jul 13 '20 at 10:09
  • Check your Stackdriver logs to see which account is being used to access GCS and is producing the error. This could help you identify the account and focus on its permissions. – rsalinas Jul 13 '20 at 10:19
