
I failed to submit a training job to AI Platform on GCP. The error is "xxx@gmail.com does not have storage.objects.create access to your-bucket-name/fcnndemo/trainer/packages/980a4aa0a09719cf43f04580d8e6c218346e3ad085e3f48fd11b79ec57a702fe/ai_platform_demo-0.0.0.tar.gz."

I am trying to use data from GEE (Google Earth Engine) and submit it to AI Platform for training. I'm running this in a Colab notebook.

import time

# INSERT YOUR PROJECT HERE!
PROJECT = 'your-project'

JOB_NAME = 'demo_training_job_' + str(int(time.time()))
TRAINER_PACKAGE_PATH = 'ai_platform_demo'
MAIN_TRAINER_MODULE = 'ai_platform_demo.task'
REGION = 'us-central1'

# config.JOB_DIR is a gs:// path in the bucket where the trainer package is staged
# (config is presumably set up earlier in the notebook).
!gcloud ai-platform jobs submit training {JOB_NAME} \
    --job-dir {config.JOB_DIR}  \
    --package-path {TRAINER_PACKAGE_PATH} \
    --module-name {MAIN_TRAINER_MODULE} \
    --region {REGION} \
    --project {PROJECT} \
    --runtime-version 1.14 \
    --python-version 3.5 \
    --scale-tier basic-gpu

Why don't I have storage.objects.create access?

  • Do you know your current role, and which user you are connected as? Run `!gcloud config list` to find out. I assume you changed the values (`your-bucket-name` and `your-project`) only for this post, but in reality these values are correct and you can access these resources? – guillaume blaquiere Nov 01 '19 at 13:10
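
For reference, a quick check along the lines of that comment could look like this in the Colab notebook (the active account must be one that actually has write access to the bucket):

!gcloud config list
!gcloud auth list
# If the wrong account is active, re-authenticate:
!gcloud auth login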

1 Answer


In order to have the storage.objects.create permission, you need to grant it to your user with Cloud IAM. In this link you will find instructions on how to control who has access to your buckets and objects.
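
As a rough sketch (using the placeholder bucket, project, and account names from the question; roles/storage.objectAdmin is one role that includes storage.objects.create), the grant could be done from the notebook like this:

# Grant an object-write role on the bucket itself:
!gsutil iam ch user:xxx@gmail.com:roles/storage.objectAdmin gs://your-bucket-name

# Or grant it at the project level instead (replace your-project):
!gcloud projects add-iam-policy-binding your-project \
    --member='user:xxx@gmail.com' \
    --role='roles/storage.objectAdmin'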

To test it out, I successfully submitted a training job from Google Colab following this. Make sure to log in with your user with the command `!gcloud auth login`.

The code I have used is the following:

!gcloud auth login
BUCKET_NAME='<<YOUR_BUCKET>>'
REGION='europe-west1'
JOB_NAME='test_job'
JOB_DIR='gs://<<YOUR_BUCKET>>/keras-job-dir'

!git clone --depth 1 https://github.com/GoogleCloudPlatform/cloudml-samples
!cd cloudml-samples/census/tf-keras/ && ls -pR && pip install -r requirements.txt


!gcloud ai-platform jobs submit training $JOB_NAME \
  --package-path cloudml-samples/census/tf-keras/trainer/ \
  --module-name trainer.task \
  --region {REGION} \
  --python-version 3.5 \
  --runtime-version 1.13 \
  --job-dir $JOB_DIR \
  --stream-logs
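
Once the job is submitted, its status can also be checked from the same notebook, for example:

!gcloud ai-platform jobs describe $JOB_NAME
!gcloud ai-platform jobs stream-logs $JOB_NAME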

– Joaquim