
I need to access a JSON file in GCS (Google Cloud Storage) and change its content from Google App Engine (Python 3 standard runtime). I can read the content of the file in GCS with:

from google.cloud import storage
storage_client = storage.Client()
bucket = storage_client.bucket('bucket_name')
blob = bucket.blob('file_name')
file_content = blob.download_as_string().decode('utf-8')

But I don't know how to write directly to the GCS file. It would also be okay to delete the GCS file and re-upload one created in some temp folder, but it looks like the GAE file system is read-only and I cannot create files on the server side. Could you help me? Thank you very much!

Yiyin

3 Answers


You can use the following code to upload the file:

from google.cloud import storage
from google.cloud.storage import Blob

client = storage.Client(project="my-project")
bucket = client.get_bucket("my-bucket")
blob = Blob("secure-data", bucket)
with open("my-file", "rb") as my_file:
    blob.upload_from_file(my_file)
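Since the App Engine filesystem is read-only, opening a local file as above may not work there. `upload_from_file()` accepts any file-like object, so an in-memory buffer can be used instead. A minimal sketch, assuming the bucket and blob names are placeholders:

```python
import io
import json


def to_json_bytes(obj) -> bytes:
    """Serialise a Python object to UTF-8 JSON bytes."""
    return json.dumps(obj).encode("utf-8")


def upload_in_memory(bucket_name: str, blob_name: str, obj) -> None:
    """Upload a JSON-serialisable object to GCS without touching disk."""
    # Imported here so the helper above stays usable without the library.
    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    # An in-memory buffer avoids App Engine's read-only filesystem.
    blob.upload_from_file(io.BytesIO(to_json_bytes(obj)),
                          content_type="application/json")
```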
Vikram Shinde

I finally used Blob.upload_from_string() to avoid creating temp files.
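That read-modify-write approach can be sketched as follows, assuming the `update_settings` mutation and the bucket/blob names are placeholders for your own:

```python
import json


def update_settings(raw: str) -> str:
    """Pure helper: parse the JSON payload, tweak it, re-serialise it."""
    data = json.loads(raw)
    data["updated"] = True  # example mutation; replace with your change
    return json.dumps(data)


def rewrite_blob(bucket_name: str, blob_name: str) -> None:
    """Download a JSON blob, modify it, and write it back in place."""
    # Imported here so the helper above stays usable without the library.
    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    content = blob.download_as_string().decode("utf-8")
    # upload_from_string overwrites the blob directly; no temp file needed.
    blob.upload_from_string(update_settings(content),
                            content_type="application/json")
```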

Yiyin
  • 75
  • 2
  • 9
0

You need to use a client library that currently supports both Python 2 and 3.

An example of how to upload a file is below:

from gcloud import storage
from oauth2client.service_account import ServiceAccountCredentials
import os


credentials_dict = {
    'type': 'service_account',
    'client_id': os.environ['BACKUP_CLIENT_ID'],
    'client_email': os.environ['BACKUP_CLIENT_EMAIL'],
    'private_key_id': os.environ['BACKUP_PRIVATE_KEY_ID'],
    'private_key': os.environ['BACKUP_PRIVATE_KEY'],
}
credentials = ServiceAccountCredentials.from_json_keyfile_dict(
    credentials_dict
)
client = storage.Client(credentials=credentials, project='myproject')
bucket = client.get_bucket('mybucket')
blob = bucket.blob('myfile')
blob.upload_from_filename('myfile')
Samuel Romero