I have two scripts that upload files to and download files from GCS (Google Cloud Storage). I installed google.cloud using pip install google. My scripts were working, but now I get this error:
ModuleNotFoundError: No module named 'google'
I have read many questions here about the same issue, but didn't find a solution.
What I tried:
upgrading pip
pip uninstall google
and pip3 install google
re-installing VS Code
I also tried the following:
pip install virtualenv
virtualenv venv
source venv/bin/activate
pip install google-cloud-storage
Here, source venv/bin/activate
didn't run in the Terminal.
Can you please help me?
When I run pip list
I see:
Package Version
------------------------ ---------
beautifulsoup4 4.10.0
cachetools 4.2.4
certifi 2021.10.8
charset-normalizer 2.0.9
distlib 0.3.4
filelock 3.4.2
google 3.0.0
google-api-core 2.3.2
google-api-python-client 2.34.0
google-auth 2.3.3
google-auth-httplib2 0.1.0
google-cloud 0.34.0
google-cloud-core 2.2.1
google-cloud-storage 1.43.0
google-crc32c 1.3.0
google-resumable-media 2.1.0
googleapis-common-protos 1.54.0
httplib2 0.20.2
idna 3.3
pip 21.3.1
platformdirs 2.4.1
protobuf 3.19.1
pyasn1 0.4.8
pyasn1-modules 0.2.8
pyparsing 3.0.6
requests 2.26.0
rsa 4.8
six 1.16.0
soupsieve 2.3.1
uritemplate 4.1.1
urllib3 1.26.7
virtualenv 20.13.0
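Since the packages show up in pip list, I also checked (with a quick snippet, nothing GCS-specific) which interpreter my script runs under and whether that interpreter can see the package:

```python
import sys
import importlib.util

# The interpreter running this script; compare its path with the output of `pip -V`
print(sys.executable)

# True if the 'google' namespace package resolves for this interpreter
print(importlib.util.find_spec("google") is not None)
```

If the path printed here doesn't match the interpreter that pip -V reports, then pip installed the packages into a different Python than the one running the script.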
People asked for my code, so here it is (although I don't think it's needed to solve the question). This is the code that I run:
Note: the script uploads files to GCS. The files have names like "Customer- ...", "Account- ...". Each file is uploaded under a prefix in the bucket that matches its type ([bucket_name/Prefix_name]); I called this file_type_name in the code.
from google.cloud import storage
import os
import glob
import json
import sys
def upload_files(config_file):
    # Reading 3 parameters for upload from a JSON file
    with open(config_file, "r") as file:
        contents = json.loads(file.read())

    # Setting up login credentials
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = contents['login_credentials']
    # The ID of the GCS bucket
    bucket_name = contents['bucket_name']
    # Setting the path to the files
    LOCAL_PATH = contents['folder_from']

    for source_file_name in glob.glob(LOCAL_PATH + '/**'):
        # For multiple-file upload:
        # setting the destination folder according to the file name
        if os.path.isfile(source_file_name):
            partitioned_file_name = os.path.split(source_file_name)[-1].partition("-")
            file_type_name = partitioned_file_name[0]
            # Setting the folder where files will be uploaded
            destination_blob_name = file_type_name + "/" + os.path.split(source_file_name)[-1]
            # Setting up the required variables for GCS
            storage_client = storage.Client()
            bucket = storage_client.bucket(bucket_name)
            blob = bucket.blob(destination_blob_name)
            # Running the upload and printing a confirmation message
            blob.upload_from_filename(source_file_name)
            print("File from {} uploaded to {} in bucket {}.".format(
                source_file_name, destination_blob_name, bucket_name
            ))

upload_files(sys.argv[1])
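To show what I mean by the prefix: for a hypothetical file name (made up for illustration, not a real file from my bucket), the destination is built like this:

```python
import os

# Hypothetical local file path, just for illustration
source_file_name = "/data/Customer-2021-12.csv"

base_name = os.path.split(source_file_name)[-1]   # "Customer-2021-12.csv"
file_type_name = base_name.partition("-")[0]      # "Customer"
destination_blob_name = file_type_name + "/" + base_name

print(destination_blob_name)  # Customer/Customer-2021-12.csv
```

So a file named "Customer-..." ends up under the Customer/ prefix in the bucket.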