I used to access data from a CSV file in my local directory using Jupyter Notebook; now I want to access a CSV file that is stored in Google Cloud Storage from Datalab. This is the relevant part of the function as I used to run it:
import csv

def function1(file_name):
    # Output file that will hold the header and the computed values
    new_file = open("file_name.csv", "w")
    new_file.write("variable" + '\n')
    with open(file_name, "r") as csv_file:
        csv_reader = csv.reader(csv_file, delimiter=',')
        for row in csv_reader:
            # First column drives the calculation
            values_in_column1 = int(row[0])
            variable = values_in_column1 * 0.6 / 5
How can I change this function so that it works with CSV files stored in Google Cloud Storage, from within Datalab?
Datalab lets me load the contents of a CSV file into a single variable, but I don't want all of the data in one variable; I want to load the values from each column into a different variable.
%%gcs read --object gs://bucket-name/file_name.csv --variable variable_name
Would anyone recommend using dictionaries or lists for this, or is there an easier way to do it?
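To make the question concrete, this is roughly what I have in mind after the %%gcs cell above, as a sketch only: I am assuming variable_name ends up holding the raw file contents (bytes or text), and column1 / column2 are just placeholder names for whatever I end up calling the per-column variables.

import csv
import io

# Assumption: the %%gcs cell put the raw file contents into variable_name.
# Decode to text if Datalab hands back bytes.
text = variable_name.decode('utf-8') if isinstance(variable_name, bytes) else variable_name

column1 = []  # placeholder: values from the first column
column2 = []  # placeholder: values from the second column
for row in csv.reader(io.StringIO(text), delimiter=','):
    column1.append(int(row[0]))
    column2.append(row[1])

(Header-row handling is left out here for brevity.)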
I have also tried using storage from google.cloud, but I can't import it, even though I have updated google-cloud-storage from my terminal:
ImportError                               Traceback (most recent call last)
<ipython-input-6-943e66fe7e46> in <module>()
----> 1 from google.cloud import storage
2
3 storage_client = storage.Client()
4 bucket = storage_client.get_bucket(bucket_name)
5 blob = bucket.blob(source_blob_name)
ImportError: cannot import name storage
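For completeness, this is the full snippet I was attempting (following the client library examples; bucket_name and source_blob_name are placeholders taken from the gs:// path above, and download_as_string is just what I planned to call next). It already fails at the import line.

from google.cloud import storage

# Placeholders matching gs://bucket-name/file_name.csv above
bucket_name = "bucket-name"
source_blob_name = "file_name.csv"

storage_client = storage.Client()
bucket = storage_client.get_bucket(bucket_name)
blob = bucket.blob(source_blob_name)

# Download the object contents as text so they can be parsed with csv.reader
csv_text = blob.download_as_string().decode('utf-8')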