
I found bizarre behavior in a Jupyter notebook when I load my own CSV: on one GCE instance the kernel crashes, on the other it works fine. Probably I need to set up my instance better; maybe I've missed something trivial here.

My code goes like this (see below); most of it is basically tutorial stuff. The bucket and file names are correct, because the same code works on the other instance. The new GCE instance has more RAM and more CPUs, but I don't think that should be the problem. Any ideas what could go wrong? (The issue occurs on both Python 2 and Python 3.)

from google.datalab import Context
import google.datalab.bigquery as bq
import google.datalab.storage as storage
import pandas as pd

# Py2/Py3 compatibility: GCS reads return bytes on Python 3,
# so BytesIO stands in for StringIO there.
try:
  from StringIO import StringIO
except ImportError:
  from io import BytesIO as StringIO

%%gcs read --object gs://ls_w/tmax_out_save.csv --variable bla  # crashes here

I've also tried this:

bla = storage.Object('ls_w', 'tmax_out_save.csv').read_stream() # crashes here
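For context, the end goal is simply to get the CSV into a pandas DataFrame. A minimal sketch of that last step, assuming read_stream() returns the object's contents as raw bytes (the same pattern the Datalab tutorials use with the %%gcs magic variable):

import pandas as pd
from io import BytesIO
import google.datalab.storage as storage

# Download the object's bytes, then let pandas parse them in memory.
raw = storage.Object('ls_w', 'tmax_out_save.csv').read_stream()
df = pd.read_csv(BytesIO(raw))
df.head()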
Lukasz
  • Can you provide the error? There's little we can do to help without the full context. – yelsayed Mar 23 '18 at 18:05
  • Sorry, I can't give you the error. It was the usual kernel restart... It was very frustrating, as I couldn't trace the error. I have resolved the issue by creating a new instance; I guess there was an error when I created the original instance. – Lukasz Mar 26 '18 at 07:55

1 Answer


I have resolved the issue by creating a new instance. I guess there was an error when I created the original instance.
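In case it helps anyone hitting the same thing: before re-running the notebook, a quick smoke test that the fresh instance can actually reach the bucket (a sketch, assuming the instance's default Datalab credentials have read access to ls_w and that Bucket.objects() / Object.key behave as in pydatalab):

import google.datalab.storage as storage

# List a few object keys to confirm credentials and connectivity
# before attempting the full CSV read.
bucket = storage.Bucket('ls_w')
for i, obj in enumerate(bucket.objects()):
  print(obj.key)
  if i >= 4:  # the first five objects are enough for a smoke test
    break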

Lukasz