I am trying to read a CSV file from Google Cloud Storage in Google Cloud Datalab, but it falls over with the following error:
UnicodeDecodeError Traceback (most recent call last)
<ipython-input-19-ba47ebe88c7a> in <module>()
17 fn = generate_input_fn(filename)
18 k,v = fn()
---> 19 csv.browse(max_lines = 1000)
20
21
/usr/local/lib/python2.7/dist-packages/datalab/data/_csv.pyc in browse(self, max_lines, headers)
89 """
90 if self.path.startswith('gs://'):
---> 91 lines = Csv._read_gcs_lines(self.path, max_lines)
92 else:
93 lines = Csv._read_local_lines(self.path, max_lines)
/usr/local/lib/python2.7/dist-packages/datalab/data/_csv.pyc in _read_gcs_lines(path, max_lines)
56 @staticmethod
57 def _read_gcs_lines(path, max_lines=None):
---> 58 return datalab.storage.Item.from_url(path).read_lines(max_lines)
59
60 @staticmethod
/usr/local/lib/python2.7/dist-packages/datalab/storage/_item.pyc in read_lines(self, max_lines)
200 content = self.read_from(byte_count=bytes_to_read)
201
--> 202 lines = content.split('\n')
203 if len(lines) > max_lines or bytes_to_read >= max_to_read:
204 break
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 35587: ordinal not in range(128)
It seems this is because the file is not ASCII-encoded, but the CSV parser assumes ASCII.
Is there a way to have the CSV parser that Datalab provides use a codec other than ASCII?
I had a look at the docs but couldn't see a way to pass a codec as a parameter.
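In case it matters, this is the kind of workaround I have been considering: pull the raw bytes with the same `datalab.storage` API the traceback goes through, then decode with an explicit codec by handing the bytes to pandas. This is only a sketch, not something I've fully tested: the `gs://` path is a placeholder, I'm assuming the file is actually UTF-8, and I'm assuming `Item.read_from()` with no arguments returns the full object contents.

```python
import pandas as pd
from io import BytesIO

import datalab.storage as storage

# Placeholder path; assuming the file is UTF-8 rather than ASCII.
item = storage.Item.from_url('gs://my-bucket/my-file.csv')

# Assuming a no-argument read_from() returns the whole object as bytes
# (the traceback only shows it called with byte_count=...).
raw_bytes = item.read_from()

# Let pandas decode with an explicit codec instead of relying on ASCII.
df = pd.read_csv(BytesIO(raw_bytes), encoding='utf-8')
df.head()
```

I would still prefer to keep using `Csv.browse()` directly if there is some way to tell it which encoding to use.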