
I am trying to upload a few rows of data using the gcloud Python library, without success. Here is the sample code, taken from the latest documentation:

from google.cloud import bigquery

client = bigquery.Client()
dataset = client.dataset('test')
table = dataset.table('test_table')
rows = [("foo", "bar"), ("foo2", "bar2")]
result = table.insert_data(rows)

If I query the table after the upload, I get:

[(None, None), (None, None)]

So the upload inserts empty fields. The documentation says the rows to upload should be a "list of tuples", but that does not seem to work. My schema has two string fields. Unicode values do not work either, and I do not get any error back, which makes this difficult to debug. Any hint at what I am doing wrong?


1 Answer


Explicitly declaring the schema on your table object will solve this problem: the client maps each tuple onto the table's schema when building the insert request, so with no schema attached it sends empty rows. I.e., instead of using table = dataset.table('test_table'), you should use the following:

from google.cloud.bigquery.schema import SchemaField

left = SchemaField('left', 'STRING', 'REQUIRED')
right = SchemaField('right', 'STRING', 'REQUIRED')
table = dataset.table('test_table', schema=[left, right])

I had opened an issue on GitHub regarding this. You can read more here if interested.
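Putting it together, here is a minimal end-to-end sketch under the same library version, reusing the dataset and table names from the question (the error check at the end assumes insert_data returns a list of per-row errors, empty on success):

from google.cloud import bigquery
from google.cloud.bigquery.schema import SchemaField

client = bigquery.Client()
dataset = client.dataset('test')

# Attach the schema so the client can map tuple positions to columns.
left = SchemaField('left', 'STRING', 'REQUIRED')
right = SchemaField('right', 'STRING', 'REQUIRED')
table = dataset.table('test_table', schema=[left, right])

# Each tuple is matched positionally against the schema fields.
rows = [(u'foo', u'bar'), (u'foo2', u'bar2')]
errors = table.insert_data(rows)
if errors:
    print(errors)  # per-row insert errors; empty list means success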

  • Thanks a lot. I ended up using the Google API client library (https://cloud.google.com/bigquery/docs/reference/v2/tabledata/insertAll), because I already had a Python dictionary and it made no sense to me to convert that to a set. Frankly, I cannot follow the gcloud library's approach of handling data and schema separately. It is so much easier to work with a dict that combines the two (a sketch of this approach follows below). But thanks anyway for the solution. – crisscross Oct 01 '16 at 09:09
  • We get an error "No module named bigquery.cloud.schema" for the import statement you use "from google.cloud.bigquery.schema import SchemaField". Does this library exist or even work anymore? – Praxiteles Aug 27 '17 at 09:19
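For reference, a minimal sketch of the insertAll approach mentioned in the first comment, using the google-api-python-client library (the project ID is a placeholder, and the snippet assumes application default credentials are configured):

from googleapiclient.discovery import build

# Build a client for the BigQuery v2 REST API.
service = build('bigquery', 'v2')

# insertAll takes each row as a dict keyed by field name,
# so data and schema travel together in one structure.
body = {
    'rows': [
        {'json': {'left': 'foo', 'right': 'bar'}},
        {'json': {'left': 'foo2', 'right': 'bar2'}},
    ]
}
response = service.tabledata().insertAll(
    projectId='my-project',  # placeholder: your GCP project ID
    datasetId='test',
    tableId='test_table',
    body=body,
).execute()
print(response.get('insertErrors', []))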