
I am running the following:

bq query --format=csv 'SELECT GKGRECORDID, DATE, SourceCommonName, DocumentIdentifier, V2Persons, V2Tone, TranslationInfo FROM [gdelt-bq:gdeltv2.gkg_partitioned] WHERE V2Persons LIKE "%Orban%" AND _PARTITIONTIME >= TIMESTAMP("2016-11-09") AND _PARTITIONTIME < TIMESTAMP("2016-11-11")' > outputfile.csv

This should return a table with roughly 1000 rows (which is what I get when I use the normal BigQuery interface in the browser). However, when I run it from the command line as above, it returns only 100.

It seems like a buffer-size issue, but I thought I'd ask whether there is something that can be done on the BigQuery side (for example, a way to send the query output in several chunks) to remedy this.

Thanks!

blong
Samuel Markson

1 Answer


On the command line you can specify how many rows to return with the -n flag (shorthand for --max_rows); it defaults to a maximum of 100.

bq query -n 1500
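
For example, applied to the query from the question (unchanged apart from the added flag), this would look like:

bq query -n 1500 --format=csv 'SELECT GKGRECORDID, DATE, SourceCommonName, DocumentIdentifier, V2Persons, V2Tone, TranslationInfo FROM [gdelt-bq:gdeltv2.gkg_partitioned] WHERE V2Persons LIKE "%Orban%" AND _PARTITIONTIME >= TIMESTAMP("2016-11-09") AND _PARTITIONTIME < TIMESTAMP("2016-11-11")' > outputfile.csv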

Please be aware that the maximum response size is 128 MB compressed, regardless of how many rows you request.

Pentium10