For example, I have a table in BigQuery with 10 million rows, and I want to extract it to Google Cloud Storage in chunks of 100 thousand rows. To be clear, I want 100 CSV files, each containing 100k distinct rows from the BigQuery table.
bq extract --noprint_header dataset.abigtable 'gs://bucket/output/*.csv'
With the command above, run in Cloud Shell, the table gets split into 10 or so files in Google Storage, but I have no control over how many rows end up in each file. How can I control that?
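
The only workaround I can think of is to number the rows and export them chunk by chunk, roughly like the sketch below (the dataset/table names, the temporary table tmp_chunk, and the chunk size are placeholders, and I haven't verified how well this performs on 10 million rows). Is there a cleaner, built-in way?

for i in $(seq 0 99); do
  # materialize rows (i*100000, (i+1)*100000] of the big table into a temp table
  bq query --use_legacy_sql=false --replace \
    --destination_table=dataset.tmp_chunk \
    "SELECT * EXCEPT(rn) FROM (
       SELECT *, ROW_NUMBER() OVER() AS rn
       FROM dataset.abigtable
     ) WHERE rn > $((i*100000)) AND rn <= $(((i+1)*100000))"
  # export that 100k-row chunk as a single CSV file
  bq extract --noprint_header dataset.tmp_chunk "gs://bucket/output/chunk_${i}.csv"
done

This runs one query plus one extract job per chunk, which seems wasteful, so I'm hoping there is an option on bq extract itself that I'm missing.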