
I am trying to append data to a BigQuery table using Python code, which requires dynamic schema handling. Can anyone provide a link or example that covers this scenario?

codninja0908
  • Can you share an example of your code and give more details of the issue you are having? – Ben P Jul 03 '19 at 09:56
  • I want to write a row to a .csv file each time the file-writing function is called, and then load only that new row from the file into the BigQuery table. – codninja0908 Jul 07 '19 at 12:09

1 Answer


Here is example code for loading a .csv file into BigQuery using the Python client library:

from google.cloud import bigquery

client = bigquery.Client()

# TODO: replace these placeholder values with your own.
filename = '/path/to/file.csv'
dataset_id = 'my_dataset'
table_id = 'my_table'

dataset_ref = client.dataset(dataset_id)
table_ref = dataset_ref.table(table_id)
job_config = bigquery.LoadJobConfig()
job_config.source_format = bigquery.SourceFormat.CSV
job_config.skip_leading_rows = 1  # skip the CSV header row
job_config.autodetect = True      # infer the schema from the file

with open(filename, "rb") as source_file:
    job = client.load_table_from_file(source_file, table_ref, job_config=job_config)

job.result()  # Waits for table load to complete.

print("Loaded {} rows into {}:{}.".format(job.output_rows, dataset_id, table_id))

Also check this part of the documentation to learn more about appending data to tables from a source file using the same or a different schema.
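
Since the question is specifically about appending with dynamic schema handling, here is a minimal sketch of how the same load job could be configured to append rows and let BigQuery evolve the table schema. It reuses the placeholder names (client, table_ref, filename) assumed above:

job_config = bigquery.LoadJobConfig()
job_config.source_format = bigquery.SourceFormat.CSV
job_config.skip_leading_rows = 1
job_config.autodetect = True
# Append to the existing table instead of overwriting it.
job_config.write_disposition = bigquery.WriteDisposition.WRITE_APPEND
# Let the load job add new columns or relax REQUIRED columns to NULLABLE
# when the file's detected schema differs from the table's schema.
job_config.schema_update_options = [
    bigquery.SchemaUpdateOption.ALLOW_FIELD_ADDITION,
    bigquery.SchemaUpdateOption.ALLOW_FIELD_RELAXATION,
]

with open(filename, "rb") as source_file:
    job = client.load_table_from_file(source_file, table_ref, job_config=job_config)

job.result()  # Waits for the append to complete.

Note that schema_update_options is only valid when the write disposition is WRITE_APPEND (or WRITE_TRUNCATE on a partitioned table), which matches the append use case here.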

Mayeru