
I want to load a table from a file and a variable. Since the file schema is not the same as the target table's, the extra columns need to be filled from a variable inside a stored procedure.

In the example below, pty is not part of the CSV file, while the other two columns, mt and de, are.

SET pty = 'sss';
LOAD DATA INTO `###.Tablename`
(
  pty STRING,
  mt INTEGER,
  de INTEGER
)
FROM FILES
(
  format = 'CSV',
  skip_leading_rows = 1,
  uris = ['gs://###.csv']
);

Mudgal

1 Answer


I think you can do that in two steps, with two queries:

LOAD DATA INTO `###.Tablename`
FROM FILES
(
  format='CSV',
  skip_leading_rows=1,
  uris = ['gs://###.csv']
);

UPDATE `###.Tablename`
SET pty = "sss"
WHERE pty IS NULL;
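
Since BigQuery supports scripting, you can get closer to the single step you want by running both statements as one multi-statement query with a declared variable. A minimal sketch, assuming LOAD DATA is allowed in your scripting context (the ### placeholders are kept from the question):

DECLARE pty_value STRING DEFAULT 'sss';

-- Load only the columns present in the CSV file.
LOAD DATA INTO `###.Tablename`
FROM FILES
(
  format = 'CSV',
  skip_leading_rows = 1,
  uris = ['gs://###.csv']
);

-- Backfill the extra column from the script variable.
UPDATE `###.Tablename`
SET pty = pty_value
WHERE pty IS NULL;

This still runs two statements internally, but it executes as a single job and the variable lives in one place.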

If it's complicated for you to apply your logic with BigQuery and SQL, you can also create a Python script with the Google BigQuery client and the Google Cloud Storage client:

  • Your script loads the CSV file
  • Transforms the rows into a list of dicts
  • Adds the extra fields to each dict with your code logic
  • Loads the resulting dicts to BigQuery (see the sketch after this list)
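
A minimal sketch of that flow, using the google-cloud-storage and google-cloud-bigquery packages; the bucket, blob, table ID, and pty value below are hypothetical placeholders:

import csv
import io

from google.cloud import bigquery, storage

# Hypothetical placeholders: replace with your own bucket, file, and table.
BUCKET_NAME = "your-bucket"
BLOB_NAME = "your-file.csv"
TABLE_ID = "your-project.your_dataset.Tablename"
PTY_VALUE = "sss"

# 1. Load the CSV file from Cloud Storage.
blob = storage.Client().bucket(BUCKET_NAME).blob(BLOB_NAME)
csv_text = blob.download_as_text()

# 2. and 3. Transform the rows into dicts and add the extra field.
rows = []
for record in csv.DictReader(io.StringIO(csv_text)):
    rows.append({
        "pty": PTY_VALUE,          # extra column not present in the file
        "mt": int(record["mt"]),
        "de": int(record["de"]),
    })

# 4. Load the resulting dicts to BigQuery.
errors = bigquery.Client().insert_rows_json(TABLE_ID, rows)
if errors:
    raise RuntimeError(f"Insert failed: {errors}")

This gives you full control over the per-row logic in Python, at the cost of moving the data through your script instead of loading it directly from Cloud Storage.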
Mazlum Tosun
  • thanks @Mazlum, yeah I have already implemented it with an update statement. I was just wondering if this can be achieved in a single step – Mudgal Sep 05 '22 at 01:36