
I am performing a MySQL to BigQuery data migration using the JDBC to BigQuery template in Dataflow.

But while running the "select * from table1" query on MySQL, I also want to insert the selected data into another table in the same database.

How can I perform both the select and the insert query in the Dataflow template? I got an error when I put a semicolon between the two queries.

Joseph N

2 Answers


The JDBC to BigQuery template will write all the data you read to the table specified under "BigQuery output table" (<my-project>:<my-dataset>.<my-table>), so there is no need to write an insert statement.

(The parameter is "outputTable" for gcloud/REST)
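For example, launching the classic template with gcloud might look roughly like this; the job name, region, bucket paths and connection string are placeholders, and the parameter names should be double-checked against the template documentation:

    gcloud dataflow jobs run mysql-to-bq-load \
        --region=us-central1 \
        --gcs-location=gs://dataflow-templates/latest/Jdbc_to_BigQuery \
        --parameters=driverClassName=com.mysql.cj.jdbc.Driver,driverJars=gs://my-bucket/drivers/mysql-connector-java.jar,connectionURL='jdbc:mysql://10.0.0.5:3306/mydb',query='select * from table1',outputTable=my-project:my_dataset.my_table,bigQueryLoadingTemporaryDirectory=gs://my-bucket/tmp

The template only uses the JDBC connection to read the query results; it never writes anything back to MySQL.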

Peter Kim
  • @Dhairyaa The JDBC to BigQuery template is not for your use case. The MySQL query is only used to extract the data. https://cloud.google.com/dataflow/docs/guides/templates/provided-batch?hl=en_US#java-database-connectivity-jdbc-to-bigquery – Peter Kim Nov 10 '20 at 17:23
  • Actually, I want to perform an incremental load. To keep track of the migrated data, I am saving their primary keys in a different table. That is why I want to perform an insert in the source database – Joseph N Nov 10 '20 at 17:27

As @PeterKim mentioned, the JDBC to BigQuery template might not be the best approach for your use case.

You could use that template as a reference and modify it to also write into MySQL; in this post you will find an example of how to perform an insert into a MySQL database from a Dataflow pipeline. A rough sketch of the idea is shown below.
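This is only an illustrative sketch, not the template's actual code. It assumes the modified pipeline already has a PCollection<TableRow> (called rows here) produced by JdbcIO.read() before the BigQuery write, and branches it into a JdbcIO.write() that records each migrated primary key. The connection details, the migrated_keys table and the "id" column are all placeholders for your own schema:

    import java.sql.PreparedStatement;

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.jdbc.JdbcIO;

    // Inside the pipeline, after the PCollection<TableRow> "rows" has been
    // read from MySQL and alongside the existing BigQuery write:
    rows.apply("RecordMigratedKeys",
        JdbcIO.<TableRow>write()
            .withDataSourceConfiguration(
                JdbcIO.DataSourceConfiguration.create(
                        "com.mysql.cj.jdbc.Driver",
                        "jdbc:mysql://10.0.0.5:3306/mydb") // placeholder connection
                    .withUsername("myuser")
                    .withPassword("mypassword"))
            // Hypothetical tracking table holding the primary keys already migrated.
            .withStatement("INSERT INTO migrated_keys (id) VALUES (?)")
            .withPreparedStatementSetter((TableRow row, PreparedStatement stmt) ->
                // Assumes the source table's primary key column is called "id".
                stmt.setLong(1, Long.parseLong(String.valueOf(row.get("id"))))));

Keep in mind that Beam may retry bundles, so the same key can be inserted more than once; making the tracking column a primary key or using INSERT ... ON DUPLICATE KEY UPDATE would make the insert idempotent.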

After modifying the pipeline's source code, you can create a custom template.

Enrique Zetina