
This is more of a question about whether this is possible, because there is no GCP documentation covering it. I am using BigQuery DTS to move my CSVs from a GCS bucket to a BQ table. I have tried it manually and it works, but I need some automation behind it and want to implement it using Terraform. I have already checked this link, but it doesn't quite cover this case: https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/bigquery_data_transfer_config

Any help would be appreciated. Thanks!

Sam
  • hi, not sure if this might be of interest https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/storage_transfer_job – jspcal Dec 08 '21 at 21:26
  • Thanks for the suggestion @jspcal, I have already gone through this link as well however this is the storage transfer, what I need is this : https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#console – Sam Dec 08 '21 at 21:29

1 Answer


I think the issue is that the Terraform documentation does not list the supported params. Here is an example; compare it with the API reference:

https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq

resource "google_bigquery_data_transfer_config" "sample" {
  display_name           = "sample"
  location               = "asia-northeast1"
  data_source_id         = "google_cloud_storage"
  schedule               = "every day 10:00"
  destination_dataset_id = "target_dataset"
  params = {
    data_path_template              = "gs://target_bucket/*"
    destination_table_name_template = "target_table"
    file_format                     = "CSV"
    write_disposition               = "MIRROR"
    max_bad_records                 = "0"
    ignore_unknown_values           = "false"
    field_delimiter                 = ","
    skip_leading_rows               = "1"
    allow_quoted_newlines           = "false"
    allow_jagged_rows               = "false"
    delete_source_files             = "false"
  }
}
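Note that per the Cloud Storage transfer docs, the destination dataset and table must already exist before the transfer runs. A minimal sketch of those resources in Terraform (the dataset id and table id match the config above; the two-column schema is a placeholder assumption, adjust it to your CSV columns):

```hcl
resource "google_bigquery_dataset" "target" {
  dataset_id = "target_dataset"
  location   = "asia-northeast1"
}

resource "google_bigquery_table" "target" {
  dataset_id          = google_bigquery_dataset.target.dataset_id
  table_id            = "target_table"
  deletion_protection = false

  # Schema must match the columns in the incoming CSVs (placeholder fields).
  schema = jsonencode([
    { name = "id",   type = "INTEGER", mode = "NULLABLE" },
    { name = "name", type = "STRING",  mode = "NULLABLE" }
  ])
}
```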
John Hanley
  • Thanks for the suggested solution. I understand that for this particular example we should have the BigQuery destination table (as well as the dataset) ready beforehand, right? @John Hanley – Sam Dec 09 '21 at 05:17
  • 1
    @sam - please create a new post for new questions. If my answer has helped you please accept it. – John Hanley Dec 09 '21 at 17:36
  • @John Hanley your answer helped me reach the final solution, thanks, but my 'new' question was related to my actual question and to understanding your suggested solution. Nevertheless, I have accepted your answer. Thanks :) – Sam Dec 10 '21 at 03:47
  • 1
    Your question is related, but do not ask additional questions in the comment section. In your case, I don't know the answer. By posting a new question, another person can answer. On Stack Overflow, the guidance is one question per post. – John Hanley Dec 10 '21 at 06:05