
I am running the following command from a local terminal:

bq mk --transfer_config --target_dataset=mydataset --display_name='mytransfer' --params='{ 
"data_path": "s3://mys3path/*",
"destination_table_name_template": "mytable",
"file_format": "JSON",
"max_bad_records":"0",
"ignore_unknown_values":"true",
"access_key_id": "myaccessid",
"secret_access_key": "myaccesskey"
}' --data_source=amazon_s3

Now, every time I run this, I get the following:

/opt/google-cloud-sdk/platform/bq/bq.py:41: DeprecationWarning: the imp module is deprecated in favour of importlib and slated for removal in Python 3.12; see the module's documentation for alternative uses
  import imp
Table '<mytablehere>' successfully created.
/opt/google-cloud-sdk/platform/bq/bq.py:41: DeprecationWarning: the imp module is deprecated in favour of importlib and slated for removal in Python 3.12; see the module's documentation for alternative uses
  import imp

https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?client_id=***********.apps.googleusercontent.com&scope=https://www.googleapis.com/auth/bigquery&redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=version_info
Please copy and paste the above URL into your web browser and follow the instructions to retrieve a version_info.
Enter your version_info here: 

So, every time I run this, I need to open the link, sign in to my account, authorize the Google data transfer service to "View and manage your data in Google BigQuery and see the email address for your Google Account", and then copy/paste the resulting string from the browser back into the terminal.

Is there any way to persist this authorization (the version_info) so that I don't have to perform this step every time?

Thank you in advance

geo909

2 Answers


I got around the following prompt:

https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?client_id=123456789012-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.apps.googleusercontent.com&scope=https://www.googleapis.com/auth/bigquery%20https://www.googleapis.com/auth/drive&redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=version_info
Please copy and paste the above URL into your web browser and follow the instructions to retrieve a version_info.
Enter your version_info here:

Using the following steps:

  1. Create a service account.
  2. Assign the service account the role "roles/bigquery.admin".
  3. Create a JSON key for the service account.
  4. Download the key file.
  5. Use "gcloud auth activate-service-account" to log in as the service account.
  6. Run the "bq query" or "bq mk" command with the parameter "--service_account_credential_file=service-account-key-file.json".
  7. Use "gcloud auth revoke" to log out from the service account.
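
The steps above can be sketched as a command sequence. This is only a sketch: the project ID, service-account name, and key-file path below are placeholders I've made up, not values from the original post, and the --params payload is elided.

```shell
# 1-4. Create the service account, grant it roles/bigquery.admin,
#      and create/download a JSON key (names are placeholders).
gcloud iam service-accounts create bq-transfer-sa
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:bq-transfer-sa@my-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.admin"
gcloud iam service-accounts keys create service-account-key-file.json \
    --iam-account="bq-transfer-sa@my-project.iam.gserviceaccount.com"

# 5. Log in as the service account.
gcloud auth activate-service-account \
    --key-file=service-account-key-file.json

# 6. Run the bq command with the credential-file parameter;
#    no OAuth prompt should appear.
bq mk --transfer_config \
    --service_account_credential_file=service-account-key-file.json \
    --target_dataset=mydataset \
    --display_name='mytransfer' \
    --data_source=amazon_s3 \
    --params='{...}'

# 7. Log out from the service account when done.
gcloud auth revoke bq-transfer-sa@my-project.iam.gserviceaccount.com
```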

That worked for me and avoided the prompt, but I got there by pure trial and error. I couldn't find any GCP documentation to support this approach; quite the opposite: most of their documentation seems to casually mention the prompt as if it's not something we'd want to avoid.

FreeZey

In order to have your service account's credentials persist in the BigQuery command-line tool, so that you can use them after logging out and logging back in, you will need to set the CLOUDSDK_PYTHON_SITEPACKAGES environment variable by running the following command:

export CLOUDSDK_PYTHON_SITEPACKAGES=1

You can then run the following command to see which accounts the tool has credentials for; the list should include your service account:

gcloud auth list

I hope that the above information is helpful. If it is not, make sure that you try the steps followed in the Stack Overflow case.

Make sure to try out the .bigqueryrc solution provided by Michael Sheldon.
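
The referenced answer isn't quoted here, but a .bigqueryrc is the bq tool's per-user flag file: global flags can be written into it, one per line, so they apply to every bq invocation. A minimal sketch, assuming the file lives at ~/.bigqueryrc and that the project ID and key-file path (both placeholders) are yours:

```
# ~/.bigqueryrc -- global bq flags, one "flag = value" per line
project_id = my-project
service_account_credential_file = /path/to/service-account-key-file.json
```

With the credential flag set here, it no longer needs to be repeated on each bq mk or bq query command line.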

Mousumi Roy