
I am trying to use the bq command inside Datalab, but when I do, a message pops up asking me to set up my credentials, as shown below:

[screenshot: bq prompting for a verification code]

However, I can't paste the verification code into the pop-up message. Basically, I am stuck at this step.

  1. I checked my gcloud configuration in Datalab; it shows the right project and account.
  2. I can use bq commands in my terminal, which work fine with no need to authorize every time.

Any idea how to solve this? Thank you.

Haipeng Su

1 Answer


In the long term, I think we will save the Datalab credentials for BQ; then, after signing in to Datalab, `!bq` will use the saved credentials.

As a workaround, you can run bq outside Datalab at an interactive command prompt, go through the token verification process, and then copy the token file into Datalab. For example, assuming you have completed the verification process on the machine hosting Datalab, you can copy the credentials file into Datalab by running the following cell:

!cp /content/.bigquery.v2.token ~/.bigquery.v2.token

You need to do this every time you restart Datalab.
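A minimal sketch of that copy step, using throwaway temporary directories in place of the real `/content` and `$HOME` paths so it can run anywhere; the guard check for a missing token is an illustrative addition, not part of the original answer:

```shell
# Demo of the token-copy workaround with stand-in directories.
# In Datalab you would use the real /content and $HOME paths.
SRC=$(mktemp -d)   # stands in for /content
DST=$(mktemp -d)   # stands in for $HOME
touch "$SRC/.bigquery.v2.token"   # pretend `bq show` was already verified here
if [ -f "$SRC/.bigquery.v2.token" ]; then
  cp "$SRC/.bigquery.v2.token" "$DST/.bigquery.v2.token"
  echo "token copied"
else
  echo "run 'bq show' interactively first to create the token"
fi
```

If the token file does not exist yet, the `cp` would fail, which is why checking first (or just running the interactive verification again) is the safer habit.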

Bradley Jiang
  • Hi Bradley, thanks for the answer. I am a little confused about the verification process. I normally use `gcloud auth login` to authorize my account, but I didn't get any file /content/.bigquery.v2.token. Besides, I authorized my account inside the Docker image, and in the command-line tool I can use the bq command; but back in Datalab it is still not working, and this time it doesn't give me the verification message, just keeps running with no message or result at all. – Haipeng Su Jan 23 '17 at 20:34
  • On my side, if I run "bq show" for the first time, it asks for verification. After that is done, the $HOME/bigquery.v2.token file is created. What subcommand did you run in Datalab? Note that "bq show" without a dataset name takes a long time to come back. If I run "bq show [MyDatasetName]", it comes back very quickly. – Bradley Jiang Jan 24 '17 at 17:22
  • Hi, @Bradley, `bq show my_dataset` is the command I use to test. I tried again today, and it works. I suppose that once I verified my account in the Datalab Docker container, I needed to restart the localhost or the Datalab container to get it working. Since every time I restart my computer I start the same container from last time, that is why my credentials still exist. However, I can't find bigquery.v2.token anywhere inside the container, only some config and credential files under /content/datalab/.config – Haipeng Su Jan 25 '17 at 14:52
  • I am curious how you ran "gcloud auth login" inside Datalab to verify your account? There is no stdin in Datalab, so I don't know how you entered your verification code. The issue might be that the bq command does not honor a customized CLOUDSDK_CONFIG location. If you ran "gcloud auth login", the config should be saved in the standard location, and bq knows where to pick it up. – Bradley Jiang Jan 25 '17 at 21:40
  • First, I named the Docker container that runs Datalab 'datalab'. Then I can get into that container with `docker exec -t -i datalab /bin/bash`. Finally, I can run `gcloud auth login` in that terminal to verify my account using the pop-up window and code. – Haipeng Su Jan 25 '17 at 23:19
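The container-based verification described in the comments above can be recapped as the following sequence. It is printed rather than executed here, since a running Docker daemon is assumed, and the container name 'datalab' is whatever was passed to `docker run --name`:

```shell
# Recap (not executed) of the in-container verification flow from the comments.
STEPS='docker exec -t -i datalab /bin/bash   # shell into the Datalab container
gcloud auth login                            # complete the pop-up verification
bq show my_dataset                           # bq now finds the saved credentials'
printf '%s\n' "$STEPS"
```

After this, the credentials persist as long as you keep reusing the same container; recreating the container discards them, which matches the behavior reported above.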