I am trying to dockerize an app that makes an API call to BigQuery for data. I have provided the credentials .json (authenticating via an OAuth service account), but when I run the container the app starts and then asks for an auth code, whereas when I run the same script from Jupyter on my laptop or from Cloud Functions (GCP) it picks up the .json, authenticates, and returns the data.
I want to deploy this container to Cloud Run. What am I doing wrong here? Any help would be great!
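For reference, this is roughly how I build and run the image locally when the prompt shows up (the image name is just a placeholder):

docker build -t bq-app .
docker run -p 8080:8080 bq-app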
Sample method that I use to make the API call to BigQuery:
PS: this is not the full algorithm code, just the method I need to work, i.e. an API call to BigQuery. I am facing the same issue with this code too.
def pfy_algorithm_1_1():
    import pandas as pd
    import numpy as np
    import datetime
    import requests
    import json
    from pandas import json_normalize
    from google.cloud import bigquery
    from google.oauth2 import service_account

    # build credentials from the service-account key file and authenticate the client
    credentials = service_account.Credentials.from_service_account_file('mylo_bigquery.json')
    project_id = 'xyz'
    client = bigquery.Client(credentials=credentials, project=project_id)

    # run the query through the authenticated client
    user_data = client.query('''select * from dataset_id.table_id limit 5''').to_dataframe()

    destination_table1 = 'dataset-id.table-id'
    if_exists = 'replace'
    private_key = 'mylo_bigquery.json'   # not used below
    authcode = 'xyz1xyz23'               # not used below

    # write the result back to BigQuery
    user_data.to_gbq(destination_table=destination_table1,
                     project_id=project_id,
                     chunksize=None,
                     reauth=False,
                     if_exists=if_exists,
                     auth_local_webserver=False,
                     table_schema=None)
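Is explicitly handing the same credentials object to to_gbq the right way to avoid the prompt, or is the problem in how the container is set up? A sketch of what I mean (untested; same placeholder names as above):

user_data.to_gbq(destination_table=destination_table1,
                 project_id=project_id,
                 if_exists='replace',
                 auth_local_webserver=False,
                 credentials=credentials)  # pass the service-account credentials instead of relying on the default auth flow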
Dockerfile:
# base image
FROM python:3
# set the working directory in the container
WORKDIR /usr/src/app
# copy everything in the build context (source, requirements.txt, credentials .json) into the image
COPY . .
# install dependencies
RUN pip install -r requirements.txt
# port the app listens on and the command to run on container start
EXPOSE 8080
ENTRYPOINT ["python3", "main.py"]