I have been stuck on this problem for the past two days. I want to use a Lambda function as a cron job that gets data from my database and posts it to BigQuery.
I would like to know how to authorize access to BigQuery from my Lambda function using a service account file.
Context
I am using the following:
- A Serverless Framework repo to deploy my Lambda (a trimmed `serverless.yml` is below).
- The BigQuery SDK, `@google-cloud/bigquery`.
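For reference, here is roughly what the relevant part of my `serverless.yml` looks like. It is trimmed down, the service and handler names are placeholders, and I am assuming the `.env` gets loaded (e.g. via `serverless-dotenv-plugin`):

```yaml
# serverless.yml (trimmed) -- service/handler names are placeholders
service: bigquery-sync

provider:
  name: aws
  runtime: nodejs12.x
  environment:
    # Works locally via my .env, but this path means nothing in the cloud
    GOOGLE_APPLICATION_CREDENTIALS: ${env:GOOGLE_APPLICATION_CREDENTIALS}

functions:
  main:
    handler: handler.main
    events:
      - schedule: rate(1 hour) # the cron part
```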
Attempts
- Everything has worked fine locally using `sls invoke local --function main`. I have set `GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json` in my `.env`. But, obviously, `/path/to/key.json` is a local path. (A sketch of my handler is below.)
- As a test, I tried just putting my service account file into the root directory of my project and setting `GOOGLE_APPLICATION_CREDENTIALS=./key.json` (notice I used a relative path). This does not work locally or in the cloud. And yes, I know it is not good practice; I am just trying to get it working. I believe this may be a Webpack issue, although I am totally clueless about how to use Webpack.
- I have also thought about using AWS KMS to encrypt the JSON and storing it as a key-value pair in Parameter Store (which is how I eventually want to do it). But I noticed that BigQuery requires a file path rather than the secret itself. See here.
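For completeness, here is roughly what my handler looks like. This is a sketch: the dataset and table names are placeholders, and the database fetch is a stand-in for my real query.

```js
// handler.js -- sketch of my current handler; names are placeholders
const { BigQuery } = require('@google-cloud/bigquery');

// The client reads GOOGLE_APPLICATION_CREDENTIALS from the environment,
// which is exactly where the file-path problem comes from.
const bigquery = new BigQuery();

// Stand-in for my real database query.
async function fetchRowsFromMyDatabase() {
  return [{ id: 1, created_at: new Date().toISOString() }];
}

module.exports.main = async () => {
  const rows = await fetchRowsFromMyDatabase();
  await bigquery
    .dataset('my_dataset')
    .table('my_table')
    .insert(rows);
};
```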
Question
So here are my questions:
- Is there a way to connect to BigQuery with the Serverless Framework using a file path and the `.env` file?
- Why does Google not just allow you to use access keys to connect to BigQuery, instead of requiring a path to a file?
- Is there a way to store a file in KMS and decrypt it upon deployment to Lambda? (A rough sketch of the flow I have in mind is below.)
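In case it helps, this is the rough flow I have in mind for the Parameter Store idea. I have not tested it; the parameter name is made up, and the write-to-`/tmp` step is just my assumption about how to satisfy the file-path requirement inside Lambda:

```js
// Untested sketch; parameter name '/my-app/bigquery-key' is made up.
const fs = require('fs');
const AWS = require('aws-sdk');
const { BigQuery } = require('@google-cloud/bigquery');

const ssm = new AWS.SSM();

async function getBigQueryClient() {
  // 1. Fetch the key file contents from Parameter Store as a
  //    SecureString, decrypted via KMS.
  const { Parameter } = await ssm
    .getParameter({ Name: '/my-app/bigquery-key', WithDecryption: true })
    .promise();

  // 2. Write it to /tmp, the only writable path in Lambda,
  //    since the SDK seems to want a file path.
  const keyPath = '/tmp/key.json';
  fs.writeFileSync(keyPath, Parameter.Value);

  // 3. Point the client at the /tmp copy.
  return new BigQuery({ keyFilename: keyPath });
}
```

Is something like this viable, or is there a cleaner way?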