
I am running spark-shell with Scala and I want to set an environment variable so that I can load data into Google BigQuery. The environment variable is GOOGLE_APPLICATION_CREDENTIALS and it should contain /path/to/service/account.json

In a Python environment I can easily do:

import os 
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = "path/to/service/account.json"

However, I cannot do this in Scala. I can print the system environment variables using:

scala> sys.env

or

scala> System.getenv()

both of which return a map of String key/value pairs. However,

scala> System.getenv("GOOGLE_APPLICATION_CREDENTIALS") = "path/to/service/account.json"

fails with the error:

<console>:26: error: value update is not a member of java.util.Map[String,String]
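
For reference, reading the variable works with either style; it is only the assignment that fails. A minimal sketch of what I mean (the comment explains why the error mentions update, as far as I can tell):

    // Reading works: sys.env is an immutable Scala Map and System.getenv()
    // returns a read-only java.util.Map, so lookups are fine.
    sys.env.get("GOOGLE_APPLICATION_CREDENTIALS")    // Option[String]
    System.getenv("GOOGLE_APPLICATION_CREDENTIALS")  // String, or null if unset

    // The assignment fails because Scala rewrites
    //   System.getenv("GOOGLE_APPLICATION_CREDENTIALS") = "path/to/service/account.json"
    // into a call to `update` on the map returned by System.getenv, and
    // java.util.Map[String, String] has no `update` method. Even at runtime
    // that map is unmodifiable, so it could not be changed this way anyway.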

1 Answer


I found a workaround for this problem, though I don't think it is best practice. Here is the two-step solution:

  1. From a terminal/cmd window, first export the environment variable:

    export GOOGLE_APPLICATION_CREDENTIALS=path/to/service/account.json

  2. From the same terminal window, open spark-shell and run the following to confirm the variable is visible:

    System.getenv("GOOGLE_APPLICATION_CREDENTIALS")
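
If you would rather not depend on the shell environment at all, another option is to hand the key file to the BigQuery client directly. This is only a sketch, assuming the google-cloud-bigquery Java client is on the spark-shell classpath (for example via --packages or --jars) and using the same placeholder path as above:

    import java.io.FileInputStream
    import com.google.auth.oauth2.ServiceAccountCredentials
    import com.google.cloud.bigquery.BigQueryOptions

    // Load the service-account key directly instead of relying on
    // GOOGLE_APPLICATION_CREDENTIALS being set in the environment.
    val credentials = ServiceAccountCredentials.fromStream(
      new FileInputStream("/path/to/service/account.json"))

    // Build a BigQuery client that uses these credentials explicitly.
    val bigquery = BigQueryOptions.newBuilder()
      .setCredentials(credentials)
      .build()
      .getService

If you load data through the spark-bigquery connector instead, I believe it also accepts a credentialsFile option on the DataFrame reader/writer, which avoids the environment variable in the same way.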
