
We are using PySpark inside Watson Studio to connect to a Spark instance running on IBM Cloud. Now we want to run the same Python code in an IBM Cloud Function, but there the SparkContext is missing: inside Watson Studio, the Studio itself takes care of creating the SparkContext. What needs to be done to create a proper SparkContext, and where do the configuration values come from?

Thx

tdeer
  • Where is your Spark instance running? You will most likely need to use Spark Submit to execute code there from your Cloud Function, if that's your goal. If you are just trying to use the PySpark APIs it might make more sense not to use Spark. – Greg Filla Jan 23 '19 at 15:38

0 Answers