
So, I am trying to initialize SparkSession and SparkContext in Python 3.6 using the following code:

from pyspark.sql import SparkSession
from pyspark import SparkContext

# Create a Spark Session
SpSession = SparkSession \
    .builder \
    .master("local[2]") \
    .appName("V2 Maestros") \
    .config("spark.executor.memory", "1g") \
    .config("spark.cores.max", "2") \
    .config("spark.sql.warehouse.dir", "file:///c:/temp/spark-warehouse") \
    .getOrCreate()

I get the following error every time I try to do this:

module 'pyspark' has no attribute 'heapq3'

Please let me know what I am doing wrong. I am pretty new to Spark.

Rahul Poddar

1 Answer


I think there is an issue with Python 3.6; please refer here. I recommend using an older version of Python for now. Once the issue is fixed and tested, you can start using Python 3.6.
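If you want to keep the rest of your script unchanged, the simplest route is to launch it with a pre-3.6 interpreter (for example Python 3.5) and make the Spark workers use that same interpreter. Below is a minimal sketch of that idea; the version check and the PYSPARK_PYTHON line are only illustrative additions, not something Spark requires:

import os
import sys

# This script itself must be launched with a pre-3.6 interpreter (e.g. Python 3.5)
# until the Python 3.6 issue is fixed in your PySpark version.
if sys.version_info >= (3, 6):
    raise RuntimeError("Please run this script with Python 3.5 or older for now")

# Make the Spark worker processes use the same interpreter as the driver,
# so the driver and workers do not end up on different Python versions.
os.environ["PYSPARK_PYTHON"] = sys.executable

from pyspark.sql import SparkSession

SpSession = SparkSession \
    .builder \
    .master("local[2]") \
    .appName("V2 Maestros") \
    .config("spark.executor.memory", "1g") \
    .config("spark.cores.max", "2") \
    .config("spark.sql.warehouse.dir", "file:///c:/temp/spark-warehouse") \
    .getOrCreate()

The version check is just there to fail fast; the important part is which interpreter you use to launch the script.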

koiralo