I am trying to initialize a SparkSession and a SparkContext in Python 3.6 using the following code:
from pyspark.sql import SparkSession
from pyspark import SparkContext

# Create a Spark Session
SpSession = SparkSession \
    .builder \
    .master("local[2]") \
    .appName("V2 Maestros") \
    .config("spark.executor.memory", "1g") \
    .config("spark.cores.max", "2") \
    .config("spark.sql.warehouse.dir", "file:///c:/temp/spark-warehouse") \
    .getOrCreate()
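For context, I am not constructing the SparkContext separately; assuming the session above is created successfully, the idea is to take the context from it, roughly like this (SpContext is just an illustrative name, not part of the code above):

# Sketch: reuse the context owned by the session instead of creating a new one.
# SpContext is a hypothetical variable name chosen for illustration.
SpContext = SpSession.sparkContext
print(SpContext.version)  # quick check that the context is alive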
I get the following error every time I try to do this:
module 'pyspark' has no attribute 'heapq3'
Please let me know where I went wrong. I am pretty new to Spark.