I am getting the following error for the code below; please help:

'''
from delta.tables import *
ModuleNotFoundError: No module named 'delta.tables'
INFO SparkContext: Invoking stop() from shutdown hook
'''
Here is the code: '''
from pyspark.sql import *

if __name__ == "__main__":
    spark = SparkSession \
        .builder \
        .appName("DeltaLake") \
        .config("spark.jars", "delta-core_2.12-0.7.0") \
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension") \
        .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog") \
        .getOrCreate()

    from delta.tables import *  # <-- this is the import that raises ModuleNotFoundError

    data = spark.range(0, 5)
    data.printSchema()
'''
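For reference, the Delta Lake quickstart pulls the jar in by its Maven coordinates through spark.jars.packages instead of pointing spark.jars at a bare filename; here is a minimal sketch of that variant (coordinates inferred from the version above, not yet tested on my setup):

'''
from pyspark.sql import SparkSession

if __name__ == "__main__":
    # spark.jars.packages resolves io.delta:delta-core_2.12:0.7.0 from Maven
    # when the session starts, which should also make the bundled delta
    # Python module importable on the driver.
    spark = SparkSession \
        .builder \
        .appName("DeltaLake") \
        .config("spark.jars.packages", "io.delta:delta-core_2.12:0.7.0") \
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension") \
        .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog") \
        .getOrCreate()

    # The import has to come after getOrCreate(), once the package is fetched.
    from delta.tables import *

    data = spark.range(0, 5)
    data.printSchema()
'''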
An online search suggested verifying that the Scala version matches the delta-core jar version. Here are the Scala and jar versions:

"delta-core_2.12-0.7.0"
"Using Scala version 2.12.10, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_221"

The _2.12 suffix in the jar name is the Scala version it was built against, which appears to match my Scala 2.12.10, so a Scala mismatch does not seem to be the cause.