I need to read a specific version of a Delta table from S3 storage, and I need the delta package for that.
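
For context, the read I'm trying to do looks roughly like this; a minimal sketch assuming Delta is already configured on the session, with the bucket path and version number as placeholders:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Time-travel read: load a specific historical version of the Delta table.
# "versionAsOf" is Delta's documented option for reading by version number.
df = spark.read.format("delta") \
    .option("versionAsOf", 1) \
    .load("s3://my-bucket/path/to/delta-table")
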
When running the code with databricks-connect, I get the error:
ModuleNotFoundError: No module named 'delta.tables'
I've tried changing the configs as suggested in the Delta Lake Quickstart docs:
import pyspark

spark = pyspark.sql.SparkSession.builder.appName("MyApp") \
    .config("spark.jars.packages", "io.delta:delta-core_2.12:0.8.0") \
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension") \
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog") \
    .getOrCreate()

from delta.tables import *
but the import still fails with the same ModuleNotFoundError.
I'm using databricks-connect version 7.3.7.
Is there another way to import the delta module?
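
For completeness, once the import resolves, what I ultimately want to run is roughly the following (the table path is a placeholder, and spark is the session built above):

from delta.tables import DeltaTable

# Inspect the table's commit history to find the version I need,
# then read that version back with the time-travel query shown earlier.
dt = DeltaTable.forPath(spark, "s3://my-bucket/path/to/delta-table")
dt.history().show()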