
I need to read a specific version of a Delta table from S3 storage, which requires the delta package.
When running the code with databricks-connect I get this error:

ModuleNotFoundError: No module named 'delta.tables'

I've tried changing the configs as suggested in the Delta Lake Quickstart docs:

import pyspark.sql

spark = pyspark.sql.SparkSession.builder.appName("MyApp") \
    .config("spark.jars.packages", "io.delta:delta-core_2.12:0.8.0") \
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension") \
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog") \
    .getOrCreate()

from delta.tables import *

but it made no difference.
databricks-connect version: 7.3.7

Is there any other way to import delta module?
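One possible alternative worth noting: newer Delta Lake releases ship a `delta-spark` package on PyPI (`pip install delta-spark`), which bundles the `delta.tables` Python module and a helper that wires up the jars for you. This is a sketch only; the `delta-spark` package targets Delta 1.0+/Spark 3.1+, so it may not be compatible with databricks-connect 7.3:

```python
# Sketch, assuming `pip install delta-spark` (bundles delta.tables and the
# matching delta-core jar; may not align with databricks-connect 7.3.x).
from pyspark.sql import SparkSession
from delta import configure_spark_with_delta_pip

builder = (
    SparkSession.builder.appName("MyApp")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
# Injects the correct io.delta:delta-core coordinates into spark.jars.packages
spark = configure_spark_with_delta_pip(builder).getOrCreate()
```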

Riva Dan
    Thanks for the response. I've downloaded the delta-core jar and placed it in the pyspark/jars folder. Now I can import the module, but when calling the `DeltaTable.forPath` function I get a "java.lang.VerifyError: Cannot inherit from final class" error. – Riva Dan Feb 10 '21 at 08:42
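For context, a `java.lang.VerifyError` like the one in the comment usually means the jar on the classpath was compiled against a different Spark version than the one actually running, so dropping an OSS delta-core jar into the databricks-connect pyspark/jars folder can conflict with the Databricks runtime. Once `delta.tables` does import cleanly against a matching Spark, reading a specific table version looks roughly like this (the S3 path and version number below are hypothetical):

```python
# Sketch, assuming a working `spark` session with Delta configured;
# the bucket/table path and version are placeholders.
from delta.tables import DeltaTable

path = "s3a://my-bucket/my-table"
dt = DeltaTable.forPath(spark, path)   # handle to the Delta table

# "Time travel" read of a specific version of the table
df = (spark.read.format("delta")
      .option("versionAsOf", 2)
      .load(path))
```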

0 Answers