
I am not able to write a Delta table to MinIO.

I am running Spark master and worker pods in Kubernetes, with a Jupyter notebook as the driver and MinIO for storage.

Writing the Delta table fails:

df1.write.partitionBy(['asset_id']).format("delta").mode("append").option("mergeSchema", "true").save("s3a://test/asset-table")

Python version: 3.7, PySpark: 3.2.2, Java: JDK 8
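
For context, this is roughly how the session is configured (the Delta/Hadoop package coordinates are assumptions based on PySpark 3.2.2 and Delta Lake 1.2.1; the endpoint and credentials are placeholders):

from pyspark.sql import SparkSession

# Minimal sketch: pull in Delta and the S3A connector so both driver and
# executors resolve the same jars, and point S3A at the MinIO endpoint.
spark = (
    SparkSession.builder
    .appName("delta-minio")
    .config("spark.jars.packages",
            "io.delta:delta-core_2.12:1.2.1,org.apache.hadoop:hadoop-aws:3.3.1")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .config("spark.hadoop.fs.s3a.endpoint", "http://minio:9000")  # placeholder
    .config("spark.hadoop.fs.s3a.access.key", "ACCESS_KEY")       # placeholder
    .config("spark.hadoop.fs.s3a.secret.key", "SECRET_KEY")       # placeholder
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .getOrCreate()
)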

error:

23/01/04 07:37:12 WARN TaskSetManager: Lost task 0.0 in stage 5.0 (TID 12) (10.244.28.3 executor 0): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD

I am able to write plain Parquet files to MinIO, but the Delta write below fails:

df1.write.partitionBy(['asset_id']).format("delta").mode("append").option("mergeSchema", "true").save("s3a://test/asset-table")

Py4JJavaError: An error occurred while calling o195.save.
: org.apache.spark.SparkException: Job aborted.
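
For comparison, the Parquet write that does succeed looks like this (the target path is an assumption):

# Same dataframe and bucket, but plain Parquet instead of Delta - this works.
df1.write.partitionBy("asset_id").mode("append").parquet("s3a://test/asset-parquet")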
  • Post a bigger stack trace - you need to find the lines starting with `Caused by` – Alex Ott Jan 04 '23 at 10:50
  • Also, which version of Delta are you using with Spark 3.2.2? That is, are you using Delta 1.2, 1.2.1, or 2.0 (the versions compatible with Spark 3.2)? – Denny Lee Jan 05 '23 at 01:38
  • I am using Delta Lake 1.2.1, following https://github.com/delta-io/delta/releases/tag/v1.2.1 – Gali Sai Surendra Jan 05 '23 at 05:53
  • Hi Alex Ott, Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 2.0 failed 4 times, most recent failure: Lost task 1.3 in stage 2.0 (TID 9) (10.244.28.3 executor 0): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2301) – Gali Sai Surendra Jan 05 '23 at 05:56

0 Answers