
Has anyone run into this problem when integrating Hudi with the Spark shell? I just started learning Hudi from the official documentation. My environment is CDH-5.16.2 with Spark 2.3.0.

```
import org.apache.hudi.config.HoodieWriteConfig._
warning: Class org.apache.parquet.hadoop.metadata.CompressionCodecName not found - continuing with a stub.
import org.apache.hudi.config.HoodieWriteConfig._
```

When inserting data into Hudi, this error appears:

```
warning: there was one deprecation warning; re-run with -deprecation for details
Caused by: java.lang.NoClassDefFoundError: org/apache/parquet/hadoop/metadata/CompressionCodecName
```
shiwei
    It seems like you are missing some dependency JARs, most probably **org.apache.hudi:hudi-spark-bundle_2.xx:xx** and **org.apache.spark:spark-avro_2.xx:xx**. Add them via **--conf** or in the SparkSession builder via the `config("spark.jars.packages", <>)` option – Felix K Jose Jun 19 '21 at 14:01
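
A minimal sketch of that suggestion, supplying the bundles when the shell is launched (the Scala suffix and artifact versions below are assumptions, not taken from the question, so match them to your Spark and Scala versions):

```sh
# Start spark-shell with the Hudi Spark bundle and spark-avro on the classpath,
# so that parquet/avro classes such as CompressionCodecName can be resolved.
# The version numbers are placeholders; align them with your Spark/Scala install.
spark-shell \
  --packages org.apache.hudi:hudi-spark-bundle_2.11:0.5.3,org.apache.spark:spark-avro_2.11:2.4.4 \
  --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer'
```

Note that in spark-shell the SparkContext is created at startup, so the packages have to be passed on the command line rather than set on an already-running session. Also, the org.apache.spark:spark-avro artifact only exists for Spark 2.4 and later, so on the CDH Spark 2.3.0 shown above a different avro dependency (or a newer Spark) may be needed.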

0 Answers