
I am trying to schedule an Oozie workflow with a Spark action and Hive support enabled. When it was a plain Spark job without Hive support, the action ran properly. After adding Hive support, I can still run the Spark job via spark-submit, but when I try to run it through Oozie it fails with:

Unable to instantiate SparkSession with Hive support because Hive classes are not found.
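
For reference, the direct submission that works looks roughly like this (the deploy mode, --files flag, and arguments are my reconstruction from the Oozie action below, not the exact command):

spark-submit --master yarn --deploy-mode cluster \
    --class {package}.Job \
    --files /etc/spark/conf/hive-site.xml \
    s3_etl-0.0.1.jar ${market} ${market_lag} ${data_bucket} ${trigger_bucket} ALL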

Below is the code that creates the Spark session:

static SparkSession initializeSparkSession() {
    // Pass the SQL/Hive options through the builder so they are applied at
    // session creation; setting them on the SparkContext's conf after
    // getOrCreate() comes too late to take effect.
    return SparkSession.builder()
            .appName("DataLoad")
            .enableHiveSupport()
            .config("spark.sql.sources.partitionOverwriteMode", "dynamic")
            .config("hive.exec.dynamic.partition.mode", "nonstrict")
            .getOrCreate();
}

Below are the dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.1</version>
    <scope>provided</scope>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.1</version>
    <scope>provided</scope>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.2.1</version>
    <scope>provided</scope>
</dependency>
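
All three artifacts are provided scope, so at runtime the Hive classes must come from the cluster classpath or the Oozie Spark sharelib. One workaround I have considered but not verified is to bundle just the Hive module inside the job jar by dropping its provided scope:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.2.1</version>
    <!-- default (compile) scope, so the Hive support classes ship in the job jar -->
</dependency>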

Below is the Oozie workflow action:

<action name="data_load">
    <spark xmlns="uri:oozie:spark-action:0.1">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <master>yarn</master>
        <mode>cluster</mode>
        <name>DataMovement</name>
        <class>{package}.Job</class>
        <jar>${sparkJarPath}/s3_etl-0.0.1.jar</jar>
        <spark-opts>--files=/etc/spark/conf/hive-site.xml --conf spark.yarn.dist.files=file:/etc/spark/conf/hive-site.xml</spark-opts>
        <arg>${market}</arg>
        <arg>${market_lag}</arg>
        <arg>${data_bucket}</arg>
        <arg>${trigger_bucket}</arg>
        <arg>ALL</arg>
    </spark>
    <ok to="notifyJobSuccess" />
    <error to="notifyJobFailure" />
</action>

Is there something more I need to add to the sharelib directory, or something I should remove?
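
For what it's worth, this is how I have been checking what the Spark sharelib actually contains (the Oozie URL is a placeholder for my server):

# List the jars in the spark sharelib and look for any Hive classes
oozie admin -oozie http://<oozie-host>:11000/oozie -shareliblist spark | grep -i hive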

--- Edited ---

The error above occurs if I don't add hive to the oozie.action.sharelib.for.spark property in the global section. If I do add hive there:

<global>
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <configuration>
        <property>
            <name>mapred.job.queue.name</name>
            <value>${queueName}</value>
        </property>
        <property>
            <name>oozie.action.sharelib.for.spark</name>
            <value>spark,oozie,hive</value>
        </property>
    </configuration>
</global>

Then it throws another exception:

ERROR ApplicationMaster: User class threw exception: java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT
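
From what I have read, HIVE_STATS_JDBC_TIMEOUT is a HiveConf field that exists in Hive 1.2.x (which Spark 2.2's built-in Hive client is compiled against) but was removed in Hive 2.x, so adding the hive sharelib next to spark may be putting a newer, incompatible HiveConf on the classpath. This is how I would look for the clash on HDFS (/user/oozie/share/lib is the usual default path; it may differ per cluster):

# Check which Hive jars each sharelib ships; two different versions here
# would explain the NoSuchFieldError
hdfs dfs -ls /user/oozie/share/lib/lib_*/spark | grep -iE 'hive-(common|exec)'
hdfs dfs -ls /user/oozie/share/lib/lib_*/hive | grep -iE 'hive-(common|exec)'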
