
I am trying to connect to Hive and execute a query from a shell script triggered by Oozie; the cluster is Kerberos-enabled. I am passing the Hive credentials in the workflow, but I still get the error below whenever the Hive script is executed.

Error sample:

Logging initialized using configuration in file:/etc/hive/2.6.5.274-2/0/hive-log4j.properties
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(java.io.IOException): Delegation Token can be issued only with kerberos or web authentication
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getDelegationToken(FSNamesystem.java:7130)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getDelegationToken(NameNodeRpcServer.java:676)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getDelegationToken(ClientNamenodeProtocolServerSideTranslatorPB.java:1011)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)

Sample shell script being called (shell-script.sh):

hive -e "create table db.table as select * from db2.table2;" --<- giving error here.

My workflow:

<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.5" name="report_wf">
    <credentials>
        <credential name='hcatauth' type='hcat'>
            <property>
                <name>hcat.metastore.uri</name>
                <value>${hcat_metastore}</value>
            </property>
            <property>
                <name>hcat.metastore.principal</name>
                <value>${hcat_principal}</value>
            </property>
        </credential>
    </credentials>
    <start to="shell-node"/>
    <action name="shell-node"  cred="hcatauth">
        <shell xmlns="uri:oozie:shell-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <exec>${script_path}</exec>
            <file>${script_path}#${script_name}</file>
             <capture-output/>
        </shell>
        <ok to="success_mail"/>
        <error to="error_mail"/>
    </action>
    <action name="success_mail">
        <email xmlns="uri:oozie:email-action:0.2">
            <to>${email_list}</to>
            <cc>${cc_list}</cc>
            <subject>Table report</subject>
            <body>Table report has been generated.</body>
        </email>
        <ok to="end"/>
        <error to="error_mail"/>
    </action>
    <action name="error_mail">
        <email xmlns="uri:oozie:email-action:0.2">
            <to>${error_email_list}</to>
            <subject>report job failed</subject>
            <body>report job failed</body>
        </email>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Shell action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
 <end name="end"/>
</workflow-app>

I have tried running kinit at the top of the shell script to initialize a Kerberos ticket, and also placing the keytab file in an HDFS location and adding the line below under the shell action in the workflow, as suggested in other answers:

<file>name.keytab#name.keytab</file>

and then re-ran the workflow, but I still get the same error every time. I am very new to Oozie and to how it works in a Kerberos environment, and I have not been able to run my pipeline since this issue started. Any help would be really appreciated, thank you.
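
Concretely, what I tried at the top of shell-script.sh looked roughly like this (the keytab is the one shipped via the <file> element above; the principal is a placeholder, not my actual service principal):

#!/bin/bash
# Workaround attempt: obtain a Kerberos ticket before running the Hive query.
# name.keytab is made available in the container by <file>name.keytab#name.keytab</file>.
kinit -kt name.keytab svc_user@EXAMPLE.COM

hive -e "create table db.table as select * from db2.table2;"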

Mohammad Rijwan
    Using a _shell_ action to execute a Hive query in Oozie is simply **nonsense**. There are _hive_ (legacy, using deprecated fat client that communicates with Metastore) and _hive2_ (using thin JDBC client to HiveServer2) actions, with built-in handling of delegation tokens (to Metastore and HS2 respectively). – Samson Scharfrichter Dec 14 '21 at 14:58
  • Thanks @samson I used the same and it worked. – Mohammad Rijwan Dec 14 '21 at 16:38
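
Edit: as the comment above suggests, the fix is to use a hive or hive2 action instead of a shell action, so that Oozie obtains the delegation tokens itself. A minimal sketch of a hive2 action for this workflow, where ${hive2_jdbc_url}, ${hive2_principal} and ${hql_path} are placeholder parameters (not my actual values) and create_table.hql would contain the CREATE TABLE query:

    <credentials>
        <!-- hive2 credential type; the hive2 credential class must be enabled on the Oozie server -->
        <credential name="hs2auth" type="hive2">
            <property>
                <name>hive2.jdbc.url</name>
                <value>${hive2_jdbc_url}</value>
            </property>
            <property>
                <name>hive2.server.principal</name>
                <value>${hive2_principal}</value>
            </property>
        </credential>
    </credentials>
    ...
    <action name="hive2-node" cred="hs2auth">
        <hive2 xmlns="uri:oozie:hive2-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <!-- Oozie uses the hs2auth credential to get a HiveServer2 delegation token -->
            <jdbc-url>${hive2_jdbc_url}</jdbc-url>
            <script>create_table.hql</script>
            <file>${hql_path}#create_table.hql</file>
        </hive2>
        <ok to="success_mail"/>
        <error to="error_mail"/>
    </action>

The legacy hive action, with the existing hcatauth credential, would be the equivalent fat-client route mentioned in the comment.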

0 Answers