I have a simple test application that sets up a session with Hive (simple authentication):
import org.apache.hadoop.hive.cli.CliSessionState
import org.apache.hadoop.hive.conf.HiveConf
import org.apache.hadoop.hive.ql.Driver
import org.apache.hadoop.hive.ql.processors.CommandProcessorResponse
import org.apache.hadoop.hive.ql.session.SessionState

HiveConf hiveConf = new HiveConf()
CliSessionState localSession = new CliSessionState(hiveConf)
localSession.setIsHiveServerQuery(true)
localSession.setCurrentDatabase("LOOM")
Driver driver = new Driver(hiveConf)
SessionState.start(localSession)                              // 1.
List<String> data = new ArrayList<>()
CommandProcessorResponse response = driver.run("SHOW TABLES") // 2.
driver.getResults(data)
data.each {
    println "${it}" // I skipped the table list in the log output
}
driver.close()
if (localSession != null) {
    localSession.close()                                      // 3.
}
It works OK: I get the list of tables in database LOOM, with no errors or anything. However, the log output looks suspicious. There are a lot of lines like:
org.apache.hadoop.security.UserGroupInformation - Failed to get groups for user hive by java.io.IOException: No groups found for user hive
Does this indicate a problem in my code, in the Hive settings, or something else? And what should I do to get rid of these messages?
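In case the answer is only about suppressing the noise rather than fixing it: I assume the message could be silenced per-logger in log4j.properties. This is just a guess on my side, assuming log4j 1.x is the active logging backend (which Hadoop/Hive of this version typically use):

```properties
# Hypothetical log4j.properties entry: raise the threshold for the noisy
# logger so the WARN-level "Failed to get groups" lines are no longer printed.
log4j.logger.org.apache.hadoop.security.UserGroupInformation=ERROR
```

But I would prefer to understand whether the warning points to a real misconfiguration first.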
The following is the app log. The rows marked with comments 1., 2. and 3. produce the respective blocks of lines:
1.
org.apache.hadoop.security.UserGroupInformation - hadoop login
org.apache.hadoop.security.UserGroupInformation - hadoop login commit
org.apache.hadoop.security.UserGroupInformation - Using user: "hive" with name hive
org.apache.hadoop.security.UserGroupInformation - User entry: "hive"
org.apache.hadoop.security.UserGroupInformation - Assuming keytab is managed externally since logged in from subject.
org.apache.hadoop.security.UserGroupInformation - UGI loginUser:hive (auth:SIMPLE)
org.apache.hadoop.security.UserGroupInformation - Failed to get groups for user hive by java.io.IOException: No groups found for user hive
org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil - DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil - DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil - DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil - DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil - DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
2.
org.apache.hadoop.security.UserGroupInformation - Failed to get groups for user hive by java.io.IOException: No groups found for user hive
org.apache.hadoop.security.UserGroupInformation - Failed to get groups for user hive by java.io.IOException: No groups found for user hive
org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil - DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
3.
org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil - DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
Update:
The log is the STDOUT/STDERR of this application, which I start locally in IntelliJ IDEA (Hadoop/Hive are installed on a cluster that I connect to via the *-site.xml files).
I actually do have a local user hive in group hadoop on the server. I set a breakpoint on IOException in IntelliJ IDEA and switched on stack trace output. The following is the stack trace for my specific exception instance:

Breakpoint reached
  at org.apache.hadoop.security.Groups.getGroups(Groups.java:210)
  at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1721)
  at org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator.setConf(HadoopDefaultAuthenticator.java:64)
  at org.apache.hadoop.hive.ql.security.ProxyUserAuthenticator.setConf(ProxyUserAuthenticator.java:47)
  at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
  at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
  at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:441)
  at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:759)
  at org.apache.hadoop.hive.ql.session.SessionState.getAuthorizationMode(SessionState.java:1543)
  at org.apache.hadoop.hive.ql.session.SessionState.isAuthorizationModeV2(SessionState.java:1554)
  at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:635)
  at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:510)
  at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:320)
  at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1219)
  at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1260)
  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1156)
  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1146)
  at org.apache.hadoop.hive.ql.processors.CommandProcessor$run.call(Unknown Source:-1)
Update 2: These are the Hive security settings:
hadoop.security.authentication=simple
hive.server2.authentication=NONE
hadoop.security.group.mapping=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback
hadoop.user.group.static.mapping.overrides=dr.who=;
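One option I am considering, though this is an assumption on my part and not a verified fix: since JniBasedUnixGroupsMappingWithFallback resolves groups from the local OS of the machine running the client, and my local development machine has no hive user, a static mapping override might satisfy the lookup without consulting the OS at all:

```properties
# Hypothetical client-side override (core-site.xml): statically map the hive
# user to the hadoop group so the OS group lookup is never attempted for it.
hadoop.user.group.static.mapping.overrides=dr.who=;hive=hadoop;
```

I am not sure whether this is the proper way to solve it, though.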