
I'm trying to add Hive to my Hadoop 3.2.0 ecosystem. I followed the install and config steps described here: https://www.tutorialspoint.com/hive/hive_installation.htm

Unfortunately I got stuck while trying to test Hive using:
hive -hiveconf hive.root.logger=DEBUG,console

The following error is reported. I think something is wrong with the Derby JDBC driver. I'd appreciate any suggestions or clues on how to fix it. My CLASSPATH is:

env | grep CLASSPATH
CLASSPATH=:/opt/derby/lib/derby.jar:/opt/derby/lib/derbytools.jar:/home/hadoop/hadoop/lib/*:.:/opt/hive/lib/*:.:/home/hadoop/hadoop/lib/*:.:/opt/hive/lib/*:.:/home/hadoop/hadoop/lib/*:.:/opt/hive/lib/*:.:/opt/derby/lib/derby.jar:/opt/derby/lib/derbytools.jar
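As an aside, the CLASSPATH above repeats the same entries several times. Duplicates are harmless to the JVM, but a clean value makes it easier to see which Derby jars are actually being picked up. A small hypothetical helper (not from the question) that collapses duplicates while preserving order:

```shell
# Hypothetical helper: split CLASSPATH on ':', keep only the first
# occurrence of each entry, and rejoin. Globs like /opt/hive/lib/* are
# treated as literal text here (the string is quoted, so no expansion).
dedup_classpath() {
  printf '%s\n' "$1" | tr ':' '\n' | awk '!seen[$0]++' | paste -s -d ':' -
}

# Example:
dedup_classpath "/opt/derby/lib/derby.jar:/opt/hive/lib/*:/opt/derby/lib/derby.jar"
# prints /opt/derby/lib/derby.jar:/opt/hive/lib/*
```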

And then:

hive -hiveconf hive.root.logger=DEBUG,console
/usr/bin/which: no hbase in (/export/viya/python/bin:/export/viya/R/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/derby/bin:/home/hadoop/hadoop/sbin:/home/hadoop/hadoop/bin:/opt/hive/bin:/opt/spark/bin:/opt/spark/sbin:/home/nfsuser/.local/bin:/home/nfsuser/bin:/opt/kustomize:/home/nfsuser/sas-viya:/opt/spark/bin:/opt/spark/sbin:/opt/spark/bin:/opt/spark/sbin:/export/viya/python/bin:/opt/hive/bin:/home/hadoop/hadoop/sbin:/home/hadoop/hadoop/bin:/opt/hive/bin:/opt/derby/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hive/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = 9a08bcc0-5ff9-458d-88bd-2488702ac1cb
2023-02-25T03:32:26,599  INFO [main] SessionState: Hive Session ID = 9a08bcc0-5ff9-458d-88bd-2488702ac1cb
...
2023-02-25T03:32:27,889 DEBUG [9a08bcc0-5ff9-458d-88bd-2488702ac1cb main] metastore.ObjectStore: Overriding javax.jdo.option.ConnectionDriverName value null from jpox.properties with org.apache.derby.jdbc.EmbeddedDriver
...
2023-02-25T03:32:27,889 DEBUG [9a08bcc0-5ff9-458d-88bd-2488702ac1cb main] metastore.ObjectStore: Overriding datanucleus.connectionPool.maxPoolSize value null from jpox.properties with 10
...
2023-02-25T03:32:27,917 DEBUG [9a08bcc0-5ff9-458d-88bd-2488702ac1cb main] datasource.HikariCPDataSourceProvider: Configuration requested hikaricp pooling, HikariCpDSProvider exiting
2023-02-25T03:32:28,206  INFO [9a08bcc0-5ff9-458d-88bd-2488702ac1cb main] hikari.HikariDataSource: HikariPool-1 - Starting...
2023-02-25T03:32:28,210  WARN [9a08bcc0-5ff9-458d-88bd-2488702ac1cb main] util.DriverDataSource: Registered driver with driverClassName=org.apache.derby.jdbc.EmbeddedDriver was not found, trying direct instantiation.
2023-02-25T03:32:28,210 ERROR [9a08bcc0-5ff9-458d-88bd-2488702ac1cb main] DataNucleus.Datastore: Exception thrown creating StoreManager. See the nested exception
org.datanucleus.exceptions.NucleusException: Error creating transactional connection factory
at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:214) ~[datanucleus-core-4.1.17.jar:?]
at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:162) ~[datanucleus-core-4.1.17.jar:?]
...
... 75 more
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "HikariCP" plugin to create a ConnectionPool gave an error : Driver org.apache.derby.jdbc.EmbeddedDriver claims to not accept jdbcUrl, jdbc:derby://zbwv4demo1-nfs-vm:1527/metastore_db?create=true
  • In my hive-site.xml I initially had: jdbc:derby://localhost:1527/metastore_db;create=true. I found "somewhere" the following entry: jdbc:derby:metastore_db;create=true and ... it works. Which is good news, but I'm still curious why removing hostname:port solved the issue, since in all tutorials the JDBC URL contains hostname:port. – Ziggy Feb 27 '23 at 05:28
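This matches how Derby's two JDBC drivers behave: a URL of the form jdbc:derby://host:port/db is the network client form, handled by org.apache.derby.jdbc.ClientDriver (from derbyclient.jar) against a running Derby Network Server, while jdbc:derby:dbname is the embedded form, which is the only form org.apache.derby.jdbc.EmbeddedDriver accepts. Since the log above shows Hive configured with EmbeddedDriver, handing it a host:port URL produces exactly the "claims to not accept jdbcUrl" error. A sketch of the two matching hive-site.xml configurations, assuming the standard Hive property names:

```xml
<!-- Embedded metastore: EmbeddedDriver only accepts URLs without host:port. -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:metastore_db;create=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.apache.derby.jdbc.EmbeddedDriver</value>
</property>

<!-- Network-server alternative: a host:port URL needs the client driver
     (derbyclient.jar on the classpath) and a running Derby Network Server. -->
<!--
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby://localhost:1527/metastore_db;create=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.apache.derby.jdbc.ClientDriver</value>
</property>
-->
```

If you do want the host:port form, the driver and URL must be swapped as a pair, and the server has to be started first (e.g. with the startNetworkServer script from the Derby bin directory).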

0 Answers