
We are running Spark Streaming to consume a feed from Kafka. Now we are trying to use the Phoenix JDBC driver to extract some data from HBase. When I run the code in local mode it runs fine without any issues, but when I run it with yarn-cluster it throws the exception below.

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:190)

Below is the code snippet:

def check(arg0: String, arg1: String): String = {
  val query = "query"

  // Register the Phoenix JDBC driver
  Class.forName("org.apache.phoenix.jdbc.PhoenixDriver")
  val conn = DriverManager.getConnection("jdbc:phoenix:IP:zkport:/znode")
  try {
    val statement = conn.createStatement()
    val resultSet = statement.executeQuery(query)
    var out = ""
    while (resultSet.next()) {
      out = resultSet.getString("..")
    }
    out
  } finally {
    conn.close()
  }
}

And the SBT dependency added is

libraryDependencies += "org.apache.phoenix" % "phoenix-core" % "4.5.1-HBase-1.0"

I manually checked for the missing class, and it is present in the phoenix-core jar. What is the reason behind YARN/Spark throwing this exception? The same issue is reported at Apache Phoenix (4.3.1 and 4.4.0-HBase-0.98) on Spark 1.3.1 ClassNotFoundException, but I tried adding it as a separate classpath entry and that doesn't work either. Could someone help me resolve this?
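For context, in yarn-cluster mode the driver and executors run on cluster nodes, so a jar that is only on the submitting machine's classpath is not visible to them even though local mode works. One common workaround is to ship the Phoenix jar explicitly with spark-submit. This is only a sketch: the jar path, the main class `com.example.Main`, and the assembly jar name are assumptions, not taken from the question; the Phoenix version matches the SBT dependency above.

```shell
# Ship the Phoenix jar to the driver and executors with --jars.
# The jar path, main class, and application jar name below are
# placeholders; substitute your actual values.
spark-submit \
  --master yarn-cluster \
  --jars /path/to/phoenix-core-4.5.1-HBase-1.0.jar \
  --class com.example.Main \
  app-assembly.jar
```

Jars passed via --jars are distributed to the YARN containers and added to both the driver and executor classpaths.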


1 Answer


Adding phoenix-core to classpath.txt worked, though it is a weird issue that this was necessary at all.
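For anyone who does not have (or does not want to edit) a distribution-managed classpath.txt, an equivalent approach is to point the driver and executors at the jar through Spark's extra-classpath properties. This is a hedged sketch: the jar path is an assumed location, and it must exist at that path on every node for the executor setting to work.

```shell
# Alternative to editing classpath.txt: prepend the Phoenix jar to the
# driver and executor classpaths. The path is a placeholder and must be
# valid on all cluster nodes.
spark-submit \
  --master yarn-cluster \
  --conf spark.driver.extraClassPath=/path/to/phoenix-core-4.5.1-HBase-1.0.jar \
  --conf spark.executor.extraClassPath=/path/to/phoenix-core-4.5.1-HBase-1.0.jar \
  --class com.example.Main \
  app-assembly.jar
```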
