
I am trying to connect to a Hive instance installed on a cloud instance using Apache Beam on Google Cloud Dataflow. When I run the pipeline, I get the exception below. It only happens when I access this database through Apache Beam; I have seen many related questions, but none of them are about Apache Beam or Google Dataflow.

(c9ec8fdbe9d1719a): java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Method not supported)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:289)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:261)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:55)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:43)
at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:78)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:152)
at com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:272)
at com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:125)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:105)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:92)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Method not supported)
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
at org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn$auxiliary$8CR0LcYI.invokeSetup(Unknown Source)
at com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:65)
at com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:47)
at com.google.cloud.dataflow.worker.runners.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:100)
at com.google.cloud.dataflow.worker.runners.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:70)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:365)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:278)
... 14 more
Caused by: java.sql.SQLException: Cannot create PoolableConnectionFactory (Method not supported)
at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2294)
at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2039)
at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1533)
at org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn.setup(JdbcIO.java:377)
Caused by: java.sql.SQLException: Method not supported
at org.apache.hive.jdbc.HiveConnection.isValid(HiveConnection.java:898)
at org.apache.commons.dbcp2.DelegatingConnection.isValid(DelegatingConnection.java:918)
at org.apache.commons.dbcp2.PoolableConnection.validate(PoolableConnection.java:283)
at org.apache.commons.dbcp2.PoolableConnectionFactory.validateConnection(PoolableConnectionFactory.java:357)
at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2307)
at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2290)
at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2039)
at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1533)
at org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn.setup(JdbcIO.java:377)
at org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn$auxiliary$8CR0LcYI.invokeSetup(Unknown Source)
at com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:65)
at com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:47)
at com.google.cloud.dataflow.worker.runners.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:100)
at com.google.cloud.dataflow.worker.runners.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:70)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:365)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:278)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:261)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:55)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:43)
at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:78)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:152)
at com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:272)
at com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:125)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:105)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:92)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

With the same connection string and driver files, I can connect to this instance using a plain Java JDBC program.
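
For reference, that standalone test is essentially the following minimal sketch (the host, credentials, and the exact query are placeholders; the real test uses the same driver class and JDBC URL as the pipeline below):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveJdbcTest {
        public static void main(String[] args) throws Exception {
            // Same driver class and JDBC URL as in the Beam pipeline.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection connection = DriverManager.getConnection(
                    "jdbc:hive2://<external IP of compute instance>:10000/dbtest",
                    "username", "password");
                 Statement statement = connection.createStatement();
                 ResultSet resultSet = statement.executeQuery(
                         "select c_customer_id, c_first_name, c_last_name from dbtest.customer limit 10")) {
                while (resultSet.next()) {
                    System.out.println(resultSet.getString("c_customer_id"));
                }
            }
        }
    }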

I have been debugging this for a while now and am unable to find a solution. Can anyone give me any ideas?

Please see the code snippet for connecting to Hive below:

PCollection<Customer> collection = dataflowPipeline.apply(JdbcIO.<Customer>read()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration
                .create("org.apache.hive.jdbc.HiveDriver",
                        "jdbc:hive2://<external IP of compute instance>:10000/dbtest")
                .withUsername("username").withPassword("password"))
        .withQuery(
                "select c_customer_id,c_first_name,c_last_name,c_preferred_cust_flag,c_birth_day from dbtest.customer")
        .withRowMapper(new JdbcIO.RowMapper<Customer>() {
            @Override
            public Customer mapRow(ResultSet resultSet) throws Exception {
                // Map each row of the result set to a Customer object.
                Customer customer = new Customer();
                customer.setC_customer_id(resultSet.getString("c_customer_id"));
                customer.setC_first_name(resultSet.getString("c_first_name"));
                customer.setC_last_name(resultSet.getString("c_last_name"));
                customer.setC_preferred_cust_flag(resultSet.getString("c_preferred_cust_flag"));
                customer.setC_birth_day(resultSet.getInt("c_birth_day"));
                return customer;
            }
        }).withCoder(AvroCoder.of(Customer.class)));

1 Answer


Apache Commons DBCP's BasicDataSource uses the method isValid() to validate connections, and that method was not implemented by older versions of the Hive JDBC driver - see "JDBC to hive connection fails on invalid operation isValid()".

However, the method is implemented in versions after Hive 2.1.0: https://github.com/apache/hive/commit/2d2ab0942482a6ce1523dd9dd0f4094865e93b28

Can you use a newer version of the Hive JDBC driver?
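
If you are not sure which driver version actually ends up on the worker classpath, a standalone check like the minimal sketch below (URL and credentials are placeholders) reproduces the validation call that DBCP makes: older drivers throw "Method not supported" from Connection.isValid(), while drivers that implement it simply return a boolean.

    import java.sql.Connection;
    import java.sql.DatabaseMetaData;
    import java.sql.DriverManager;

    public class HiveIsValidCheck {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection connection = DriverManager.getConnection(
                    "jdbc:hive2://<hive-host>:10000/dbtest", "username", "password")) {
                DatabaseMetaData metaData = connection.getMetaData();
                System.out.println("Driver: " + metaData.getDriverName()
                        + " " + metaData.getDriverVersion());
                // DBCP's BasicDataSource calls isValid() when validating new connections.
                // Old hive-jdbc drivers throw SQLException("Method not supported") here;
                // newer drivers return true/false.
                System.out.println("isValid(5) -> " + connection.isValid(5));
            }
        }
    }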

  • Hi @jkff. I forgot to update the question here. After seeing this issue, I did some analysis and tried version 2.1.1 of hive-jdbc. Now I am getting this issue - util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Could not open client transport with JDBC Uri: jdbc:hive2://:3306/db/: java.net.ConnectException: Connection timed out) at dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:289). Looks like it's not getting the connection. – Balu Jul 12 '17 at 06:48
  • This looks like a regular network issue rather than a Dataflow issue. Are you able to connect to the same database using the same parameters from a regular Java program rather than a Dataflow pipeline? – jkff Jul 12 '17 at 15:51
  • Yes, I could successfully pull the data using a normal Java program. I wonder how many threads Dataflow could create when it starts reading from the db. Connection pooling was not enabled initially because Derby was configured for Hive; later we changed it to MySQL. But the low-spec VM with Cloudera installed was really slow and could not handle the pooling because of limited resources. Do you think my assumption is right? – Balu Jul 14 '17 at 15:04
  • Do you have a dataflow job id I can look at? – jkff Jul 14 '17 at 15:12
  • How would you look at a job that is hosted in my cloud instance? Should I add you to the sample project? I couldn't complete this PoC task because of these issues. Let me know how I should share that job id. @jkff – Balu Jul 17 '17 at 09:09
  • I'm an engineer on the Dataflow team so I can read internal Dataflow service logs associated with your job and can have limited read-only access to parts of the Cloud Console of your project for specific purpose of debugging a particular Dataflow job on your request. – jkff Jul 17 '17 at 23:10
  • 2017-07-09_23_59_23-12283830328272607211 : let me know if you require any other details. – Balu Jul 20 '17 at 09:16
  • It looks like your workers are repeatedly getting "Invalid status 74" protocol errors from Thrift when trying to connect to the specified IP address. I don't know what this means and I was unable to figure it out :-| It is definitely not a firewall issue, as Thrift is able to connect to Hive and receive some data; it's just that the data is invalid. But it is also not Dataflow-specific. Sorry, I'm all out of ideas - I'd suggest asking another question on SO quoting your Hive/Thrift error message, so that Hive/Thrift experts can take a look. – jkff Jul 20 '17 at 15:59
  • Thanks Eugene for helping out. It was an issue with the cluster. Another team is handling that now. – Balu Aug 04 '17 at 18:55