
I am trying to create a table in the Hive metastore using Shark by executing the following command:

CREATE TABLE src(key int, value string);

but I always get:

FAILED: Hive Internal Error: java.util.NoSuchElementException(null)

I read about the same issue in the shark-users Google group, but found no solution there.

My Spark version is 0.8.1, my Shark version is 0.8.1, and the Hive binary version is 0.9.0.

I have hive-0.10.0 pre-installed from CDH 4.5.0, but I can't use it since Shark 0.8.1 is not yet compatible with hive-0.10.0.

I can run queries like SELECT * FROM table_name; but no CREATE TABLE query works. Even trying to create a cached table fails.
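For context, Shark 0.8 marks a table as in-memory either via the `shark.cache` table property or by giving the table a name ending in `_cached`; a minimal sketch of the cached-table form (the table name here is illustrative, reusing the `src` table from above) would be:

```sql
-- Shark treats tables whose names end in _cached as in-memory tables
CREATE TABLE src_cached AS SELECT * FROM src;
```

In this setup, even this form fails with the same NoSuchElementException as the plain CREATE TABLE.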

If I try an sbt build using HADOOP_VERSION=2.0.0cdh4.5.0, I get a DistributedFileSystem error and am not able to run any query at all.

I am in dire need of a solution; I'll be glad if somebody can point me in the right direction. I am using a MySQL database for the metastore, not Derby.

Mateusz Dymczyk
ravihemnani

1 Answer

I encountered a similar problem, and it seems to occur only in Shark 0.8.1. I solved it by reverting to Spark and Shark 0.8.0, and that works fine.

0.8.0 and 0.8.1 are very similar in functionality, and unless you specifically need something added between the two releases, you are better off staying with 0.8.0.

By the way, if you intend to build those two from source, the environment variables are SPARK_HADOOP_VERSION and SHARK_HADOOP_VERSION, not just HADOOP_VERSION.
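As a rough sketch of what that looks like (directory names are assumptions for your checkouts, and the Hadoop version string should match your CDH distribution exactly; note CDH4 versions are usually hyphenated, e.g. 2.0.0-cdh4.5.0, not 2.0.0cdh4.5.0):

```shell
# Build Spark 0.8.0 against a CDH4 Hadoop release.
# SPARK_HADOOP_VERSION selects the Hadoop client libraries to compile against.
cd spark-0.8.0
SPARK_HADOOP_VERSION=2.0.0-cdh4.5.0 sbt/sbt assembly

# Build Shark 0.8.0 against the same Hadoop version, so the
# client jars on both sides match the cluster.
cd ../shark-0.8.0
SHARK_HADOOP_VERSION=2.0.0-cdh4.5.0 sbt/sbt package
```

Mismatched Hadoop client jars between Spark/Shark and the cluster are a common cause of the DistributedFileSystem errors you mentioned.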

laughing_man
  • I was able to solve it. You are right about SPARK_HADOOP_VERSION and SHARK_HADOOP_VERSION; I also removed the Hadoop jars corresponding to the older version (Hadoop 1), which were causing the issue. – ravihemnani Jun 05 '14 at 11:32