
I am using Spark 1.6.

I am trying to query a table from my Spark SQL Java code with:

JavaSparkContext js = new JavaSparkContext();
SQLContext sc = new SQLContext(js); 

DataFrame mainFile = sc.sql("Select * from db.table");

This gives me a "table not found" exception.

But when I run the same query in spark-shell using Scala, it works fine. The table is accessible and I can print out the data as well.

Any inputs on this issue?


1 Answer

spark-shell provides a HiveContext by default, which can read table definitions from the Hive metastore; a plain SQLContext cannot. If you want to use HiveContext in your Java code, add the spark-hive dependency to your application and then create a HiveContext instead of a SQLContext. Please refer to http://spark.apache.org/docs/1.6.2/sql-programming-guide.html#hive-tables

<dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>1.6.2</version>
</dependency>
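With that dependency in place, the Java code from the question can be adapted roughly like this (a sketch for Spark 1.6; "db.table" is the placeholder table name from the question, and the app name is arbitrary):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.hive.HiveContext;

public class HiveTableQuery {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("HiveTableQuery");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // HiveContext (not SQLContext) talks to the Hive metastore,
        // so tables defined in Hive become visible to Spark SQL.
        HiveContext hiveContext = new HiveContext(jsc.sc());

        DataFrame mainFile = hiveContext.sql("SELECT * FROM db.table");
        mainFile.show();

        jsc.stop();
    }
}
```

Note that HiveContext extends SQLContext, so existing code written against SQLContext keeps working after the swap.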