I am using Spark 1.6.
I am trying to query a table from my Spark SQL Java code like this:
JavaSparkContext js = new JavaSparkContext();          // picks up master/appName from the submit properties
SQLContext sc = new SQLContext(js);                    // plain SQLContext built on the JavaSparkContext
DataFrame mainFile = sc.sql("SELECT * FROM db.table");
This fails with a "Table not found" exception.
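
For reference, here is a minimal, self-contained version of what I am running. The class name and the main-method wrapper are just placeholders I added for this post; the context setup and the SQL are exactly what my job does.

import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class TableQuery {                              // placeholder class name
    public static void main(String[] args) {
        // No-arg constructor: loads master/appName from the properties passed at submit time
        JavaSparkContext js = new JavaSparkContext();
        SQLContext sc = new SQLContext(js);

        // "db.table" is the same database.table I can query from spark-shell
        DataFrame mainFile = sc.sql("SELECT * FROM db.table");
        mainFile.show();                               // never reached: sql() already throws "Table not found"

        js.stop();
    }
}
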
But when I run the same query in spark-shell using Scala, it works fine: the table is found and I can print out the data as well.
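
For comparison, this is roughly what I do in spark-shell, using the shell's pre-built sqlContext (same database and table as above):

// in spark-shell
val mainFile = sqlContext.sql("SELECT * FROM db.table")
mainFile.show()                                        // prints the rows as expected
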
Any input on this issue?