Hi coders, I'm back again. I'm trying to create a Hive table from a DataFrame using HiveContext in my Scala code. I'm able to do it with sqlContext, but when it comes to HiveContext it throws this error:
[error] /home/mapr/avroProject/src/main/scala/AvroConsumer.scala:75: object HiveContext in package hive cannot be accessed in package org.apache.spark.sql.hive
[error] HiveContext sqlContext = new org.apache.spark.sql.hive.HiveContext(sc.sc());
I've tried the same thing with a slightly different declaration as well:
val hiveContext = org.apache.spark.sql.hive.HiveContext(sc)
I have added the sbt library dependency too:
libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "1.6.1"
I tried it with "provided" scope as well.
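For reference, a minimal build.sbt for this setup would look something like the following. This is only a sketch based on my project path (/home/mapr/avroProject); I'm assuming Scala 2.10 and Spark 1.6.1 throughout to match the spark-hive line above, and the Kafka artifact is there only because my stream carries (key, value) pairs from Kafka:

name := "avroProject"

scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "1.6.1",
  "org.apache.spark" % "spark-streaming_2.10" % "1.6.1",
  "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.1",
  "org.apache.spark" % "spark-hive_2.10" % "1.6.1"
)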
Here is my piece of code:
messages.foreachRDD(rdd =>
{
  import org.apache.spark.sql.hive.HiveContext
  HiveContext sqlContext = new org.apache.spark.sql.hive.HiveContext(sc.sc()); // line 75, the line the error points at
  //import org.apache.spark.sql.hive._
  val dataframe = sqlContext.read.json(rdd.map(_._2))
  val df = dataframe.toDF()
})
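For what it's worth, this is the shape I'm aiming for. A sketch, assuming messages is a Kafka DStream of (key, value) string pairs whose values are JSON; the lazy singleton is the usual pattern for reusing one HiveContext across micro-batches instead of constructing a new one per RDD:

import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

// Lazily create a single HiveContext and reuse it for every micro-batch
object HiveContextSingleton {
  @transient private var instance: HiveContext = _
  def getInstance(sc: SparkContext): HiveContext = synchronized {
    if (instance == null) instance = new HiveContext(sc)
    instance
  }
}

messages.foreachRDD { rdd =>
  if (!rdd.isEmpty()) {
    val sqlContext = HiveContextSingleton.getInstance(rdd.sparkContext)
    // Each record's value is a JSON string; parse the batch into a DataFrame
    val df = sqlContext.read.json(rdd.map(_._2))
    df.registerTempTable("mdl_events")
  }
}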
Is there any fix for this? I've never come across this "not accessible" error before.
I also tried to create a temp table from the code:
val dataframe = sqlContext.read.json(rdd.map(_._2))
val df = dataframe.toDF()
df.registerTempTable("mdl_events")
But where can I find the mdl_events table? Is there a default database in Spark where I can look for it? I cannot find it from spark-shell.
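My current understanding (please correct me if this assumption is wrong): registerTempTable only registers the DataFrame in that SQLContext's in-memory catalog, so the table lives only inside this application and never reaches the Hive metastore, which would explain why a separate spark-shell can't see it. If I want a real Hive table, I believe a sketch like the one below is the way; the table name is from my code, and the append mode is just my choice:

// Temp table: visible only through this sqlContext, within this application
df.registerTempTable("mdl_events")
sqlContext.sql("SELECT * FROM mdl_events LIMIT 10").show()

// Persistent Hive table: written to the metastore (default database),
// visible from other sessions such as spark-shell or the Hive CLI
df.write.mode("append").saveAsTable("mdl_events")
sqlContext.sql("SHOW TABLES").show()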