
I am trying to run the SparkOnHBase example mentioned here -> Spark On Hbase

But I am just trying to compile and run the code on my local Windows machine. My build.sbt snippet is below:

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
libraryDependencies += "com.typesafe" % "config" % "1.3.1"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.3.1"
libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.3.1"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.3.1"

As mentioned in the link, I am just trying to use this code:

val hbaseContext = new HBaseContext(sc, config)
rdd.hbaseBulkDelete(hbaseContext,
  tableName,
  putRecord => new Delete(putRecord),
  4)

But I am not even able to resolve HBaseContext. I don't know which package to import.


1 Answer


I use Maven, but the dependencies should be the same. I assume you are missing this dependency:

<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-spark</artifactId>
    <version>1.2.0-cdh5.7.0</version>
</dependency>
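
Since you are using sbt, the equivalent would be something like the sketch below. Note that a CDH-versioned artifact is hosted in Cloudera's repository, so you would also need the extra resolver; the version shown is only an example, so pick the one matching your cluster:

resolvers += "Cloudera" at "https://repository.cloudera.com/artifactory/cloudera-repos/"

libraryDependencies += "org.apache.hbase" % "hbase-spark" % "1.2.0-cdh5.7.0"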
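With hbase-spark on the classpath, the missing class should resolve. As a minimal sketch (assuming the Apache hbase-spark module layout, where HBaseContext lives in org.apache.hadoop.hbase.spark and hbaseBulkDelete is an implicit added by HBaseRDDFunctions; the table name and RDD name here are placeholders):

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.TableName
import org.apache.hadoop.hbase.client.Delete
import org.apache.hadoop.hbase.spark.HBaseContext
import org.apache.hadoop.hbase.spark.HBaseRDDFunctions._ // adds hbaseBulkDelete to RDDs

// sc is the SparkContext; config is a standard HBase configuration
val config = HBaseConfiguration.create()
val hbaseContext = new HBaseContext(sc, config)

// rowKeys: RDD[Array[Byte]] of row keys to delete (hypothetical name)
rowKeys.hbaseBulkDelete(hbaseContext,
  TableName.valueOf("myTable"),  // placeholder table name
  rowKey => new Delete(rowKey),  // build a Delete from each row key
  4)                             // batch size per flush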