
I want to work with the Kafka integration for Spark Streaming. I am using Spark version 2.0.0.

But I get an unresolved dependency error ("unresolved dependency: org.apache.spark#spark-sql-kafka-0-10_2.11;2.0.0: not found").

How can I access this package? Or am I doing something wrong or missing something?

My build.sbt file:

name := "Spark Streaming"
version := "0.1"
scalaVersion := "2.11.11"
val sparkVersion = "2.0.0"

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion,
    "org.apache.spark" %% "spark-sql" % sparkVersion,
    "org.apache.spark" %% "spark-streaming" % sparkVersion,
    "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion
)
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.0-preview"

Thank you for your help.


1 Answer

The spark-sql-kafka-0-10 artifact (the Kafka source for Structured Streaming) was not published for Spark 2.0.0; its first released version appears to be 2.0.2, which is why the dependency cannot be resolved. For Spark 2.0.0, use the DStream-based Kafka connector instead:

https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10_2.11

libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.0.0"

Also remove the extra "spark-streaming_2.11" % "2.0.0-preview" line from your build.sbt: it pulls in a different Spark version than the 2.0.0 artifacts already declared.
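
With that dependency in place, here is a minimal sketch of consuming a Kafka topic through the DStream API. The broker address, group id, and topic name below are placeholders for your own setup:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object KafkaStreamExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Spark Streaming").setMaster("local[*]")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Kafka consumer settings; bootstrap.servers and group.id are placeholders
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "spark-streaming-example",
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    // Subscribe to a placeholder topic and print each record's value per batch
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Array("test-topic"), kafkaParams)
    )
    stream.map(_.value).print()

    ssc.start()
    ssc.awaitTermination()
  }
}

Note that when you run this with spark-submit, the connector is not bundled with Spark itself, so either build a fat jar (e.g. with sbt-assembly) or pass --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.0.0.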