
I have been trying to use elastic4s in my Spark application, but every time it tries to send data to my Elasticsearch node I get:

java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
        at org.elasticsearch.threadpool.ThreadPool.<clinit>(ThreadPool.java:190)
        at org.elasticsearch.client.transport.TransportClient$Builder.build(TransportClient.java:131)
        at com.sksamuel.elastic4s.ElasticClient$.transport(ElasticClient.scala:111)
        at com.sksamuel.elastic4s.ElasticClient$.remote(ElasticClient.scala:92)

I'm not sure where to even start debugging this error. The code is fairly simple:

val elasticAddress = getEnvirometalParameter("streaming_pipeline", "elastic_address")(0)._1
val uri = ElasticsearchClientUri("elasticsearch://" + elasticAddress)
val client = ElasticClient.remote(uri)

def elasticInsert(subject: String, predicate: String, obj: String, label: String) = {
  client.execute {
    update id (label + subject + predicate + obj) in "test" / "quad" docAsUpsert (
      "subject" -> subject,
      "predicate" -> predicate,
      "object" -> obj,
      "label" -> label
    )
  }
}
  • Looks to be a Guava issue: something else on the classpath has a newer version of Guava, but Elasticsearch wants 18. – theMadKing Sep 13 '16 at 18:12
  • Elasticsearch and Spark don't seem to play nicely due to Guava. In Elasticsearch 5.0 they have removed the dependency on Guava which is very nice. – sksamuel Nov 07 '16 at 00:42

1 Answer


The issue is that Elasticsearch and Spark pull in conflicting versions of shared dependencies such as Guava and Netty. The versions are binary-incompatible, so you get these `NoSuchMethodError`-style exceptions at runtime.
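One way to confirm such a clash is to check which jar a class is actually loaded from at runtime. This `WhichJar` helper is a hypothetical sketch (not part of elastic4s or Spark); in the failing Spark application you would pass it `"com.google.common.util.concurrent.MoreExecutors"` to see which Guava jar won:

```scala
// Report the jar (or directory) a class was loaded from, to see which
// version of a dependency actually ended up on the classpath.
object WhichJar {
  def locate(className: String): String = {
    val cls = Class.forName(className)
    Option(cls.getProtectionDomain.getCodeSource)
      .map(_.getLocation.toString)
      .getOrElse("(bootstrap classpath)") // core JDK classes report no code source
  }

  def main(args: Array[String]): Unit = {
    // A class that is always present is used here as a demo; in the Spark app
    // you would ask for com.google.common.util.concurrent.MoreExecutors instead.
    println(locate("scala.Option"))
  }
}
```

If the printed jar is not the Guava version Elasticsearch expects (18 at this point), you have found the clash.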

Since version 5.3 of elastic4s, your best bet is to use the HttpClient, which has no dependency on Netty or Guava.
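A minimal sketch of switching to the HTTP client. The `elastic4s-http` module name and the 5.x-style API shown in the comments are assumptions — check the elastic4s docs for your exact version:

```scala
// build.sbt (sketch): the HTTP client ships as its own module, with no
// transitive dependency on Elasticsearch's TransportClient stack.
libraryDependencies += "com.sksamuel.elastic4s" %% "elastic4s-http" % "5.3.0"

// In the application (assumed 5.x-style API):
// import com.sksamuel.elastic4s.ElasticsearchClientUri
// import com.sksamuel.elastic4s.http.HttpClient
// val client = HttpClient(ElasticsearchClientUri("elasticsearch://" + elasticAddress))
```

Because the HTTP client talks to Elasticsearch over plain REST, it no longer shares Guava or Netty classes with Spark's classpath, which is what triggers the `NoSuchMethodError` above.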

sksamuel