There are a couple of previous questions about this error, with answers, but the answers don't give clear enough information to actually solve the problem.
I am using Apache Spark to ingest data into Elasticsearch. We are using X-Pack security and its corresponding transport client. The transport client is used to create/delete indices in a few special cases, and Spark handles the ingestion. When our code reaches client.close(), the following exception is thrown:
Exception in thread "elasticsearch[_client_][generic][T#2]" java.lang.NoSuchMethodError: io.netty.bootstrap.Bootstrap.config()Lio/netty/bootstrap/BootstrapConfig;
at org.elasticsearch.transport.netty4.Netty4Transport.lambda$stopInternal$5(Netty4Transport.java:443)
at org.apache.lucene.util.IOUtils.close(IOUtils.java:89)
at org.elasticsearch.common.lease.Releasables.close(Releasables.java:36)
at org.elasticsearch.common.lease.Releasables.close(Releasables.java:46)
at org.elasticsearch.common.lease.Releasables.close(Releasables.java:51)
at org.elasticsearch.transport.netty4.Netty4Transport.stopInternal(Netty4Transport.java:426)
at org.elasticsearch.transport.TcpTransport.lambda$doStop$5(TcpTransport.java:959)
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:569)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
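For reference, the client is created and closed roughly like this. This is a minimal sketch; the cluster name, host, and credentials are placeholders, not our real values:
import java.net.InetAddress
import org.elasticsearch.common.settings.Settings
import org.elasticsearch.common.transport.InetSocketTransportAddress
import org.elasticsearch.xpack.client.PreBuiltXPackTransportClient

// Placeholder settings; the real cluster name, host, and credentials differ.
val settings = Settings.builder()
  .put("cluster.name", "my-cluster")
  .put("xpack.security.user", "elastic:changeme")
  .build()

val client = new PreBuiltXPackTransportClient(settings)
  .addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("es-host"), 9300))

// ... create/delete indices via client.admin().indices() in the special cases ...

client.close() // the NoSuchMethodError above is thrown while the transport shuts down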
At first, I believed that the X-Pack transport client was picking up the Netty version that comes in with Spark, so I excluded it. Even after excluding it, we run into the same issue. Here is our set of dependencies:
libraryDependencies ++= Seq(
"com.crealytics" % "spark-excel_2.11" % "0.9.1" exclude("io.netty", "netty-all"),
"com.github.alexarchambault" %% "scalacheck-shapeless_1.13" % "1.1.6" % Test,
"com.holdenkarau" % "spark-testing-base_2.11" % "2.2.0_0.7.4" % Test exclude("org.scalatest", "scalatest_2.11") ,
"com.opentable.components" % "otj-pg-embedded" % "0.9.0" % Test,
"org.apache.spark" % "spark-core_2.11" % "2.2.0" % "provided" exclude("org.scalatest", "scalatest_2.11") exclude("io.netty", "netty-all"),
"org.apache.spark" % "spark-sql_2.11" % "2.2.0" % "provided" exclude("org.scalatest", "scalatest_2.11") exclude("io.netty", "netty-all"),
"org.apache.spark" % "spark-hive_2.11" % "2.2.0" % "provided" exclude("org.scalatest", "scalatest_2.11") exclude("io.netty", "netty-all"),
"org.apache.logging.log4j" % "log4j-core" %"2.8.2",
"org.elasticsearch" % "elasticsearch-spark-20_2.11" % "5.5.0" exclude("org.scalatest", "scalatest_2.11") exclude("io.netty", "netty-all"),
"org.elasticsearch.client" % "x-pack-transport" % "5.5.0",
"org.elasticsearch.client" % "transport" % "5.5.0",
"org.elasticsearch.test" % "framework" % "5.4.3" % Test,
"org.postgresql" % "postgresql" % "42.1.4",
"org.scalamock" %% "scalamock-scalatest-support" % "3.5.0" % Test,
"org.scalatest" % "scalatest_2.11" % "3.0.1" % Test,
"org.scalacheck" %% "scalacheck" % "1.13.4" % Test,
"org.scalactic" %% "scalactic" % "3.0.1",
"org.scalatest" %% "scalatest" % "3.0.1" % Test,
"mysql" % "mysql-connector-java" % "5.1.44"
)
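For completeness, an exclusion can also be expressed per organization with excludeAll rather than naming netty-all. This is only a sketch of that alternative form (using spark-core as the example), not something that resolved the issue:
// Sketch: drop every io.netty artifact this dependency pulls in, instead of excluding "netty-all" by name.
"org.apache.spark" % "spark-core_2.11" % "2.2.0" % "provided" excludeAll ExclusionRule(organization = "io.netty")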
I verified with sbt dependencyTree that SBT is not excluding netty from Spark and spark-excel, and I'm not sure why. We're using SBT 1.0.4.
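Since the build-time dependency tree and the runtime classpath can disagree (spark-submit adds Spark's own jars), one quick way to see which Netty actually wins at runtime is to print where the Bootstrap class is loaded from. A small diagnostic sketch, run from inside the job:
// Diagnostic: show which jar io.netty.bootstrap.Bootstrap was loaded from at runtime.
val bootstrapJar = classOf[io.netty.bootstrap.Bootstrap].getProtectionDomain.getCodeSource.getLocation
println(s"Bootstrap loaded from: $bootstrapJar")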
UPDATE: spark-submit/Spark was the culprit; see the answer below!