
My application uses a K8s CronJob to schedule each run, so a new pod is created for each occurrence. In most cases the application runs fine, but some runs fail with the following error:

  java.lang.ExceptionInInitializerError
  2022-12-23T10:45:32.899555393Z    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  2022-12-23T10:45:32.899572625Z    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  2022-12-23T10:45:32.899576309Z    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  2022-12-23T10:45:32.899590250Z    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
  2022-12-23T10:45:32.899601166Z    at java.base/java.util.concurrent.ForkJoinTask.getThrowableException(ForkJoinTask.java:600)
  2022-12-23T10:45:32.899604720Z    at java.base/java.util.concurrent.ForkJoinTask.reportException(ForkJoinTask.java:678)
  2022-12-23T10:45:32.899608169Z    at java.base/java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:737)
  2022-12-23T10:45:32.899610934Z    at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateParallel(ForEachOps.java:159)
  2022-12-23T10:45:32.899613557Z    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateParallel(ForEachOps.java:173)
  2022-12-23T10:45:32.899629628Z    at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233)
  2022-12-23T10:45:32.899633296Z    at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497)
  2022-12-23T10:45:32.899637988Z    at java.base/java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:661)
  [...]
  2022-12-23T10:45:32.899959490Z Caused by: java.lang.ExceptionInInitializerError
  2022-12-23T10:45:32.899972035Z    at org.apache.spark.package$.<init>(package.scala:93)
  2022-12-23T10:45:32.899985422Z    at org.apache.spark.package$.<clinit>(package.scala)
  2022-12-23T10:45:32.899990812Z    at org.apache.spark.SparkContext.$anonfun$new$1(SparkContext.scala:183)
  2022-12-23T10:45:32.899996770Z    at org.apache.spark.internal.Logging.logInfo(Logging.scala:54)
  2022-12-23T10:45:32.899998834Z    at org.apache.spark.internal.Logging.logInfo$(Logging.scala:53)
  2022-12-23T10:45:32.900029297Z    at org.apache.spark.SparkContext.logInfo(SparkContext.scala:73)
  2022-12-23T10:45:32.900031985Z    at org.apache.spark.SparkContext.<init>(SparkContext.scala:183)
  2022-12-23T10:45:32.900033999Z    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2526)
  2022-12-23T10:45:32.900049879Z    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:930)
  2022-12-23T10:45:32.900055477Z    at scala.Option.getOrElse(Option.scala:189)
  2022-12-23T10:45:32.900061375Z    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
  [...]
  2022-12-23T10:45:32.900111374Z    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
  2022-12-23T10:45:32.900134359Z    at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
  2022-12-23T10:45:32.900137775Z    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
  2022-12-23T10:45:32.900145849Z    at java.base/java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:290)
  2022-12-23T10:45:32.900159046Z    at java.base/java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:746)
  2022-12-23T10:45:32.900178955Z    at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
  2022-12-23T10:45:32.900183168Z    at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
  2022-12-23T10:45:32.900206355Z    at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
  2022-12-23T10:45:32.900214981Z    at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
  2022-12-23T10:45:32.900227137Z    at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
  2022-12-23T10:45:32.900377715Z Caused by: org.apache.spark.SparkException: Could not find spark-version-info.properties
  2022-12-23T10:45:32.900381131Z    at org.apache.spark.package$SparkBuildInfo$.<init>(package.scala:62)
  2022-12-23T10:45:32.900396317Z    at org.apache.spark.package$SparkBuildInfo$.<clinit>(package.scala)
  2022-12-23T10:45:32.900399396Z    ... 25 more

How the Spark session is being built:

lazy val spark: SparkSession = SparkSession.builder
  .appName("My application")
  .master("local") // Error occurs here
  .getOrCreate()
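The root cause at the bottom of the trace (`Could not find spark-version-info.properties`) means that file is not on the runtime classpath: Spark's package object loads it while `SparkContext` is being constructed, which is why the failure surfaces at `getOrCreate()`. A minimal diagnostic sketch to run inside the image (the object and method names here are illustrative, not part of the application):

```scala
import java.io.InputStream

// Diagnostic sketch: check whether a named resource is visible on the
// runtime classpath. If "spark-version-info.properties" is absent
// (e.g. dropped while assembling a fat jar), SparkContext construction
// fails exactly as in the trace above.
object ResourceCheck {
  def resourceExists(name: String): Boolean = {
    val stream: InputStream =
      Thread.currentThread().getContextClassLoader.getResourceAsStream(name)
    if (stream != null) { stream.close(); true } else false
  }

  def main(args: Array[String]): Unit =
    println(s"spark-version-info.properties on classpath: " +
      resourceExists("spark-version-info.properties"))
}
```

If this prints `false` in the failing pods but `true` locally, the problem is in how the image or jar is assembled, not in the application code.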

Spark version: 2.4.8

Scala version: 2.12

Has anyone already had a similar problem?

Julia Bel
  • Is the code getting built every time for execution, or is it using a prebuilt image at run time? – Anand Satheesh Dec 23 '22 at 13:56
  • @AnandSatheesh prebuilt image – Julia Bel Dec 23 '22 at 14:58
  • Can you check [this question?](https://stackoverflow.com/questions/42751816/spark-version-info-properties-not-found-in-jenkins) It seems you need to add this file somewhere. Did you remove some properties files while merging libs? – Mikhail Ionkin Dec 23 '22 at 20:01
  • Could you try building the code after adding a `spark-version-info.properties` file with the appropriate Spark version mentioned in it, creating the folder structure `./core/target/extra-resources` in your build path? – Anand Satheesh Dec 24 '22 at 02:09
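Following the suggestion in the comment above, a sketch of generating the missing file under that folder structure. The property keys mirror the ones Spark's build normally writes into this file; the values below are placeholders for illustration, not real build info:

```scala
import java.nio.charset.StandardCharsets
import java.nio.file.{Files, Paths}

// Sketch: write a spark-version-info.properties file under the path
// suggested in the comments. Values are placeholders, not real build info.
object WriteVersionInfo {
  def main(args: Array[String]): Unit = {
    val dir = Paths.get("core", "target", "extra-resources")
    Files.createDirectories(dir)
    val contents =
      """version=2.4.8
        |user=unknown
        |revision=unknown
        |branch=unknown
        |date=unknown
        |url=unknown
        |""".stripMargin
    Files.write(dir.resolve("spark-version-info.properties"),
      contents.getBytes(StandardCharsets.UTF_8))
  }
}
```

The file then has to end up inside the application jar so it is on the classpath at run time; if a fat-jar merge step is discarding conflicting properties files, that step also needs to keep this one.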

0 Answers