I generated the protobuf code with protoc 2.4 and built my application jar. This works fine and I can successfully run my Spark job. The protobuf-related jars on Spark's classpath are:
mesos-0.18.1-shaded-protobuf.jar,
protobuf-java-2.5.0-spark.jar
But when I use the same generated code in my sbt unit tests, they fail with:
[info] org.apache.spark.SparkException: Job aborted due to stage
failure: Task 0 in stage 2.0 failed 1 times, most recent failure:
Lost task 0.0 in stage 2.0 (TID 2, localhost):
java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
[info] at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
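
To check which protobuf runtime a given JVM actually loads, I assume a quick probe like the following would work (a diagnostic sketch, not part of my setup; it could be run in spark-shell, the Spark driver, or an sbt console, and uses only the standard CodeSource API):

val src = classOf[com.google.protobuf.GeneratedMessage]
  .getProtectionDomain.getCodeSource
// null means the class came from the bootstrap classpath
println(if (src == null) "bootstrap classpath" else src.getLocation)
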
From a Google search I found that I needed to switch the protoc compiler to version 2.5, and now my unit tests pass. But my application no longer runs on Spark. The exception I get is:
java.lang.VerifyError: class xxx.xxx.xx..
overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
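
If it helps narrow this down: the VerifyError message itself says getUnknownFields() is final in the runtime my application sees, while the UnsupportedOperationException above came from a runtime where subclasses are expected to override it. Assuming that distinction holds between 2.4.x and 2.5.x, I suppose a reflection probe like this (a sketch, not something I have run) could tell which runtime a JVM loaded:

import java.lang.reflect.Modifier
val m = classOf[com.google.protobuf.GeneratedMessage].getMethod("getUnknownFields")
// final => 2.4.x-style runtime; non-final => 2.5.x-style runtime
println(if (Modifier.isFinal(m.getModifiers)) "2.4.x-style runtime" else "2.5.x-style runtime")
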
The machine where my application runs and the machine where my sbt tests run are different.
The sbt classpath contains:
protobuf-java-2.5.0.jar
protobuf-java-2.5.0-spark.jar
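
If the problem is that two protobuf runtimes clash on the test classpath (both protobuf-java-2.5.0.jar and the -spark variant are present), I assume an sbt dependency override could pin a single version; a build.sbt sketch, with the standard Maven coordinates:

dependencyOverrides += "com.google.protobuf" % "protobuf-java" % "2.5.0"
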
I had a look at the pom file inside mesos-0.18.1-shaded-protobuf.jar, and it shows that the jar contains google-protobuf version 2.5.
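
Since shading can relocate the protobuf classes to a different package, I realize the pom alone may not tell the whole story. A small sketch to list the jar's protobuf entries, assuming the jar is available at a local path:

import java.util.jar.JarFile
import scala.collection.JavaConverters._
val jar = new JarFile("mesos-0.18.1-shaded-protobuf.jar") // hypothetical local path
// Relocated (shaded) package names should not clash with plain com.google.protobuf
jar.entries.asScala.map(_.getName).filter(_.contains("protobuf")).foreach(println)
jar.close()
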
The question is: what in the environment is causing this issue?