I created a Spark Structured Streaming application using Spring Boot. It runs fine with bootRun, but deploying the same jar with spark-submit gives the following error:

org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'defaultValidator' defined in class path resource [org/springframework/boot/autoconfigure/validation/ValidationAutoConfiguration.class]: Invocation of init method failed; nested exception is java.lang.NoSuchMethodError: javax.validation.BootstrapConfiguration.getClockProviderClassName()Ljava/lang/String;

Here is my dependency list from build.gradle:

compile("org.springframework.boot:spring-boot-starter-security")
compile('org.apache.kafka:kafka-streams')
compileOnly('org.projectlombok:lombok:1.18.2')
testCompile('org.springframework.boot:spring-boot-starter-test')
testCompile('org.springframework.security:spring-security-test')
compile('org.apache.spark:spark-sql_2.11:2.3.1')
compile('org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.1')
compile('org.apache.spark:spark-streaming_2.11:2.3.1')
compile('org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.1')
compile group: 'com.google.protobuf', name: 'protobuf-java', version: '3.6.1'
compile('org.springframework.boot:spring-boot-starter-web')
compile group: 'javax.validation', name: 'validation-api', version: '2.0.1.Final'
testCompile group: 'junit', name: 'junit', version: '4.12'
compile "redis.clients:jedis:2.9.0"
compile group: 'org.eclipse', name: 'yasson', version: '1.0.1'
compile (group: 'org.glassfish', name: 'javax.json', version: '1.1.2')
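
(A quick diagnostic, not from the original post: to confirm which validation-api version Gradle resolves into the application, assuming the legacy compile configuration used above, you can run:

    gradle dependencies --configuration compile | grep validation

This should show javax.validation:validation-api:2.0.1.Final on the application side.)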

Help Required! Any suggestions?

1 Answer

Reason: Spark ships with an earlier version of 'javax.validation:validation-api', which does not have the getClockProviderClassName method.
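
You can verify this (a sketch, assuming a standard Spark distribution where the bundled dependency jars live under $SPARK_HOME/jars):

    ls $SPARK_HOME/jars | grep validation
    # Spark 2.3.x typically bundles validation-api-1.1.0.Final.jar;
    # getClockProviderClassName was only added in Bean Validation 2.0 (validation-api 2.0.x)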

Options like "userClassPathFirst" can help; see: Classpath resolution between spark uber jar and spark-submit --jars when similar classes exist in both
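
For example (a sketch based on the command shown in the comments below; passing the newer validation-api jar explicitly via --jars is my assumption, not part of the original answer):

    spark-submit \
      --deploy-mode cluster \
      --master yarn \
      --conf spark.driver.userClassPathFirst=true \
      --conf spark.executor.userClassPathFirst=true \
      --jars validation-api-2.0.1.Final.jar \
      mySparkJob-1.0-SNAPSHOT.jar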

– pasha701
  • Thank you for the reply, but it didn't work. I tried the following config: new SparkConf().setAppName(appName).setMaster(masterUri).set("spark.driver.userClassPathFirst", "true").set("spark.executor.userClassPathFirst", "true"); Still getting the same error, even when deploying in cluster mode. – RichieNotSoRich Sep 23 '18 at 18:42
  • The option has to be passed as a spark-submit parameter. Setting it in SparkConf is too late; the application has already started with the incorrect libraries. – pasha701 Sep 23 '18 at 19:42
  • Still the same error! Here is the spark-submit command: spark-submit --deploy-mode cluster --master yarn --conf spark.driver.userClassPathFirst=true --conf spark.executor.userClassPathFirst=true mySparkJob-1.0-SNAPSHOT.jar – RichieNotSoRich Sep 23 '18 at 20:24