
I'm trying to use SpEL expressions inside a Spark UDF (Scala). The SpEL expressions can be simple math operations or custom functions, possibly using the java.lang.Math class. When I run it locally everything works fine, but when I build a fat jar and deploy it on Databricks, I get the following exception:

SparkException: Job aborted due to stage failure: Task 0 in stage 159.0 failed 4 times, most recent failure: Lost task 0.3 in stage 159.0 (TID 1290, 172.16.9.7, executor 2): java.lang.NoSuchMethodError: org.springframework.core.convert.support.DefaultConversionService.getSharedInstance()Lorg/springframework/core/convert/ConversionService;

I get this exception when the SpEL expression contains a custom function or the java.lang.Math class, but it works when the expression only involves simple math operations such as addition, subtraction, and multiplication.

Searching online suggests that this is caused by version incompatibilities between Spring jars, but 1. I'm only using spring-expression, and 2. I don't see any version incompatibilities.

This is my UDF:

def calculateSpelExpression: (Seq[Double], String, mutable.WrappedArray[String]) => Double =
  (values: Seq[Double], expression: String, variableNames: mutable.WrappedArray[String]) => {
    val context = new StandardEvaluationContext()
    val parser = new SpelExpressionParser()

    variableNames.zipWithIndex.foreach { iter =>
      val name = iter._1.tail // use .tail to drop the leading '#' from the variable name
      val value = values(iter._2)
      context.setVariable(name, value)
    }

    val value = parser.parseExpression(expression).getValue(context)
    if (value.isInstanceOf[Int]) value.asInstanceOf[Int].toDouble
    else value.asInstanceOf[Double]
  }
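
For reference, here is a minimal sketch (not part of the original question) of how a Function3 like this would typically be registered and applied; the DataFrame, column names, and sample expression below are invented for illustration:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, udf}

    val spark = SparkSession.builder().getOrCreate()
    import spark.implicits._

    // Hypothetical input: one row with two variables and a SpEL expression
    // that references java.lang.Math, the case that fails on Databricks.
    val df = Seq(
      (Seq(1.0, 4.0), "#a + T(java.lang.Math).sqrt(#b)", Seq("#a", "#b"))
    ).toDF("values", "expression", "variables")

    val spelUdf = udf(calculateSpelExpression)

    df.withColumn("result", spelUdf(col("values"), col("expression"), col("variables"))).show()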

and my pom.xml:


    <dependencies>
        <dependency>
            <groupId>com.typesafe</groupId>
            <artifactId>config</artifactId>
            <version>1.4.0</version>
        </dependency>

        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-expression</artifactId>
            <version>5.2.0.RELEASE</version>
        </dependency>

        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>${spark.version}</version>
        </dependency>

        <dependency>
            <groupId>org.scalatest</groupId>
            <artifactId>scalatest_${scala.compat.version}</artifactId>
            <version>3.0.8</version>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-math3</artifactId>
            <version>3.2</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.11</artifactId>
            <version>${spark.version}</version>
        </dependency>


        <dependency>
            <groupId>io.delta</groupId>
            <artifactId>delta-core_2.11</artifactId>
            <version>0.4.0</version>
<!--            <scope>provided</scope>-->
        </dependency>

        <dependency>
            <groupId>org.postgresql</groupId>
            <artifactId>postgresql</artifactId>
            <version>42.2.10</version>
        </dependency>

        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.11</artifactId>
            <version>${spark.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>io.netty</groupId>
                    <artifactId>netty-all</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
    </dependencies>
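
For what it's worth, the Spring artifacts that Maven actually resolves for this pom, including transitive ones, can be listed with the dependency plugin (the includes filter below is just an example):

    mvn dependency:tree -Dincludes=org.springframework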
naski

1 Answer


It appears that you somehow have an old version of the spring-core jar on the classpath.

NoSuchMethodError: org.springframework.core.convert.support.DefaultConversionService.getSharedInstance()

That method was added in 4.3.5.

/**
 * Return a shared default {@code ConversionService} instance,
 * lazily building it once needed.
 * <p><b>NOTE:</b> We highly recommend constructing individual
 * {@code ConversionService} instances for customization purposes.
 * This accessor is only meant as a fallback for code paths which
 * need simple type coercion but cannot access a longer-lived
 * {@code ConversionService} instance any other way.
 * @return the shared {@code ConversionService} instance (never {@code null})
 * @since 4.3.5
 */
public static ConversionService getSharedInstance() {
Gary Russell
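
If the older spring-core turns out to come from the Databricks cluster classpath rather than from the fat jar itself, one possible workaround (a sketch under that assumption, not verified against this exact setup) is to relocate the Spring packages inside the fat jar with the maven-shade-plugin, so the 5.2.0 classes bundled with the jar are the ones the executors load:

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.2.1</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <relocations>
                            <!-- Rewrites org.springframework.* references in the fat jar
                                 so they cannot collide with a Spring version already
                                 present on the cluster classpath. -->
                            <relocation>
                                <pattern>org.springframework</pattern>
                                <shadedPattern>shaded.org.springframework</shadedPattern>
                            </relocation>
                        </relocations>
                    </configuration>
                </execution>
            </executions>
        </plugin>
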
  • Thanks for your response Gary. I thought it was weird that it works for simple operations but not for custom classes. Shouldn't the dependency issue affect simple operations as well? Do you have any idea how I can work this out? Since I'm only using the **spring-expression** jar, I have no idea how this is possible. – naski Feb 22 '20 at 18:02
  • Maybe the simple expressions don't involve conversion. Passing -verbose to the JVM will tell you where classes are loaded from. – Gary Russell Feb 22 '20 at 21:19
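
On Databricks, where adding -verbose to the JVM options can be inconvenient, a quick alternative (just a sketch) is to ask the classes themselves where they came from, both on the driver and inside an executor task:

    import org.springframework.core.SpringVersion
    import org.springframework.core.convert.support.DefaultConversionService

    // Which spring-core version is actually on the classpath.
    println(SpringVersion.getVersion)

    // Which jar DefaultConversionService was loaded from (the code source may be
    // null for classes loaded by the bootstrap class loader).
    println(classOf[DefaultConversionService].getProtectionDomain.getCodeSource)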