
My code compiles against Scala 2.12.10 but has to run on Scala 2.12.15. My code:

import scala.tools.nsc.Settings

val settings = new Settings
settings.usejavacp.value = false

Accessing `usejavacp` throws

java.lang.NoSuchMethodError: scala.tools.nsc.Settings.usejavacp()Lscala/tools/nsc/settings/AbsSettings$AbsSetting;

because `Settings` is a `StandardScalaSettings`, and the definition of that interface changed as follows (only the relevant API is shown):

2.12.10:

public interface scala.tools.nsc.settings.StandardScalaSettings {
  // other APIs
  public abstract scala.tools.nsc.settings.MutableSettings$BooleanSetting usejavacp();
}

to

2.12.15:

public interface scala.tools.nsc.settings.StandardScalaSettings {
  // other APIs
  public abstract scala.tools.nsc.settings.AbsSettings$AbsSetting usejavacp();
}

Is there any way I can make this work without upgrading my dependencies? Can I use reflection?

  • *"My code compiles against scala 2.12.10 but has to run on scala 2.12.15"* Why do you need this? Why can't you upgrade the dependencies? – Dmytro Mitin Mar 04 '23 at 16:36

1 Answer


Yes, runtime reflection works in both Scala 2.12.10 and 2.12.15.

You can replace

settings.usejavacp.value = false

with

import scala.reflect.runtime
import scala.reflect.runtime.universe._
import scala.tools.nsc.settings.MutableSettings

val rm = runtime.currentMirror
// look the method up by name, so the changed return type in the
// compiled signature is never linked against
val method = typeOf[MutableSettings].member(TermName("usejavacp")).asMethod
rm.reflect(settings).reflectMethod(method)().asInstanceOf[settings.BooleanSetting].value = false
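Alternatively, plain Java reflection should also work here, since `Method.invoke` resolves the method at runtime rather than linking against the compiled signature. A minimal sketch (the object name is just for illustration):

```scala
import scala.tools.nsc.Settings

object UseJavaCp {
  def main(args: Array[String]): Unit = {
    val settings = new Settings
    // look usejavacp up by name; the return type recorded in the
    // bytecode signature never gets linked this way
    val method = classOf[Settings].getMethod("usejavacp")
    method.invoke(settings).asInstanceOf[settings.BooleanSetting].value = false
  }
}
```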



By the way, regarding your deleted question *IncompatibleClassChangeError: org.apache.spark.sql.catalyst.plans.logical.LeafNode*:

You can try runtime compilation. It seems to work in both Spark 3.1.2 and 3.3.0.

// Spark 3.1.2

import scala.reflect.runtime.universe._
import scala.reflect.runtime
import scala.tools.reflect.ToolBox // libraryDependencies += scalaOrganization.value % "scala-compiler" % scalaVersion.value exclude("org.scala-lang.modules", "scala-xml_2.12")

object App {
  val rm = runtime.currentMirror
  val tb = rm.mkToolBox()
  val catalyst        = q"org.apache.spark.sql.catalyst"
  val Attribute       = tq"$catalyst.expressions.Attribute"
  val PredicateHelper = tq"$catalyst.expressions.PredicateHelper"
  val LeafNode        = tq"$catalyst.plans.logical.LeafNode"
  val sym = tb.define(
    q"""
      case class MyClass() extends $LeafNode with $PredicateHelper {
        override def output: Seq[$Attribute] = Seq()
      }
    """.asInstanceOf[ClassDef]
  ).asClass
}
// Spark 3.3.0

import scala.reflect.runtime.universe._

object Main {
  def main(args: Array[String]): Unit = {
    println(App.tb.eval(q"new ${App.sym}()")) // prints MyClass()
  }
}
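One caveat when building on this: only trees (symbols, types) can be spliced into quasiquotes, not runtime values. If you need to hand an existing runtime value to the toolbox-defined class, one option is to evaluate a function tree once and then apply it to the value. A minimal, Spark-free sketch (`Wrapper` and `SpliceDemo` are hypothetical stand-ins, not part of the Spark example above):

```scala
import scala.reflect.runtime.universe._
import scala.reflect.runtime
import scala.tools.reflect.ToolBox

object SpliceDemo {
  def main(args: Array[String]): Unit = {
    val rm = runtime.currentMirror
    val tb = rm.mkToolBox()
    // define a class at runtime, as tb.define does for MyClass above
    val sym = tb.define(q"case class Wrapper(n: Int)".asInstanceOf[ClassDef]).asClass
    // the class symbol can be spliced into the tree; the runtime value (42)
    // is passed as an ordinary argument to the compiled function instead
    val ctor = tb.eval(q"(n: Int) => new $sym(n)").asInstanceOf[Int => Any]
    println(ctor(42)) // Wrapper(42)
  }
}
```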
  • I'm trying the runtime compilation solution: `object Reflector { /* same as App in your post */ val Filter = tq"$catalyst.plans.logical.Filter"; val sym = tb.define(q"""case class MyClass(delegate: $Filter) extends $LeafNode with $PredicateHelper { /* class details */ }""".asInstanceOf[ClassDef]).asClass }` and I'm trying to use it like so: `def doWork(filter: Filter): LogicalPlan = { val result = Reflector.tb.eval(q"${Reflector.sym}($filter)").asInstanceOf[LeafNode] }` but – stackoverflowflowflwofjlw Mar 06 '23 at 22:48
  • I don't understand how `filter` is passed as a parameter to the `MyClass` constructor. I run into exception `Can't unquote org.apache.spark.sql.catalyst.plans.logical.Filter, consider providing an implicit instance of Liftable[org.apache.spark.sql.catalyst.plans.logical.Filter]` – stackoverflowflowflwofjlw Mar 06 '23 at 22:55
  • I tried defining an implicit `implicit val liftableFilter: universe.Liftable[Filter] = Liftable[Filter] { filter => q"_root_.org.apache.spark.sql.catalyst.plans.logical.Filter(${filter.condition}, ${filter.child})" }` but the `filter` being used to get `condition` and `child` isn't the same as the one being passed to `doWork(filter: Filter)`...how do I use that one? – stackoverflowflowflwofjlw Mar 06 '23 at 22:56
  • @stackoverflowflowflwofjlw In order to instantiate runtime-generated `MyClass` you can use either ordinary runtime reflection https://scastie.scala-lang.org/DmytroMitin/KKHN7MPfQxOb7sOzrX6DNQ/1 https://scastie.scala-lang.org/DmytroMitin/KKHN7MPfQxOb7sOzrX6DNQ/2 or runtime compilation (toolbox) once again https://scastie.scala-lang.org/DmytroMitin/KKHN7MPfQxOb7sOzrX6DNQ/3 https://scastie.scala-lang.org/DmytroMitin/KKHN7MPfQxOb7sOzrX6DNQ/4 – Dmytro Mitin Mar 07 '23 at 13:35
  • @stackoverflowflowflwofjlw You can splice (`${...}`) only trees (symbols, types) into trees (`q"..."`). You can't splice actual values (`filter` in `def doWork(filter: Filter)`). They exist at different stages, trees at compile time (the toolbox compile time inside runtime), values at runtime. – Dmytro Mitin Mar 07 '23 at 13:35
  • @stackoverflowflowflwofjlw Please don't post large amounts of code in comments. It's better to undelete your deleted question, edit the current one, or open a new one and post the code with proper indentation. – Dmytro Mitin Mar 07 '23 at 13:38
  • New question https://stackoverflow.com/questions/75666717/how-to-runtime-compile-with-reflection-for-a-class – stackoverflowflowflwofjlw Mar 07 '23 at 20:18