
I am using Scala 2.13, Spark 3.3.0 and the latest MSSQL Spark connector, "com.microsoft.azure" % "spark-mssql-connector_2.12" % "1.3.0-BETA". This is how I am inserting data into MSSQL:

df.write
      .format("com.microsoft.sqlserver.jdbc.spark")
      .mode(SaveMode.Append)
      .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
      .option("url", jdbcUrl)
      .option("dbtable", "mytable")
      .option("tableLock", value = false)
      .option("schemaCheckEnabled", value = false)
      .save()

I am getting this exception:

Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.Map.$plus(Lscala/Tuple2;)Lscala/collection/immutable/Map;
    at com.microsoft.sqlserver.jdbc.spark.DefaultSource.createRelation(DefaultSource.scala:55)
    at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)

The same code works if I drop the Scala version from 2.13 to 2.12. Currently no spark-mssql-connector build for Scala 2.13 is available. Is there any workaround to get rid of this error while staying on Scala 2.13?

Shalaj

2 Answers


The short answer is that, in Scala 2, you can't use dependencies that weren't compiled for the same Scala minor version; they are not binary compatible.
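As a minimal sketch of why this matters (assuming an sbt build; your actual project file may look different), Scala dependencies are normally declared with `%%`, which appends the project's Scala binary suffix, whereas a hard-coded `_2.12` suffix silently mixes 2.12 bytecode into a 2.13 build:

```scala
// build.sbt — illustrative sketch only, not taken from the question's project
scalaVersion := "2.13.10"

libraryDependencies ++= Seq(
  // %% appends the project's Scala binary suffix (_2.13 here), so dependency
  // resolution fails immediately if no 2.13 artifact has been published:
  "org.apache.spark" %% "spark-sql" % "3.3.0" % "provided",

  // A plain % with a hard-coded _2.12 suffix resolves fine, but it pulls
  // 2.12-compiled bytecode into a 2.13 build and fails at runtime instead
  // (the NoSuchMethodError from the question):
  "com.microsoft.azure" % "spark-mssql-connector_2.12" % "1.3.0-BETA"
)
```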

Binary Compatibility

In Scala 2, different minor versions of the compiler were free to change the way they encode language features in JVM bytecode, so each bump of the compiler's minor version broke binary compatibility. If a project had any Scala dependencies, they all needed to be (cross-)compiled for the same minor Scala version used in the project itself. By contrast, Scala 3 has a stable encoding into JVM bytecode.
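To make the error in the question concrete: the connector was compiled against 2.12, where adding an entry to a `Map` is emitted as a call to `scala.collection.immutable.Map.$plus(Tuple2)` returning `immutable.Map`. The 2.13 collections redesign changed the erased signature of that method, so the 2.12-compiled call site can't be resolved at runtime. A rough illustration (the connector's actual code at DefaultSource.scala:55 may differ):

```scala
// Roughly what a 2.12-compiled library does internally when it adds an entry
// to an options map (illustrative, not the connector's actual code):
val parameters: Map[String, String] = Map("url" -> "jdbc:sqlserver://...")

// Compiled with Scala 2.12 this line becomes a call to
//   scala.collection.immutable.Map.$plus(Lscala/Tuple2;)Lscala/collection/immutable/Map;
// That exact method signature no longer exists in the 2.13 collections library,
// hence the NoSuchMethodError when 2.12 bytecode runs against a 2.13 classpath.
val withDriver = parameters + ("driver" -> "com.microsoft.sqlserver.jdbc.SQLServerDriver")
```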

As you can see in the artifact repository of spark-mssql-connector, as of today there is no published release that is compatible with Scala 2.13.

As you can see in the GitHub repo of sql-spark-connector, there is an open issue titled Need support for scala version 2.13, opened on May 2, 2023, with no activity yet.

The Support section in the README of the project says:

Support

The Apache Spark Connector for Azure SQL and SQL Server is an open source project. This connector does not come with any Microsoft support. For issues with or questions about the connector, please create an Issue in this project repository. The connector community is active and monitoring submissions.

If it's mandatory for the connector to be compatible with Scala 2.13, you can do the upgrade yourself. Alternatively, you can wait until the upgrade is done upstream, but you have no guarantee of when that will happen.

Gastón Schabas
  • I did the upgrade myself and now it is working, thanks – Shalaj Aug 03 '23 at 18:28
  • @Shalaj that's awesome. Did you modify the project to build and publish it against multiple versions of Scala ([Cross Building with maven](https://github.com/davidB/scala-maven-plugin/wiki/Frequently-Asked-Questions#cross-building-with-maven))? In that case it would be nice if you created a PR with what you did. If you just changed some properties in the `pom.xml` as a workaround, please post a comment in the issue detailing what you did. If you consider that my answer was enough to solve your question, feel free to mark it as the solution – Gastón Schabas Aug 03 '23 at 19:42
  • I just updated pom.xml, no code change – Shalaj Aug 04 '23 at 04:30
  • Sounds good. Could you post the steps you followed to produce the new version compatible with Scala 2.13 in the [reported issue](https://github.com/microsoft/sql-spark-connector/issues/225)? Which lines you changed in the pom, which values you set, the command executed to build the new version, etc. It will help others who run into the same issue in the near future. You can also mention this post if you want, but it is not required – Gastón Schabas Aug 04 '23 at 04:48
  • It seems I don't have permission to create a branch and raise a PR. Anyway, I added my comment on https://github.com/microsoft/sql-spark-connector/issues/225. Earlier I was not able to insert records through JDBC as mentioned in my question; after updating pom.xml it worked fine – Shalaj Aug 04 '23 at 05:22
  • You can't create a branch in the repo if you are not a member of the project, since that requires permissions, but you can [fork it and submit a PR](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request-from-a-fork). What you did is not exactly a solution, just a workaround: doing that, you would force anyone who wants to upgrade the connector to also upgrade to Scala 2.13. – Gastón Schabas Aug 04 '23 at 05:45
  • It would be a new release for Scala 2.13. Ideally there would be a single release supporting both 2.12 and 2.13, like spark-core, which always gives us the option to choose between 2.12 and 2.13 – Shalaj Aug 04 '23 at 05:53

You can't have a different Scala version than your Spark runtime: they must be built against the same Scala binary version (e.g. 2.12 or 2.13), and sometimes even the same patch version if there are bugs.
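If it helps, a quick way to confirm which Scala and Spark versions the target runtime actually ships is to run something like this on the driver (e.g. in spark-shell); this is just one option among several:

```scala
// Run on the driver (e.g. in spark-shell) to see what the runtime was built with:
println(scala.util.Properties.versionString) // e.g. "version 2.13.8"
println(org.apache.spark.SPARK_VERSION)      // e.g. "3.3.0"
```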

Is there a reason why using the 2.12, which works, isn't enough?

Chris
  • We are moving to the 2.13 version; the Spark runtime only has a 2.13 build, but the MSSQL connector doesn't have a 2.13 dependency available – Shalaj Aug 03 '23 at 15:45
  • What's driving the change to move to 2.13, if you don't mind me asking? It seems you are creating an issue where there doesn't need to be one. – Chris Aug 03 '23 at 16:29
  • We have to deploy our code to another cluster which has Spark 3.3.0 and Scala 2.13, so we are going to create a new repo with all the latest versions – Shalaj Aug 03 '23 at 16:33
  • Ah, and you presumably have no control over that cluster. Well, artefact-wise you'll have to build https://github.com/microsoft/sql-spark-connector yourself, it seems; not a great answer, but there is no other way. – Chris Aug 03 '23 at 17:16