
When I try to read data from Snowflake and MSSQL systems with PySpark by creating a DataFrame, I get this error:

py4j.protocol.Py4JJavaError: An error occurred while calling o36.load.
: java.lang.NoClassDefFoundError: scala/$less$colon$less
        at net.snowflake.spark.snowflake.DefaultSource.shortName(DefaultSource.scala:44)
        at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$lookupDataSource$2(DataSource.scala:657)
        at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$lookupDataSource$2$adapted(DataSource.scala:657)
        at scala.collection.TraversableLike.$anonfun$filterImpl$1(TraversableLike.scala:304)
        at scala.collection.Iterator.foreach(Iterator.scala:943)
        at scala.collection.Iterator.foreach$(Iterator.scala:943)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
        at scala.collection.IterableLike.foreach(IterableLike.scala:74)
        at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
        at scala.collection.TraversableLike.filterImpl(TraversableLike.scala:303)
        at scala.collection.TraversableLike.filterImpl$(TraversableLike.scala:297)
        at scala.collection.AbstractTraversable.filterImpl(Traversable.scala:108)
        at scala.collection.TraversableLike.filter(TraversableLike.scala:395)
        at scala.collection.TraversableLike.filter$(TraversableLike.scala:395)
        at scala.collection.AbstractTraversable.filter(Traversable.scala:108)
        at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:657)
        at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSourceV2(DataSource.scala:725)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:207)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:171)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:282)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
        at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
        at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.ClassNotFoundException: scala.$less$colon$less
        at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)

Please help with this.

I tried to extract data from Snowflake and MSSQL systems using PySpark.
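
Roughly, the reads look like the sketch below (account, credentials, and table names are placeholders, not the exact code I ran):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("extract").getOrCreate()

# Snowflake read -- all connection values below are placeholders
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>",
}

snowflake_df = (
    spark.read
    .format("net.snowflake.spark.snowflake")  # Spark-Snowflake connector source
    .options(**sf_options)
    .option("dbtable", "<table>")
    .load()
)

# MSSQL read over JDBC -- again placeholder values
mssql_df = (
    spark.read
    .format("jdbc")
    .option("url", "jdbc:sqlserver://<host>:1433;databaseName=<database>")
    .option("dbtable", "<table>")
    .option("user", "<user>")
    .option("password", "<password>")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)
```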

Akhil
  • Standard Scala library seems to be not on the classpath. – Dmytro Mitin Nov 29 '22 at 04:01
  • How do you normally add dependencies? Can you try to add `"org.scala-lang" % "scala-library" % [SCALA VERSION]`? https://mvnrepository.com/artifact/org.scala-lang/scala-library – Dmytro Mitin Nov 29 '22 at 22:27
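
For reference, in a PySpark job dependencies are usually supplied as Maven coordinates when the session is built rather than through sbt settings. A minimal sketch of that, picking up on the comment above; the version numbers are only examples, and the `_2.12`/`_2.13` suffix of the connector has to match the Scala build of the Spark installation:

```python
from pyspark.sql import SparkSession

# Sketch: pass the Snowflake connector and JDBC drivers as Maven coordinates.
# Versions are illustrative; pick the builds whose Scala suffix (_2.12 / _2.13)
# matches the Scala version your Spark distribution was compiled against.
packages = ",".join([
    "net.snowflake:spark-snowflake_2.12:2.11.0-spark_3.3",
    "net.snowflake:snowflake-jdbc:3.13.30",
    "com.microsoft.sqlserver:mssql-jdbc:12.2.0.jre8",
])

spark = (
    SparkSession.builder
    .appName("extract")
    .config("spark.jars.packages", packages)
    .getOrCreate()
)
```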
