I am attempting to produce a Spark DataFrame from within Spark, which has been initialised using Apache Livy.
I first noticed this issue on this more complicated HBase call:
import spark.implicits._
...
spark.sparkContext
  .newAPIHadoopRDD(
    conf,
    classOf[TableInputFormat],
    classOf[ImmutableBytesWritable],
    classOf[Result]
  )
  .toDF()
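For completeness, conf here is just the usual Hadoop/HBase configuration pointing TableInputFormat at a table, roughly along these lines (the table name below is a placeholder, not my real one):

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.mapreduce.TableInputFormat

// Standard HBase configuration for the new-API input format; table name is a placeholder
val conf = HBaseConfiguration.create()
conf.set(TableInputFormat.INPUT_TABLE, "my_table")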
But I found I could get the same thing to occur on a simple:
import spark.implicits._
...
val filtersDf = filters.toDF()
where filters is just a sequence of case classes (and filtersDf is the resulting DataFrame).
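A minimal, self-contained version of that second snippet looks roughly like this (the case class shape is a placeholder, not my real one):

import org.apache.spark.sql.SparkSession

// Placeholder case class standing in for my real one
case class Filter(name: String, value: String)

val spark = SparkSession.builder().getOrCreate()
import spark.implicits._

// A plain Scala sequence of case class instances
val filters = Seq(Filter("colour", "red"), Filter("size", "large"))

// This is the call that throws the exception below
val filtersDf = filters.toDF()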
The common factor is the call to .toDF(); however, the same error also occurs with .toDS(), which makes me think that the implicit resolution brought in by import spark.implicits._ is not working. The underlying objects being converted to DataFrames do contain data.
The stack trace suggests the failure happens during runtime implicit resolution, using Scala runtime reflection.
Note that I have checked that Spark and the compiled code both use the same Scala version (2.11).
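(The check on the Spark side was roughly this, printed from inside the Livy session and compared against the scalaVersion in the build file:)

// Prints something like "version 2.11.8" from inside the session
println(scala.util.Properties.versionString)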
The exception I get is:
java.lang.RuntimeException: java.util.NoSuchElementException: head of empty list
scala.collection.immutable.Nil$.head(List.scala:420)
scala.collection.immutable.Nil$.head(List.scala:417)
scala.collection.immutable.List.map(List.scala:277)
scala.reflect.internal.Symbols$Symbol.parentSymbols(Symbols.scala:2117)
scala.reflect.internal.SymbolTable.openPackageModule(SymbolTable.scala:301)
scala.reflect.internal.SymbolTable.openPackageModule(SymbolTable.scala:341)
scala.reflect.runtime.SymbolLoaders$LazyPackageType$$anonfun$complete$2.apply$mcV$sp(SymbolLoaders.scala:74)
scala.reflect.runtime.SymbolLoaders$LazyPackageType$$anonfun$complete$2.apply(SymbolLoaders.scala:71)
scala.reflect.runtime.SymbolLoaders$LazyPackageType$$anonfun$complete$2.apply(SymbolLoaders.scala:71)
scala.reflect.internal.SymbolTable.slowButSafeEnteringPhaseNotLaterThan(SymbolTable.scala:263)
scala.reflect.runtime.SymbolLoaders$LazyPackageType.complete(SymbolLoaders.scala:71)
scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1514)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$1.scala$reflect$runtime$SynchronizedSymbols$SynchronizedSymbol$$super$info(SynchronizedSymbols.scala:174)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:123)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$1.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:174)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.info(SynchronizedSymbols.scala:127)
scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$1.info(SynchronizedSymbols.scala:174)
scala.reflect.internal.Types$TypeRef.thisInfo(Types.scala:2194)
scala.reflect.internal.Types$TypeRef.baseClasses(Types.scala:2199)
scala.reflect.internal.tpe.FindMembers$FindMemberBase.<init>(FindMembers.scala:17)
scala.reflect.internal.tpe.FindMembers$FindMember.<init>(FindMembers.scala:219)
scala.reflect.internal.Types$Type.scala$reflect$internal$Types$Type$$findMemberInternal$1(Types.scala:1014)
scala.reflect.internal.Types$Type.findMember(Types.scala:1016)
scala.reflect.internal.Types$Type.memberBasedOnName(Types.scala:631)
scala.reflect.internal.Types$Type.member(Types.scala:600)
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
scala.reflect.internal.Mirrors$RootsBase.staticPackage(Mirrors.scala:204)
scala.reflect.runtime.JavaMirrors$JavaMirror.staticPackage(JavaMirrors.scala:82)
scala.reflect.internal.Mirrors$RootsBase.init(Mirrors.scala:263)
scala.reflect.runtime.JavaMirrors$class.scala$reflect$runtime$JavaMirrors$$createMirror(JavaMirrors.scala:32)
scala.reflect.runtime.JavaMirrors$$anonfun$runtimeMirror$1.apply(JavaMirrors.scala:49)
scala.reflect.runtime.JavaMirrors$$anonfun$runtimeMirror$1.apply(JavaMirrors.scala:47)
scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
scala.reflect.runtime.JavaMirrors$class.runtimeMirror(JavaMirrors.scala:46)
scala.reflect.runtime.JavaUniverse.runtimeMirror(JavaUniverse.scala:16)
scala.reflect.runtime.JavaUniverse.runtimeMirror(JavaUniverse.scala:16)
My working assumption is that I am missing a dependency or import and that this is some kind of Scala-ism.
I have yet to find any other references to this issue. Ultimately I think it is probably down to imports/dependencies, but so far I can't see what is missing. Any help would be greatly appreciated. I'm keen to know how to fix the issue, or alternatively how to create DataFrames via less magical approaches than .toDF().
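For reference, by a "less magical" approach I mean something like building the schema explicitly and going through createDataFrame, so that nothing depends on runtime reflection. A sketch, reusing the placeholder Filter shape from above:

import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Explicit schema instead of one derived via reflection; field names are placeholders
val schema = StructType(Seq(
  StructField("name", StringType, nullable = false),
  StructField("value", StringType, nullable = false)
))

// Convert each case class instance to a Row by hand
val rows = filters.map(f => Row(f.name, f.value))

// createDataFrame with an explicit schema avoids the implicit encoder machinery
val filtersDf = spark.createDataFrame(spark.sparkContext.parallelize(rows), schema)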
Spark info:
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.2.0-mapr-1901
      /_/
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_191)