I am trying to write Seq[(String, Double)] data from Spark to a Cassandra DB, e.g., Seq(("re", 1.0), ("im", 2.0)), but I get the following exception:
Exception in thread "main" scala.ScalaReflectionException: <none> is not a term
at scala.reflect.api.Symbols$SymbolApi$class.asTerm(Symbols.scala:199)
at scala.reflect.internal.Symbols$SymbolContextApiImpl.asTerm(Symbols.scala:84)
.....
The Spark code is as follows:
def main(args: Array[String]) {
  // omit some code
  val rawTRLStream = KafkaUtils.createDirectStream[String, Array[Byte], StringDecoder, DefaultDecoder](ssc, kafkaParams, topics)

  val parsedTRLStream = rawTRLStream.map {
    case (_, inputStreamData) =>
      // --- do something next
      // ....
      val seq: Seq[(String, Double)] = Seq(("re", 1.0), ("im", 2.0))
      seq
  }

  // A suggestion found on the web, but it does not fix this problem:
  implicit val rowWriter = SqlRowWriter.Factory
  parsedTRLStream.saveToCassandra("simple_avro_data", "simple_avro_data")

  // Kick off the streaming context
  ssc.start()
  ssc.awaitTermination()
  ssc.stop()
}
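For context, the map above produces a DStream whose elements are whole Seqs rather than single rows; a minimal sketch of the types as I understand them (simplified from the code above):

import org.apache.spark.streaming.dstream.DStream

// Each stream element is the entire sequence of pairs, so saveToCassandra
// would have to turn one Seq[(String, Double)] into one Cassandra row.
val parsedTRLStream: DStream[Seq[(String, Double)]] = rawTRLStream.map {
  case (_, inputStreamData) => Seq(("re", 1.0), ("im", 2.0))
}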
The Cassandra schema is as follows:
CREATE TABLE simple_avro_data (
    re double,
    im double,
    PRIMARY KEY ((re), im)
) WITH CLUSTERING ORDER BY (im DESC);
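As I understand the connector, saveToCassandra maps one stream element to one row, so an element matching this table would carry the two doubles directly. A sketch of what I would expect to work, assuming the connector's default case-class mapping (SimpleAvroData and the placeholder values are mine):

import com.datastax.spark.connector._           // SomeColumns
import com.datastax.spark.connector.streaming._ // saveToCassandra on DStream

// One instance per row; field names match the column names re and im.
case class SimpleAvroData(re: Double, im: Double)

val rowStream = rawTRLStream.map { case (_, inputStreamData) =>
  // ... parse inputStreamData into the two doubles ...
  SimpleAvroData(1.0, 2.0) // placeholder values
}

rowStream.saveToCassandra("simple_avro_data", "simple_avro_data",
  SomeColumns("re", "im"))

Is this the right shape, or is there a way to write the Seq[(String, Double)] directly?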
I also tried a suggestion from scala.ScalaReflectionException: &lt;none&gt; is not a term:
val seq = (("re", 1.0), ("im", 2.0))
This removes the "... is not a term" exception, but introduces another one:
com.datastax.spark.connector.types.TypeConversionException: Cannot convert object (re,1.0) of type class scala.Tuple2 to java.lang.Double.
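If I read this error right, the connector treats the outer Tuple2 as one row and tries to put its first component into the first column; a minimal illustration of the shape mismatch (my interpretation, not verified):

// With the second attempt, each element is a pair of pairs:
val element: ((String, Double), (String, Double)) = (("re", 1.0), ("im", 2.0))
// Column re expects a Double, but element._1 is ("re", 1.0), a Tuple2;
// hence "Cannot convert object (re,1.0) ... to java.lang.Double".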
Does anyone know how to solve the problem?
Thanks,