
I am trying to write Seq[(String, Double)] data from Spark to a Cassandra DB, e.g., Seq(("re", 1.0), ("im", 2.0)). But I get the following exception:

Exception in thread "main" scala.ScalaReflectionException: <none> is not a term
    at scala.reflect.api.Symbols$SymbolApi$class.asTerm(Symbols.scala:199)
    at scala.reflect.internal.Symbols$SymbolContextApiImpl.asTerm(Symbols.scala:84)
.....

The Spark code is as follows:

def main(args: Array[String]) {

    // omit some code

    val rawTRLStream = KafkaUtils.createDirectStream[String, Array[Byte], StringDecoder, DefaultDecoder](ssc, kafkaParams, topics)

    val parsedTRLStream = rawTRLStream.map {
        case (_, inputStreamData) =>
            // --- do something next
            // ...

            val seq: Seq[(String, Double)] = Seq(("re", 1.0), ("im", 2.0))
            seq
    }

    implicit val rowWriter = SqlRowWriter.Factory   // This was suggested on the web, but it does not help with this problem.

    parsedTRLStream.saveToCassandra("simple_avro_data", "simple_avro_data")

    // Kick off
    ssc.start()

    ssc.awaitTermination()

    ssc.stop()
}

The Cassandra schema is as follows:

CREATE TABLE simple_avro_data (
   re double,
   im double,
   PRIMARY KEY ((re), im)
) WITH CLUSTERING ORDER BY (im DESC);

I also tried the suggestion from the question "scala.ScalaReflectionException: <none> is not a term":

val seq = (("re", 1.0), ("im", 2.0))          

This removes the "... is not a term" exception, but it introduces another one:

com.datastax.spark.connector.types.TypeConversionException: Cannot convert object (re,1.0) of type class scala.Tuple2 to java.lang.Double.

Does anyone know how to solve the problem?

Thanks,

1 Answer


Ensure you are setting default values if your expected values are missing or null.

For example, we can see the problem a little better if we use SomeColumns and compare code that works with code that will always throw an exception on bad input data.

The following code works safely because the request is always built with either real data or error codes for all 3 columns.

import java.util.regex.Matcher
import com.datastax.spark.connector._   // provides saveToCassandra and SomeColumns

val lines = ssc.socketTextStream(host, port)
// create requests from socket stream
val requests = lines.map(x => {
  val matcher: Matcher = pattern.matcher(x) // using a regex matcher on the values
  if (matcher.matches()) {    // have matches
    val ip = matcher.group(1)
    val request = matcher.group(5)
    val status = matcher.group(6).toInt
    (ip, request, status)  // create the request
  } else {
    ("error", "error", 0)  // no matches, so create an error request
  }
})

requests.foreachRDD((rdd, time) => {
  rdd.cache()
  rdd.saveToCassandra(keyspace, table, SomeColumns("ip", "request", "status"))   // save what we put into the request 
})

I then added a column to my requests, to the database table, and to the saveToCassandra call, but I did not add a column to the else branch.

I got the exception "scala.ScalaReflectionException: <none> is not a term" whenever the data in my stream wasn't what I expected and the else request was created:

val lines = ssc.socketTextStream(host, port)
// create requests from socket stream
val requests = lines.map(x => {
  val matcher: Matcher = pattern.matcher(x) // using a regex matcher on the values
  if (matcher.matches()) {    // have matches
    val ip = matcher.group(1)
    val request = matcher.group(5)
    val status = matcher.group(6).toInt
    val agent = matcher.group(9)   // ADDITIONAL agent column (the group index here is illustrative)
    (ip, request, status, agent)   // create the request + ADDITIONAL agent column
  } else {
    ("error", "error", 0)  // will throw an exception because it only has 3 columns, not 4
  }
})

requests.foreachRDD((rdd, time) => {
  rdd.cache()
  rdd.saveToCassandra(keyspace, table, SomeColumns("ip", "request", "status", "agent"))   // save what we put into the request
})

I needed to add a default value for the new agent column to ensure I was always sending data for all 4 columns:

} else {
  ("error", "error", 0, "error")  // default value for agent so every request has 4 columns
}
Ensure your parsedTRLStream is always populated with the same shape of row when mapping the stream. The RDD will complain if it's trying to save something from nothing.
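Applied to your case, here is a minimal sketch (the parsing of inputStreamData and the placeholder values are hypothetical): emit one fixed-arity (Double, Double) tuple per record so it lines up with the re and im columns of your table, and name the target columns explicitly with SomeColumns.

// minimal sketch: one row per record, shaped to match the re/im columns
import com.datastax.spark.connector._             // SomeColumns
import com.datastax.spark.connector.streaming._   // saveToCassandra on DStreams

val parsedTRLStream = rawTRLStream.map {
  case (_, inputStreamData) =>
    // parse inputStreamData into two doubles here; the values below are placeholders
    val re: Double = 1.0
    val im: Double = 2.0
    (re, im)   // a fixed-arity tuple of doubles, not a Seq[(String, Double)]
}

parsedTRLStream.saveToCassandra("simple_avro_data", "simple_avro_data", SomeColumns("re", "im"))

With two double values per row and the column names given explicitly, the connector should not need to reflect over a Seq or convert a nested Tuple2, which should avoid both exceptions you saw.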

I hope this helps explain the exception a little better.

rlit