
I have a Spark consumer which streams from Kafka. I am trying to manage offsets for exactly-once semantics.

However, while accessing the offset it throws the following exception:

"java.lang.ClassCastException: org.apache.spark.rdd.MapPartitionsRDD cannot be cast to org.apache.spark.streaming.kafka.HasOffsetRanges"

The part of the code that does this is as follows:

var offsetRanges = Array[OffsetRange]()
dataStream
  .transform { rdd =>
    offsetRanges = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
    rdd
  }
  .foreachRDD(rdd => { })

Here dataStream is a direct stream (DStream[String]) created using the KafkaUtils API, something like:

KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, Set(source_schema+"_"+t)).map(_._2)

Can somebody help me understand what I am doing wrong here? transform is the first method in the chain of methods performed on dataStream, as mentioned in the official documentation as well.

Thanks.

Yuval Itzchakov
taransaini43

1 Answer


Your problem is:

.map(_._2)

This creates a mapped DStream (whose underlying RDDs are MapPartitionsRDDs) instead of the DirectKafkaInputDStream created by KafkaUtils.createDirectStream, whose KafkaRDDs implement HasOffsetRanges.

You need to map after transform:

val kafkaStream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, Set(source_schema+"_"+t))

var offsetRanges = Array[OffsetRange]()

kafkaStream
  .transform { rdd =>
    // The RDD here is still a KafkaRDD, which implements HasOffsetRanges
    offsetRanges = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
    rdd
  }
  .map(_._2)
  .foreachRDD { rdd => /* stuff */ }
Yuval Itzchakov
  • also, while trying to create a direct stream using offsets, I am encountering an error:
    val fromOffsets: (TopicAndPartition, Long) = TopicAndPartition(metrics_rs.getString(1), metrics_rs.getInt(2)) -> metrics_rs.getLong(3)
    KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder, (String, String)](ssc, kafkaParams, fromOffsets, messageHandler)
    where val messageHandler = (mmd: MessageAndMetadata[String, String]) => mmd.message.length and metrics_rs is the result set from which I am fetching the offsets map. It says "too many type arguments" error.
    – taransaini43 Sep 12 '16 at 12:49
  • How to read offsetRanges in my code below? I am using repartition.
    val numPartitionsOfInputTopic = 2
    val streams = (1 to numPartitionsOfInputTopic) map { _ => KafkaUtils.createDirectStream[String, String](ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams)).map(_.value()) }
    val unifiedStream = ssc.union(streams)
    val sparkProcessingParallelism = 1
    unifiedStream.repartition(sparkProcessingParallelism)
    More details in https://stackoverflow.com/questions/49344461/spark-kafka-streaming-commitasync-error
    – Gnana Mar 18 '18 at 21:49