
How do I write every record to multiple Kafka topics in Spark Streaming 2.3.1? In other words, if I have 5 records and two output Kafka topics, I want all 5 records written to both output topics.

The question here doesn't cover the Structured Streaming case. I am looking specifically for Structured Streaming.

OneCricketeer
user1870400

1 Answer


Not sure if you are using Java or Scala. Below is Scala code that sends each record to two different topics from within foreachPartition:

import java.util
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig, ProducerRecord}

dataset.foreachPartition { partitionRows =>
  val props = new util.HashMap[String, Object]()
  props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootStrapServer)
  props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
    "org.apache.kafka.common.serialization.StringSerializer")
  props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
    "org.apache.kafka.common.serialization.StringSerializer")
  // Create one producer per partition and reuse it for all rows
  val producer = new KafkaProducer[String, String](props)
  partitionRows.foreach { row =>
    val value = row.get(0).toString.replace("[", "").replace("]", "")
    // Send the same payload to both output topics
    producer.send(new ProducerRecord[String, String]("topic1", value))
    producer.send(new ProducerRecord[String, String]("topic2", value))
  }
  // Flush buffered records and release resources before the partition ends
  producer.close()
}
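If you would rather stay within the Structured Streaming API instead of managing a KafkaProducer by hand, the built-in Kafka sink routes each row by an optional "topic" column, so you can duplicate the stream once per target topic and write with a single query. A sketch, assuming `dataset` has a string-convertible `value` column; `bootStrapServer` and the topic/checkpoint names are placeholders:

```scala
import org.apache.spark.sql.functions.lit

// Duplicate every row once per target topic by tagging each copy with a
// "topic" column; the Kafka sink uses that column to route each row.
val toBothTopics = dataset
  .selectExpr("CAST(value AS STRING) AS value")
  .withColumn("topic", lit("topic1"))
  .union(
    dataset
      .selectExpr("CAST(value AS STRING) AS value")
      .withColumn("topic", lit("topic2")))

val query = toBothTopics.writeStream
  .format("kafka")
  .option("kafka.bootstrap.servers", bootStrapServer)
  .option("checkpointLocation", "/tmp/kafka-two-topics-ckpt")
  .start()
```

This keeps Spark's checkpointing and fault-tolerance guarantees for the write, which the hand-rolled producer in foreachPartition does not give you.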
Rishi Saraf