I have Kafka producer code written in Java that writes Kafka messages, and consumer code that receives these messages.
Is it possible to write these received messages from the consumer to a text file in Java?
If you're writing your own consumer, you should include the logic that writes to a file in the same application. With the prepackaged console consumer you can just pipe its output to a file, for example: kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic your-topic > file.txt (substitute your own broker list and topic name).
Another (code-free) option would be to try StreamSets Data Collector, an open-source, Apache-licensed tool that also has a drag-and-drop UI. It includes built-in connectors for Kafka and a variety of data formats.
*Full disclosure: I'm a committer on this project.
Thanks, guys.
I was able to achieve it. Once the data is received on the consumer side, the rest is just ordinary Java code.
Below is the line of code that prints the message to the console.
System.out.println(String.valueOf(messageAndOffset.offset()) + ": " + new String(bytes, "UTF-8"));
You can accumulate all the messages into a String and write them to the file in one go:
System.out.println(String.valueOf(messageAndOffset.offset()) + ": " + new String(bytes, "UTF-8"));
completMessage += new String(bytes, "UTF-8")+"\n";
Here, new String(bytes, "UTF-8") is the actual message body, and "\n" keeps one message per line.
Finally, write all the messages out to the file:
writeDataToFile(completMessage);
writeDataToFile contains simple Java code that writes a string to a file.
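For reference, here is a minimal sketch of what writeDataToFile might look like, using java.io.BufferedWriter and java.io.FileWriter; the output path is just a placeholder:
private static void writeDataToFile(String completMessage) {
    // Write the accumulated messages to the file in one shot.
    // "messages.txt" is a placeholder path; substitute your own.
    try (BufferedWriter writer = new BufferedWriter(new FileWriter("messages.txt"))) {
        writer.write(completMessage);
    } catch (IOException e) {
        e.printStackTrace();
    }
}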
Thank you.
It is possible. Below is the working code for this.
package com.venk.prac;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.IntegerDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import kafka.utils.ShutdownableThread;
public class FileConsumer extends ShutdownableThread {

    private final KafkaConsumer<Integer, String> kafkaConsumer;
    private final String topic;
    private final String filePath;
    private BufferedWriter buffWriter;

    public FileConsumer(String topic, String filePath) {
        super("FileConsumer", false);
        Properties properties = new Properties();
        // KafkaProperties is a local helper class holding the broker host:port list.
        properties.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG,
                KafkaProperties.KAFKA_BROKER_SERVERS_PORTS_STRING);
        properties.setProperty(ConsumerConfig.GROUP_ID_CONFIG, "FileConsumer");
        properties.setProperty(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "1000");
        properties.setProperty(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "true");
        properties.setProperty(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "30000");
        properties.setProperty(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, IntegerDeserializer.class.getName());
        properties.setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        kafkaConsumer = new KafkaConsumer<Integer, String>(properties);
        this.topic = topic;
        this.filePath = filePath;
        try {
            this.buffWriter = new BufferedWriter(new FileWriter(filePath));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void doWork() {
        // Subscribing is idempotent, so re-asserting it on every cycle is
        // harmless (this mirrors Kafka's own consumer example).
        kafkaConsumer.subscribe(Collections.singletonList(this.topic));
        ConsumerRecords<Integer, String> consumerRecords = kafkaConsumer.poll(1000);
        try {
            // Write each record's value on its own line, then flush so the
            // file stays current between polls.
            for (ConsumerRecord<Integer, String> record : consumerRecords) {
                buffWriter.write(record.value() + System.lineSeparator());
            }
            buffWriter.flush();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public String name() {
        return "FileConsumer";
    }

    @Override
    public boolean isInterruptible() {
        return false;
    }
}
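If it helps, here is a minimal sketch of how you might start this consumer; the topic name and output path are placeholders:
public class FileConsumerDemo {
    public static void main(String[] args) {
        // "test-topic" and the output path below are placeholders; substitute your own.
        FileConsumer consumer = new FileConsumer("test-topic", "/tmp/kafka-messages.txt");
        // ShutdownableThread extends Thread, so start() runs doWork() in a loop
        // until shutdown() is called.
        consumer.start();
    }
}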