I have a Cloudera Quickstart VM. I have installed the Kafka parcels using Cloudera Manager, and Kafka works fine inside the VM with the console-based consumer and producer. But when I try to use a Java-based consumer, it does not consume any messages. I can list the topics, but I cannot consume messages. The following is my code.

package kafka_consumer;

import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class mclass {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "10.0.75.1:9092");
        // Just a user-defined string to identify the consumer group
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "test");
        // Enable auto offset commit
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "true");
        props.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "1000");
        props.setProperty(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // List of topics to subscribe to
            consumer.subscribe(Arrays.asList("second_topic"));
            for (String k_topic : consumer.listTopics().keySet()) {
                System.out.println(k_topic);
            }
            while (true) {
                try {
                    ConsumerRecords<String, String> records = consumer.poll(100);
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("Offset = %d\n", record.offset());
                        System.out.printf("Key    = %s\n", record.key());
                        System.out.printf("Value  = %s\n", record.value());
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }

}

And the following is the output of the code: while the console producer is producing messages, the consumer is not able to receive them.

PS: I can telnet to the IP and port of the Kafka broker, and I can even list the topics. The consumer runs constantly without crashing, but no messages are being consumed.
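For reference, one setting the code above never specifies is `auto.offset.reset`, which a couple of the comments below touch on: a brand-new consumer group with no committed offsets defaults to `latest`, so any records produced before the consumer joins are silently skipped. A minimal sketch of the consumer properties with that key added, using plain string keys so it compiles without the Kafka client on the classpath (`buildProps` is just an illustrative helper, not part of the original code):

```java
import java.util.Properties;

public class ConsumerConfigSketch {
    // Illustrative helper: builds the same consumer properties as the
    // question's code, plus auto.offset.reset so a group with no
    // committed offsets starts from the earliest available record.
    static Properties buildProps(String bootstrap, String groupId) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", bootstrap);
        props.setProperty("group.id", groupId);
        props.setProperty("enable.auto.commit", "true");
        props.setProperty("auto.commit.interval.ms", "1000");
        // Without this line, a new group defaults to "latest" and only
        // sees records produced after it has joined and been assigned
        // partitions.
        props.setProperty("auto.offset.reset", "earliest");
        props.setProperty("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.setProperty("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }

    public static void main(String[] args) {
        Properties p = buildProps("10.0.75.1:9092", "test");
        System.out.println(p.getProperty("auto.offset.reset"));
    }
}
```

The same effect can be had by passing `--from-beginning` to the console consumer, which is why the console tools can appear to work while the Java client sits idle.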

(screenshot of the consumer's console output omitted)

  • who produces anything on second_topic? – thst Dec 27 '19 at 00:45
  • I have another terminal running a console producer for Kafka. It is producing messages, and I can consume those messages with another console-based consumer, but I can't do that in Java. – Justin Syrus Dec 27 '19 at 01:06
  • Debugging shows that it gets stuck on the following line: `ConsumerRecords records = consumer.poll(1000);` – Justin Syrus Dec 27 '19 at 01:16
  • How many records does the producer actually send? Is it flushing the records? Have you tried setting the consumer group offset to the beginning of the topic? Side question: do you really need the whole cloudera VM to run Kafka? – OneCricketeer Dec 27 '19 at 04:58
  • @cricket_007 Right now I'm sending 10-15 records manually, but in production it's going to be a stream of data. I don't think it's flushing records. – Justin Syrus Dec 27 '19 at 13:16
  • @cricket_007 Yes, I did set the offset to read from the earliest offset. Also, my use case is a data-stream pipeline which receives data from streaming sources; Spark Streaming reads data from the Kafka topic and finally writes it to HBase. Currently I'm using a single-node cluster (Quickstart) for this. – Justin Syrus Dec 27 '19 at 13:17
  • What do you see when you run a `$ bin/kafka-consumer-groups.sh --bootstrap-server 10.0.75.1:9092 --describe --group test` in your other terminal? – mazaneicha Dec 27 '19 at 17:40
  • @mazaneicha Hi, thank you for your comment. I see the following error: `Error: Executing consumer group command failed due to org.apache.kafka.common.errors.CoordinatorNotAvailableException: The coordinator is not available.` – Justin Syrus Dec 27 '19 at 20:00
  • If you want to use Spark Streaming, are you just testing a plain consumer first? Also, I believe there's a Kafka Connector for Hbase – OneCricketeer Dec 27 '19 at 22:08
  • @cricket_007 Yes, I'm just testing a plain consumer first. – Justin Syrus Dec 27 '19 at 23:07
  • @mazaneicha I have solved that exception, and now the output of `$ bin/kafka-consumer-groups.sh --bootstrap-server 10.0.75.1:9092 --describe --group test` is `Error: Consumer group 'test' does not exist.` – Justin Syrus Dec 27 '19 at 23:10
  • If the console consumer works, then that's a plain consumer, by the way. In any case, have you tried not mixing `props.put` and `props.setProperty`? – OneCricketeer Dec 27 '19 at 23:20
  • And what does `$ bin/kafka-topics.sh --bootstrap-server 10.0.75.1:9092 --describe --topic second_topic` show? – mazaneicha Dec 29 '19 at 20:19
  • It seems like you resolved the original issue but did not remove the now-irrelevant information, and no additional info has been provided to help resolve the remaining issue. As such, voting to close. – Dennis Jaheruddin Aug 02 '20 at 13:28

0 Answers