
I downloaded kafka_2.10-0.10.0.1 onto both my Windows machine and my Linux machines (I have a cluster of three Linux machines: 192.168.80.128/129/130). So I use the Windows machine as a Kafka client and the Linux machines as Kafka servers. When I try to produce messages from Windows to the remote Kafka server, the command and response are as follows:

F:\kafka_2.10-0.10.0.1\kafka_2.10-0.10.0.1\bin\windows>kafka-console-producer.bat --broker-list 192.168.80.128:9092 --topic wuchang
DADFASDF
ASDFASF
[2016-11-08 22:41:30,311] ERROR Error when sending message to topic wuchang with key: null, value: 8 bytes with error: (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
org.apache.kafka.common.errors.TimeoutException: Batch containing 2 record(s) expired due to timeout while requesting metadata from brokers for wuchang-0
[2016-11-08 22:41:30,313] ERROR Error when sending message to topic wuchang with key: null, value: 7 bytes with error: (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
org.apache.kafka.common.errors.TimeoutException: Batch containing 2 record(s) expired due to timeout while requesting metadata from brokers for wuchang-0

I am quite sure that my Kafka cluster itself is OK, because I can run the produce and consume commands directly on the Linux servers successfully.
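For reference, the commands I run locally on the Linux servers are roughly the following (a sketch from memory, not a copied transcript), and both work:

bin/kafka-console-producer.sh --broker-list localhost:9092 --topic wuchang
bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic wuchang --from-beginning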

Of course, consuming messages from the remote Kafka server also fails:

F:\kafka_2.10-0.10.0.1\kafka_2.10-0.10.0.1\bin\windows>kafka-console-consumer.bat --bootstrap-server 192.168.80.128:9092 --topic wuchang --from-beginning --zookeeper 192.168.80.128:2181
[2016-11-08 22:56:43,486] WARN Fetching topic metadata with correlation id 0 for topics [Set(wuchang)] from broker [BrokerEndPoint(1,vm02,9092)] failed (kafka.client.ClientUtils$)
java.nio.channels.ClosedChannelException
        at kafka.network.BlockingChannel.send(BlockingChannel.scala:110)
        at kafka.producer.SyncProducer.liftedTree1$1(SyncProducer.scala:80)
        at kafka.producer.SyncProducer.kafka$producer$SyncProducer$$doSend(SyncProducer.scala:79)
        at kafka.producer.SyncProducer.send(SyncProducer.scala:124)
        at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:59)
        at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:94)
        at kafka.consumer.ConsumerFetcherManager$LeaderFinderThread.doWork(ConsumerFetcherManager.scala:66)
        at kafka.utils.ShutdownableThread.run(ShutdownableThread.scala:63)

I also tried the Kafka Java API example on my Windows machine; it fails as well, without any error message. My Java code is:

package com.netease.ecom.data.connect.hdfs;


import com.twitter.bijection.Injection;
import com.twitter.bijection.avro.GenericAvroCodecs;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class SimpleAvroProducer {

    public static final String USER_SCHEMA = "{"
            + "\"type\":\"record\","
            + "\"name\":\"myrecord\","
            + "\"fields\":["
            + "  { \"name\":\"str1\", \"type\":\"string\" },"
            + "  { \"name\":\"str2\", \"type\":\"string\" },"
            + "  { \"name\":\"int1\", \"type\":\"int\" }"
            + "]}";

    public static void main(String[] args) throws InterruptedException {
        // Producer config: point at one of the remote brokers on the Linux VMs
        Properties props = new Properties();
        props.put("bootstrap.servers", "192.168.80.128:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer");

        // Parse the Avro schema and build a Bijection injection that turns
        // GenericRecords into byte[] values for the producer
        Schema.Parser parser = new Schema.Parser();
        Schema schema = parser.parse(USER_SCHEMA);
        Injection<GenericRecord, byte[]> recordInjection = GenericAvroCodecs.toBinary(schema);

        KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props);

        for (int i = 0; i < 1000; i++) {
            GenericData.Record avroRecord = new GenericData.Record(schema);
            avroRecord.put("str1", "Str 1-" + i);
            avroRecord.put("str2", "Str 2-" + i);
            avroRecord.put("int1", i);

            byte[] bytes = recordInjection.apply(avroRecord);

            // Fire-and-forget send: without a callback or Future.get(), a failed send
            // is never reported back to this code
            ProducerRecord<String, byte[]> record = new ProducerRecord<>("mytopic", bytes);
            producer.send(record);

            Thread.sleep(250);

        }

        producer.close();
    }
}

Yes, my code is meant to send Avro data to Kafka, and it too fails without reporting any errors.
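Since send() is asynchronous and I never check its result, I suspect the Java producer hits the same metadata timeout as the console producer but simply swallows it. Below is a minimal sketch (my own test idea; class name and values are made up, not verified yet) that should surface the error by blocking on the returned Future and bounding max.block.ms:

import java.util.Properties;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class ProducerErrorCheck {
    public static void main(String[] args) throws InterruptedException {
        Properties props = new Properties();
        props.put("bootstrap.servers", "192.168.80.128:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // fail fast instead of blocking for the default 60s when metadata cannot be fetched
        props.put("max.block.ms", "10000");

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        try {
            // get() makes the send synchronous and rethrows any send error,
            // e.g. the TimeoutException the console producer printed
            RecordMetadata meta = producer.send(
                    new ProducerRecord<String, String>("wuchang", "test")).get();
            System.out.println("Sent to " + meta.topic() + "-" + meta.partition()
                    + " at offset " + meta.offset());
        } catch (ExecutionException e) {
            System.err.println("Send failed: " + e.getCause());
        } finally {
            producer.close();
        }
    }
}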

As for the Kafka server.properties on my Linux machines:
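The listener-related section still looks like the stock 0.10 defaults, roughly as below (a sketch of the default file, not a verbatim copy of mine):

############################# Socket Server Settings #############################

# The address the socket server listens on; if not configured it uses the value
# returned from java.net.InetAddress.getCanonicalHostName()
#listeners=PLAINTEXT://:9092

# Hostname and port the broker will advertise to producers and consumers; if not set,
# it uses the value of "listeners" if configured, otherwise the canonical host name
#advertised.listeners=PLAINTEXT://your.host.name:9092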

  • Do you specify hostname or IP address for 'listener' setting? And, is the Linux server an IaaS machine which might need Kafka to set advertised.listeners? – amethystic Nov 08 '16 at 23:22
  • My Linux machine cluster is just three virtual machines that live in VMware Workstation installed on my Windows machine. – wuchang Nov 09 '16 at 02:57
  • The exception complains it failed to retrieve metadata for broker from the cluster, so check your listener setting to make sure it is able to connect to brokers. – amethystic Nov 09 '16 at 03:03
  • I didn't set the listeners or advertised.listeners; they are all commented out. So, should I uncomment them? Do you mean to modify the config on my Linux servers or the config on my Windows client? – wuchang Nov 09 '16 at 03:32
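To make sure I understand the suggestion: if the idea is to uncomment and set advertised.listeners on the Linux brokers, I guess it would look something like this (my guess, not tested yet; shown for 192.168.80.128, with the matching IP on each of the other brokers):

# in server.properties on each Linux broker
listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://192.168.80.128:9092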
