
I am trying to write to Confluent Cloud/Kafka from Dataflow (Apache Beam), using the following:

kafkaKnowledgeGraphKVRecords.apply("Write to Kafka", KafkaIO.<String, String>write()
                                .withBootstrapServers("<mybootstrapserver>.confluent.cloud:9092")
                                .withTopic("testtopic").withKeySerializer(StringSerializer.class)
                                .withProducerConfigUpdates(props).withValueSerializer(StringSerializer.class));

where Map<String, Object> props = new HashMap<>(); (i.e. empty for now)

In the logs, I get: send failed : 'Topic testtopic not present in metadata after 60000 ms.'

The topic does exist on this cluster - so my guess is that there is an issue with login, which makes sense as I couldn't find a way to pass the APIKey.

I did try various combinations to pass the APIKey/Secret I have from Confluent Cloud to auth with the props above but I couldn't find a working setup.

Pinguin Dirk
  • "I did try various combinations to pass the APIKey/Secret" -> can you update your question to include these please – Robin Moffatt Jan 14 '20 at 14:59
  • https://stackoverflow.com/questions/53939658/cant-read-from-kafka-by-kafkaio-in-beam?rq=1 shows an example of connecting to Confluent Cloud from Beam - it's as a Consumer so you'd need to change it for the appropriate Producer config, but the properties should be the same – Robin Moffatt Jan 14 '20 at 15:01
  • thanks @RobinMoffatt - I tried the parameters similar to what linked to in the other answer - maybe I mixed up something. I will try tomorrow with the linked answer and report back with feedback here. Thanks already! – Pinguin Dirk Jan 14 '20 at 17:54
  • @RobinMoffatt thanks again for the pointers, I found a solution, see below – Pinguin Dirk Jan 17 '20 at 09:00

1 Answer


Found a solution, thanks to the pointers from @RobinMoffatt in the comments under the question.

Here's the setup I have now:

Map<String, Object> props = new HashMap<>();

props.put("ssl.endpoint.identification.algorithm", "https");
props.put("sasl.mechanism", "PLAIN");
props.put("request.timeout.ms", 20000);
props.put("retry.backoff.ms", 500);
props.put("sasl.jaas.config","org.apache.kafka.common.security.plain.PlainLoginModule required username=\"<APIKEY>\" password=\"<SECRET>\";");
props.put("security.protocol", "SASL_SSL");

kafkaKnowledgeGraphKVRecords.apply("Write to Kafka-TESTTOPIC", KafkaIO.<String, String>write()
    .withBootstrapServers("<CLUSTER>.confluent.cloud:9092")
    .withTopic("test").withKeySerializer(StringSerializer.class)
    .withProducerConfigUpdates(props).withValueSerializer(StringSerializer.class));

The key line I had wrong was sasl.jaas.config (note the ; at the end of the value!).
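Since a missing trailing semicolon in the JAAS string fails only at runtime, it can help to build the config in a small helper and check the format up front. A minimal sketch (`confluentProps` is a hypothetical helper; `<APIKEY>`/`<SECRET>` are placeholders for your Confluent Cloud credentials):

```java
import java.util.HashMap;
import java.util.Map;

public class ConfluentProps {

    // Builds the producer config updates for Confluent Cloud SASL_SSL auth.
    // The trailing ';' on the sasl.jaas.config value is required by the JAAS parser.
    static Map<String, Object> confluentProps(String apiKey, String secret) {
        Map<String, Object> props = new HashMap<>();
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("ssl.endpoint.identification.algorithm", "https");
        props.put("sasl.jaas.config", String.format(
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"%s\" password=\"%s\";", apiKey, secret));
        return props;
    }

    public static void main(String[] args) {
        Map<String, Object> props = confluentProps("<APIKEY>", "<SECRET>");
        String jaas = (String) props.get("sasl.jaas.config");
        // Verify the critical trailing semicolon before handing props to KafkaIO
        System.out.println(jaas.endsWith(";"));
    }
}
```

The returned map can then be passed to `.withProducerConfigUpdates(props)` exactly as in the pipeline above.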

Pinguin Dirk