I am trying to write to Confluent Cloud/Kafka from Dataflow (Apache Beam), using the following:
kafkaKnowledgeGraphKVRecords.apply("Write to Kafka", KafkaIO.<String, String>write()
    .withBootstrapServers("<mybootstrapserver>.confluent.cloud:9092")
    .withTopic("testtopic")
    .withKeySerializer(StringSerializer.class)
    .withValueSerializer(StringSerializer.class)
    .withProducerConfigUpdates(props));
where props is declared as Map<String, Object> props = new HashMap<>(); and is empty for now.
In the logs, I get: send failed : 'Topic testtopic not present in metadata after 60000 ms.'
The topic does exist on this cluster, so my guess is that the problem is authentication, which would make sense because I couldn't find a way to pass the API key.
I did try various combinations of passing the API key/secret from Confluent Cloud via the props above to authenticate, but I couldn't find a working setup.
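For reference, the combinations I tried were along the lines of the standard Kafka client SASL properties (the property names below are the stock Kafka client ones; the key/secret values are placeholders, not real credentials):

```java
import java.util.HashMap;
import java.util.Map;

public class ProducerProps {
    public static Map<String, Object> build() {
        // Placeholder credentials -- substitute the real Confluent Cloud API key/secret.
        String apiKey = "<APIKEY>";
        String apiSecret = "<APISECRET>";

        Map<String, Object> props = new HashMap<>();
        // Confluent Cloud uses SASL/PLAIN over TLS; these are the standard
        // Kafka client property names for that setup.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required"
            + " username=\"" + apiKey + "\" password=\"" + apiSecret + "\";");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().get("security.protocol"));
    }
}
```

This map is what I pass to withProducerConfigUpdates(props) in the pipeline above, but the write still fails with the same metadata error.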