I'm trying to run this simple example where data from a Kafka topic are filtered out: https://www.talend.com/blog/2018/08/07/developing-data-processing-job-using-apache-beam-streaming-pipeline/
I have a similar setup with a localhost broker on default settings, but I can't even read from the topic.
When I run the application, it gets stuck in an infinite loop and nothing happens. I've also tried giving a gibberish URL for my broker to see if it's even able to reach it; it isn't. The cluster is up and running, and I'm able to add messages to the topic. Here is where I specify the broker and the topic:
pipeline
    .apply(
        KafkaIO.<Long, String>read()
            .withBootstrapServers("localhost:9092")
            .withTopic("BEAM_IN")
            .withKeyDeserializer(LongDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
    )
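For context, the rest of my pipeline mirrors the blog post. A minimal end-to-end sketch of what I'm running looks roughly like this (the filtering step is omitted; the BEAM_OUT topic name and the withMaxNumRecords bound are placeholders I added so the otherwise-unbounded Kafka source can finish during a test run):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Values;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class KafkaReadSketch {
  public static void main(String[] args) {
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    Pipeline pipeline = Pipeline.create(options);

    pipeline
        .apply(
            KafkaIO.<Long, String>read()
                .withBootstrapServers("localhost:9092")
                .withTopic("BEAM_IN")
                .withKeyDeserializer(LongDeserializer.class)
                .withValueDeserializer(StringDeserializer.class)
                // Bound the unbounded source so the DirectRunner can
                // actually terminate while testing.
                .withMaxNumRecords(10)
                // Drop Kafka metadata, keeping plain KV<Long, String>.
                .withoutMetadata())
        // Keep only the message values.
        .apply(Values.<String>create())
        .apply(
            KafkaIO.<Void, String>write()
                .withBootstrapServers("localhost:9092")
                .withTopic("BEAM_OUT")
                .withValueSerializer(StringSerializer.class)
                // Write values only, with no keys.
                .values());

    pipeline.run().waitUntilFinish();
  }
}
```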
I don't see any errors, and nothing is written to the output topic.
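To double-check the broker outside of Beam, I can produce and consume on the topic with the stock console tools (paths assume a standard Kafka distribution), and that works:

```shell
# Produce a couple of test messages to the input topic.
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic BEAM_IN

# In another terminal, confirm they are readable from the beginning.
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
    --topic BEAM_IN --from-beginning
```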
When debugging, I see it's stuck in this loop:
while (Instant.now().isBefore(completionTime)) {
    ExecutorServiceParallelExecutor.VisibleExecutorUpdate update =
        this.visibleUpdates.tryNext(Duration.millis(25L));
    if (update == null && ((State) this.pipelineState.get()).isTerminal()) {
        return (State) this.pipelineState.get();
    }
    if (update != null) {
        if (this.isTerminalStateUpdate(update)) {
            return (State) this.pipelineState.get();
        }
        if (update.thrown.isPresent()) {
            Throwable thrown = (Throwable) update.thrown.get();
            if (thrown instanceof Exception) {
                throw (Exception) thrown;
            }
            if (thrown instanceof Error) {
                throw (Error) thrown;
            }
            throw new Exception("Unknown Type of Throwable", thrown);
        }
    }
}
This is in the isKeyed(PValue pvalue) method of the ExecutorServiceParallelExecutor class.
What am I missing?