
I am trying to run a Flink application on a cluster. The application deployed successfully, and I can see that the JobManager and TaskManager are running and that resource registration completed successfully.

The application needs a dummy event; that part works fine, and the SQL query also returns results (executed by the TaskManager).

Problem: the application receives events from the message hub, but these messages stay with the JobManager only and never reach the TaskManager for execution in the (standalone) cluster. It works fine when run from IntelliJ.

I understand that with Kafka, addSource(new FlinkKafkaConsumer<>(...)) would work, but in my case I have to register a callback and the messages arrive via onEvent.
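For illustration, the consumption pattern looks roughly like this (MessageHubClient and MessageListener are placeholder names standing in for the actual client library, not real class names):

```java
// Rough illustration only - MessageHubClient and MessageListener are
// placeholders for the real message-hub SDK.
MessageHubClient client = new MessageHubClient(connectionConfig);

client.registerListener(new MessageListener() {
    @Override
    public void onEvent(String event) {
        // events arrive here, in whichever JVM registered the callback
    }
});
```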

Is there any mechanism that can forward events from the JobManager to the TaskManager in a cluster?

Ashutosh

1 Answer


Events should not be processed on the JobManager at all. A possible reason for that behavior is that you start the application with a local executor. Could you double-check how you create your stream environment?
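For reference, a rough sketch of the difference (assuming the jar is packaged and submitted with `flink run`); only the first variant ends up deploying operators to the TaskManagers:

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// Picks up the cluster context when the jar is submitted with `flink run`;
// operators are then deployed to the TaskManagers.
final StreamExecutionEnvironment env =
        StreamExecutionEnvironment.getExecutionEnvironment();

// By contrast, this always spins up a mini cluster inside the submitting JVM,
// so nothing ever reaches the standalone cluster's TaskManagers:
// final StreamExecutionEnvironment localEnv =
//         StreamExecutionEnvironment.createLocalEnvironment();
```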

Arvid Heise
  • Thanks. I am using "final StreamExecutionEnvironment executionEnvironment = StreamExecutionEnvironment.getExecutionEnvironment()". To receive events I have to configure an application listener, and I think that listener becomes part of the JobManager (just guessing). – Ashutosh Dec 03 '20 at 09:15
  • @Ashutosh, could you please explain, if possible, why you need to register a callback and how you consume the events? Thank you. – Mikalai Lushchytski Dec 03 '20 at 09:51
  • It feels like you don't use a proper source but some custom solution that only consumes events while creating the DAG. It probably only works with the local executor. – Arvid Heise Dec 03 '20 at 10:17
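
Following up on that last comment, one possible fix is to move the callback registration into a proper Flink source, so the subscription happens when the source is opened on a TaskManager rather than on the client while the DAG is built. A minimal sketch, assuming a hypothetical MessageHubClient with a registerListener callback (adapt to the real message-hub SDK):

```java
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.source.RichSourceFunction;

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Sketch: wrap the callback-based client in a Flink source so that events
// are consumed on the TaskManager and emitted into the stream.
public class MessageHubSource extends RichSourceFunction<String> {

    private volatile boolean running = true;
    private transient BlockingQueue<String> queue;
    private transient MessageHubClient client;   // hypothetical client type

    @Override
    public void open(Configuration parameters) {
        queue = new LinkedBlockingQueue<>();
        client = new MessageHubClient(/* connection config */);
        // The callback only hands events over to the queue.
        client.registerListener(event -> queue.offer(event));
    }

    @Override
    public void run(SourceContext<String> ctx) throws Exception {
        while (running) {
            String event = queue.poll(100, TimeUnit.MILLISECONDS);
            if (event != null) {
                synchronized (ctx.getCheckpointLock()) {
                    ctx.collect(event);
                }
            }
        }
    }

    @Override
    public void cancel() {
        running = false;
    }

    @Override
    public void close() throws Exception {
        if (client != null) {
            client.close();
        }
    }
}
```

The job would then wire it in with env.addSource(new MessageHubSource()), so the events are produced on a TaskManager and flow through the SQL query like any other stream. This is a sketch, not a drop-in implementation.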