
I wrote a Flink batch job in Flink 1.11.1. After the job finishes successfully, I want to do something like calling an HTTP service.

I added a simple job listener to hook into job status changes. The problem is that when the Kafka sink operator throws an error, the job listener is not triggered. I expect that when my job fails, it should trigger my job listener and print a failure log.

How can I be sure whether the job finished successfully or not?

Any help will be appreciated.

import org.apache.flink.api.common.JobExecutionResult
import org.apache.flink.core.execution.{JobClient, JobListener}
import org.apache.flink.streaming.api.scala._

val env = StreamExecutionEnvironment.getExecutionEnvironment

env.registerJobListener(new JobListener {
  // Called once the job has been submitted (throwable is non-null if submission failed).
  override def onJobSubmitted(jobClient: JobClient, throwable: Throwable): Unit = {
    if (throwable == null) {
      log.info("SUCCESS")
    } else {
      log.info("FAIL")
    }
  }

  // Called once the execution result is available (throwable is non-null if the job failed).
  override def onJobExecuted(jobExecutionResult: JobExecutionResult, throwable: Throwable): Unit = {
    if (throwable == null) {
      log.info("SUCCESS")
    } else {
      log.info("FAIL")
    }
  }
})

env.createInput(input)
  .filter(r => Option(r.token).getOrElse("").nonEmpty)
  .addSink(kafkaProducer)
mstzn

1 Answer


If you run the job on a cluster, you can view your logger messages and stdout in the Flink web UI under your job id.

The default URL is http://localhost:8081 if you run on a local cluster.

That said, the snippet below is not the correct approach to check whether your job succeeded:

if (throwable == null) {
  log.info("SUCCESS")
} else {
  log.info("FAIL")
}
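A more direct way to detect success or failure from the driver program is to rely on the fact that, in attached mode, env.execute() blocks until the job finishes and throws an exception if the job fails. Here is a minimal sketch of that idea; the HTTP notification is only a placeholder, and notifyService is a hypothetical helper, not part of any Flink API:

import scala.util.{Failure, Success, Try}

// execute() blocks until the job finishes (attached mode) and throws if it fails.
Try(env.execute("my-batch-job")) match {
  case Success(result) =>
    log.info(s"SUCCESS, net runtime: ${result.getNetRuntime} ms")
    // notifyService("success")  // hypothetical HTTP call
  case Failure(e) =>
    log.error(s"FAIL: ${e.getMessage}")
    // notifyService("fail")     // hypothetical HTTP call
}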


prostý člověk
  • I want to get job status in the job itself – mstzn Aug 28 '20 at 11:06
  • Then you need to implement the runtime ```webmonitor```; please refer to https://ci.apache.org/projects/flink/flink-docs-release-1.11/api/java/org/apache/flink/runtime/messages/webmonitor/JobDetails.html – prostý člověk Aug 28 '20 at 11:10
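Following up on that pointer, here is a minimal sketch of polling Flink's monitoring REST API for the job state. It assumes a local cluster at http://localhost:8081; the job id is hypothetical, and in practice you could obtain it from the JobClient returned by env.executeAsync():

import scala.io.Source

// Hypothetical job id; fill in your own (e.g. from JobClient.getJobID).
val jobId = "<your-job-id>"

// The monitoring REST API exposes job details under /jobs/<jobid>,
// including a "state" field (e.g. RUNNING, FINISHED, FAILED).
val json = Source.fromURL(s"http://localhost:8081/jobs/$jobId").mkString

// Crude string check for illustration; a real client should parse the JSON.
if (json.contains("\"state\":\"FINISHED\"")) log.info("SUCCESS")
else log.info("NOT FINISHED (still running, or failed)")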