
I have a small Spring Batch application that I deploy locally in SCDF. It works fine so far: task executions run and the metadata tables are filled, except for one, `TASK_TASK_BATCH`! I see the log output of the `TaskBatchExecutionListener` ("The job execution [id] was run within the task execution [id]"), but no records are written by `taskBatchDao.saveRelationship`.

So I wrote my own job listener and inserted the records myself via a repository, in a transaction to which I explicitly assign the transaction manager of the metadata source (roughly like the sketch below). But this cannot be the solution.

Thanks in advance
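For reference, a minimal sketch of such a workaround (not my actual code) could look like the following. The bean qualifiers `metadataDataSource` and `metadataTransactionManager`, the class name, and the listener wiring are assumptions for illustration; the listener would still need to be registered on the job.

```java
// Sketch of a manual workaround; names marked "assumption" are illustrative only.
import javax.sql.DataSource;

import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.cloud.task.listener.TaskExecutionListener;
import org.springframework.cloud.task.repository.TaskExecution;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Component;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.support.TransactionTemplate;

@Component
public class TaskBatchRelationshipListener implements TaskExecutionListener, JobExecutionListener {

    private final JdbcTemplate jdbcTemplate;
    private final TransactionTemplate transactionTemplate;
    private Long taskExecutionId;

    public TaskBatchRelationshipListener(
            @Qualifier("metadataDataSource") DataSource dataSource,                           // assumption
            @Qualifier("metadataTransactionManager") PlatformTransactionManager txManager) {  // assumption
        this.jdbcTemplate = new JdbcTemplate(dataSource);
        // Bind the write explicitly to the metadata source's transaction manager.
        this.transactionTemplate = new TransactionTemplate(txManager);
    }

    @Override
    public void onTaskStartup(TaskExecution taskExecution) {
        // Remember the current task execution id for the job callback below.
        this.taskExecutionId = taskExecution.getExecutionId();
    }

    @Override
    public void onTaskEnd(TaskExecution taskExecution) { }

    @Override
    public void onTaskFailed(TaskExecution taskExecution, Throwable throwable) { }

    @Override
    public void beforeJob(JobExecution jobExecution) {
        // Write the TASK_TASK_BATCH relationship row ourselves.
        transactionTemplate.executeWithoutResult(status -> jdbcTemplate.update(
                "INSERT INTO TASK_TASK_BATCH (TASK_EXECUTION_ID, JOB_EXECUTION_ID) VALUES (?, ?)",
                taskExecutionId, jobExecution.getId()));
    }

    @Override
    public void afterJob(JobExecution jobExecution) { }
}
```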

1 Answer


You need to expose your Job as a bean. Then the TaskBatchExecutionListenerBeanPostProcessor will add the TaskBatchExecutionListener as a job listener to your job.
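For example, a minimal sketch, assuming Spring Batch 4.x with `@EnableBatchProcessing` and a `Step` bean defined elsewhere (`myJob` and `myStep` are placeholder names):

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableTask
@EnableBatchProcessing
public class JobConfiguration {

    // Because the Job is exposed as a bean, TaskBatchExecutionListenerBeanPostProcessor
    // can find it and register the TaskBatchExecutionListener automatically.
    @Bean
    public Job myJob(JobBuilderFactory jobs, Step myStep) {
        return jobs.get("myJob")
                .start(myStep)
                .build();
    }
}
```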

No other feature of Spring Batch or Spring Cloud Task requires each job to be a bean. But if a job isn't one, you'll need to add the TaskBatchExecutionListener to it yourself.

If you have the artifact spring-cloud-task-batch on the classpath, and use the annotation @EnableTask, then your application context contains a TaskBatchExecutionListener bean that you can autowire.
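For example (a sketch; `assembleJob` and `manualJob` are hypothetical names), a job that is built outside the container could have the autowired listener attached explicitly:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.cloud.task.batch.listener.TaskBatchExecutionListener;

public class ManualJobAssembly {

    // taskBatchListener is the TaskBatchExecutionListener bean autowired
    // from the application context (provided by spring-cloud-task-batch).
    public Job assembleJob(JobBuilderFactory jobs, Step step,
                           TaskBatchExecutionListener taskBatchListener) {
        return jobs.get("manualJob")
                .start(step)
                // No bean post-processing happens here, so attach the listener
                // manually; it then saves the TASK_TASK_BATCH relationship.
                .listener(taskBatchListener)
                .build();
    }
}
```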

  • Thank you, I will definitely check again. However, as I said, the `org.springframework.cloud.task.batch.listener.TaskBatchExecutionListener` logs that it has a task id and a job id; it's only the `TaskBatchDao` in the listener that does not write to the table. – dgerhardt Jul 05 '22 at 13:27
  • Okay. If the listener is indeed registered, then the problem is something else. If you debug it, is the `TaskBatchDao` in the `TaskBatchExecutionListener` of type `JdbcTaskBatchDao`? Or is it of type `MapTaskBatchDao`? – Henning Jul 05 '22 at 16:14