Spring Cloud Task allows a user to develop and run short-lived microservices using Spring Cloud, and to run them locally, in the cloud, or on Spring Cloud Data Flow. Just add @EnableTask and run your app as a Spring Boot app (single application context).
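A minimal sketch of what that description amounts to (package, class and message are illustrative): @EnableTask records each run in the TASK_EXECUTION table, and the app exits when its CommandLineRunner finishes.

    package com.example.demo;

    import org.springframework.boot.CommandLineRunner;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.cloud.task.configuration.EnableTask;
    import org.springframework.context.annotation.Bean;

    // A short-lived Spring Boot app: @EnableTask makes Spring Cloud Task record
    // the run (start time, end time, exit code) in its TASK_EXECUTION table.
    @EnableTask
    @SpringBootApplication
    public class DemoTaskApplication {

        public static void main(String[] args) {
            SpringApplication.run(DemoTaskApplication.class, args);
        }

        @Bean
        public CommandLineRunner runner() {
            // The task's actual work goes here; when it returns, the app shuts down.
            return args -> System.out.println("Hello from a Spring Cloud Task");
        }
    }

Built as a plain Boot jar, the same application can be registered and launched as a task from Spring Cloud Data Flow.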
Questions tagged [spring-cloud-task]
261 questions
1 vote · 1 answer
Duplicated port of child tasks in Spring Cloud Data Flow
When I launch a new task (Spring Batch job) using Spring Cloud Data Flow, I see that SCDF automatically initializes Tomcat on seemingly "random" ports, but I do not know whether these ports are chosen randomly or follow some rule of the framework.
Therefore, I…

DM CHAU · 67
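On the port question above, a hedged aside rather than an answer from the thread: a launched task usually does not need the embedded web server at all, so the usual ways to avoid clashing ports are to turn the web layer off or, if HTTP is genuinely needed, let Boot bind to a free port:

    # If the task does not expose HTTP endpoints, skip the web server entirely:
    spring.main.web-application-type=none
    # Otherwise, let Spring Boot pick a free random port for each launched task:
    # server.port=0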
1 vote · 2 answers
How to get the execution id of a restarted job in Spring Cloud Data Flow
We have created a Spring Batch job to be executed in Spring Cloud Data Flow through Spring Cloud Task (a simple task, it only executes the job). The execution has been checked both with the UI and the REST API, and everything is OK in an ideal case.…

RLS · 65
1 vote · 1 answer
DeployerPartitionHandler class throwing null pointer exception for multiple remote partition steps
I am using Spring Batch with Spring Cloud Task to automate data read/write (read from a file and store in MongoDB). In my use case I have 2 steps (and will add 1 more step once those 2 work). I am trying to use remote partitioning to integrate Spring…

bhasky · 73
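For context on the setup described above, a hedged sketch of wiring a DeployerPartitionHandler for a single partitioned step, loosely modelled on the Spring Cloud Task partitioned-batch-job sample; the artifact coordinates, step name and worker count are illustrative, and the exact constructor arguments differ between Spring Cloud Task versions. With several partitioned steps, each step generally needs its own handler instance configured with that step's worker step name.

    package com.example.partition;

    import org.springframework.batch.core.explore.JobExplorer;
    import org.springframework.cloud.deployer.spi.task.TaskLauncher;
    import org.springframework.cloud.task.batch.partition.DeployerPartitionHandler;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.core.io.Resource;
    import org.springframework.core.io.ResourceLoader;

    @Configuration
    public class PartitionHandlerConfig {

        // One handler per partitioned step; "workerStep" must match the step
        // that the worker application runs for each partition.
        @Bean
        public DeployerPartitionHandler partitionHandler(TaskLauncher taskLauncher,
                                                         JobExplorer jobExplorer,
                                                         ResourceLoader resourceLoader) {
            // The worker app launched for each partition (illustrative coordinates).
            Resource worker = resourceLoader
                    .getResource("maven://com.example:partition-worker:0.0.1-SNAPSHOT");

            DeployerPartitionHandler handler =
                    new DeployerPartitionHandler(taskLauncher, jobExplorer, worker, "workerStep");
            handler.setMaxWorkers(2);
            handler.setApplicationName("partitioned-job-task");
            return handler;
        }
    }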
1 vote · 1 answer
Spring Batch test fails after completion because context is not active
I have created a simple job in Spring Batch with Spring Boot to be executed as a task with Spring Cloud Task (all in STS4). If I execute it as a Spring Boot app, the execution is correct and without problems, but if I compile the project or try to…

RLS · 65
1 vote · 1 answer
How should I slice and orchestrate a configurable batch network using Spring Batch and Spring Cloud Data Flow?
We would like to migrate the scheduling and sequence control of some Kettle import jobs from a proprietary implementation to a good-practice Spring Batch implementation.
I intend to use a Spring Cloud Data Flow (SCDF) server to implement and…

leo · 3,528
1 vote · 1 answer
Composed task custom condition
Is it possible to create a custom exit status in Spring Cloud Data Flow?
Let's say I have the following:
I saw examples for FAILED and UNKNOWN, so I've created 2 custom conditions, Worked & Generated.
Assuming this approach is possible, how do I…

Fima Taf · 929
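On custom exit statuses in a composed task, a hedged sketch of the definition syntax (task names and labels are illustrative, not taken from the question); each quoted label routes on the exit status reported by the preceding task, with '*' as the wildcard fallback:

    dataflow:> task create my-composed-task --definition "check 'Worked' -> handle-worked 'Generated' -> handle-generated '*' -> fallback"

For a label such as 'Worked' ever to match, the launched task has to actually report that value as its exit status (typically by setting the task execution's exit message); otherwise only the default COMPLETED and FAILED statuses are seen.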
1 vote · 1 answer
Scheduled task in SCDF with Kubernetes persistent volume
I'm trying to run a Task in SCDF on minikube. This task extracts data from a database and writes it to a file. So I use a local persistent volume that I configured in the Kubernetes Dashboard to retrieve my file.
It works fine when I run a simple execution…

CEDDM · 19
1 vote · 0 answers
SCDF: Restart and resume a composed task
SCDF Composed Task Runner gives us the option to turn on the --increment-instance-enabled. This option creates an artificial run.id parameter, which increments for every run. Therefore the task is unique for Spring Batch and will restart.
The…

Daniel Yu · 11
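A hedged sketch of how the flag described above is typically passed when launching from the SCDF shell (the composed task name is illustrative):

    # Adds an incrementing run.id job parameter so Spring Batch treats each run as new:
    dataflow:> task launch my-composed-task --arguments "--increment-instance-enabled=true"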
1 vote · 1 answer
Running Spring Cloud Dataflow Task in REST with arguments
I need to run a Spring Cloud Data Flow Task using a REST call. In addition, I would like to pass an argument through REST. I'm browsing the sample paths on the localhost:9393 Data Flow server, but I don't see how you can run the ribbon along with the…

xampo · 369
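For the REST question above, a hedged sketch against the Data Flow REST API (task name and argument are illustrative): a task execution is started by POSTing to /tasks/executions with the definition name and any command-line arguments:

    curl -X POST "http://localhost:9393/tasks/executions" \
         -d "name=my-task" \
         --data-urlencode "arguments=--my.param=value"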
1 vote · 0 answers
How to fix "Failed to process @BeforeTask or @AfterTask annotation because: Task with name 'application-1' is already running"
I have a Spring Cloud Task application set up to use spring.cloud.task.single-instance-enabled=true. When using this option, a lock record is created in the TASK_LOCK repository table and my task completes successfully. This lock record remains…

Charlie · 11
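The setup the question above describes boils down to two properties; a hedged sketch (the application name is illustrative). With single-instance-enabled, Spring Cloud Task acquires a lock keyed by the task name (backed by the TASK_LOCK table) before running, which is why a leftover lock record blocks the next execution:

    spring.application.name=my-single-instance-task
    spring.cloud.task.single-instance-enabled=true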
1 vote · 1 answer
Passing parameters between Spring-cloud-dataflow composite task
I built a composite task in spring-cloud-dataflow.
It is working.
Now I need to pass an output of my first task (FirstCloudTask) to the second task (SUCCESS_TASK).
How can I pass an input parameter to my second task on completion of my first task?…

Sivaraj Velayutham · 187
1 vote · 1 answer
Spring Boot application start error when trying to configure a separate datasource for the Spring Batch schema and the application
Our application uses Oracle 11.2 as its database. Because we did not want to mix the Spring Batch metadata tables with the regular application ones, we got a new schema created. But when trying to configure the two separate datasources, we keep getting the error below:
…

Arpit S · 137
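On the two-datasource setup above, a hedged sketch rather than the asker's code (property prefixes and names are illustrative; it assumes Spring Boot 2 with Spring Batch's DefaultBatchConfigurer): the application datasource is marked @Primary, while the metadata schema gets its own datasource that is handed explicitly to Spring Batch and Spring Cloud Task:

    package com.example.config;

    import javax.sql.DataSource;

    import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
    import org.springframework.beans.factory.annotation.Qualifier;
    import org.springframework.boot.context.properties.ConfigurationProperties;
    import org.springframework.boot.jdbc.DataSourceBuilder;
    import org.springframework.cloud.task.configuration.DefaultTaskConfigurer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.context.annotation.Primary;

    @Configuration
    public class TwoDataSourceConfig {

        // Regular application data; @Primary so ordinary injection points get this one.
        @Bean
        @Primary
        @ConfigurationProperties("app.datasource")
        public DataSource appDataSource() {
            return DataSourceBuilder.create().build();
        }

        // Separate schema holding the Spring Batch / Spring Cloud Task metadata tables.
        @Bean
        @ConfigurationProperties("batch.datasource")
        public DataSource batchDataSource() {
            return DataSourceBuilder.create().build();
        }

        // Point the Spring Batch JobRepository at the metadata datasource.
        @Bean
        public DefaultBatchConfigurer batchConfigurer(@Qualifier("batchDataSource") DataSource batchDataSource) {
            return new DefaultBatchConfigurer(batchDataSource);
        }

        // Point the Spring Cloud Task repository at the same metadata datasource.
        @Bean
        public DefaultTaskConfigurer taskConfigurer(@Qualifier("batchDataSource") DataSource batchDataSource) {
            return new DefaultTaskConfigurer(batchDataSource);
        }
    }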
1 vote · 1 answer
How to implement a multi-tenant database for spring cloud data flow
We would like to implement a multi-tenant solution for SCDF for which each tenant may have unique task definitions / etc. Ideally we only want a single SCDF server (as opposed to setting up an SCDF server for each tenant), as pictured:
Is this…

GaZ · 2,346
1 vote · 2 answers
Spring Cloud Task deployed on PCF fails to exit on TaskExecution end
I am trying to host a Spring Cloud Task app on PCF and run the task hourly using the PCF Scheduler's CRON jobs. However, as part of the task, I have to publish a message onto a RabbitMQ exchange. The RabbitMQ instance is a RabbitMQ on PCF service which…

sohamangoes · 129
1 vote · 2 answers
How can Spring Cloud Dataflow Server use new tables (with custom prefix) created for Spring Batch and Spring Cloud Task?
I have created the Spring Cloud Task tables (i.e. TASK_EXECUTION, TASK_TASK_BATCH) with the prefix MYTASK_ and the Spring Batch tables with the prefix MYBATCH_ in an Oracle database.
The default tables are also there in the same schema, which got created…

DipikaV · 11
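On the custom prefixes above, the application-side half is usually plain configuration; a hedged sketch (prefixes taken from the question, and the exact Batch property name depends on the Spring Boot version in use). Whether the Data Flow server itself can be pointed at the prefixed tables is the harder half of the question and is not covered here:

    # Spring Cloud Task metadata tables:
    spring.cloud.task.table-prefix=MYTASK_
    # Spring Batch metadata tables (older Boot versions):
    spring.batch.table-prefix=MYBATCH_
    # On newer Boot versions the Batch prefix moved under spring.batch.jdbc:
    # spring.batch.jdbc.table-prefix=MYBATCH_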