Questions tagged [spring-cloud-task]

Spring Cloud Task allows a user to develop and run short-lived microservices using Spring Cloud, locally, in the cloud, or even on Spring Cloud Data Flow. Just add @EnableTask and run your app as a Spring Boot app (single application context).
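
A task in this sense is nothing more than a Boot application with the annotation and something to run. A minimal sketch (class and bean names are illustrative):

    import org.springframework.boot.CommandLineRunner;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.cloud.task.configuration.EnableTask;
    import org.springframework.context.annotation.Bean;

    @EnableTask                 // records start, end and exit code in the TASK_* tables
    @SpringBootApplication
    public class SampleTaskApplication {

        // The task's work: runs once, then the application winds down.
        @Bean
        public CommandLineRunner work() {
            return args -> System.out.println("Hello from a Spring Cloud Task");
        }

        public static void main(String[] args) {
            SpringApplication.run(SampleTaskApplication.class, args);
        }
    }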

261 questions
0
votes
1 answer

Spring Cloud Data Flow : Value too long for column "TASK_NAME" (DeployerPartitionHandler with Spring Batch)

I have a simple Spring Batch job running on Kubernetes as a Spring Cloud Task. This job uses Spring Batch partitioning to launch the partitioned steps as further task pods on the same Kubernetes cluster. Main job (relevant parts): @Bean public Job…
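
For context, a manager-side configuration along the lines of the partitioned-batch-job sample is sketched below, assuming spring-cloud-task 2.x; the Docker image, step name and worker count are placeholders, and the over-long TASK_NAME presumably comes from the names generated for the worker tasks, which the application name set here feeds into.

    import org.springframework.batch.core.explore.JobExplorer;
    import org.springframework.batch.core.partition.PartitionHandler;
    import org.springframework.cloud.deployer.resource.docker.DockerResource;
    import org.springframework.cloud.deployer.spi.task.TaskLauncher;
    import org.springframework.cloud.task.batch.partition.DeployerPartitionHandler;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.core.io.Resource;

    @Configuration
    public class PartitionConfig {

        @Bean
        public PartitionHandler partitionHandler(TaskLauncher taskLauncher, JobExplorer jobExplorer) {
            // Placeholder image; in the real project this points at the same jar/image as the job itself.
            Resource workerResource = new DockerResource("my-registry/partitioned-job:latest");

            DeployerPartitionHandler handler =
                    new DeployerPartitionHandler(taskLauncher, jobExplorer, workerResource, "workerStep");
            handler.setMaxWorkers(4);
            // A short, explicit application name is one thing to check when TASK_NAME overflows.
            handler.setApplicationName("partition-worker");
            return handler;
        }
    }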
0
votes
1 answer

Spring Cloud Data Flow : Unable to launch multiple instances of the same Task

TL;DR Spring Cloud Data Flow does not allow multiple executions of the same Task even though the documentation says that this is the default behavior. How can we allow SCDF to run multiple instances of the same task at the same time using the Java…
Chetan Kinger • 15,069 • 6 • 45 • 82
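
For reference, launching the same definition twice from Java might look roughly like the sketch below, assuming the SCDF REST client (DataFlowTemplate), a server at http://localhost:9393 and a task definition named my-task; the exact launch(...) signature and return type vary between client versions, and whether both launches are accepted depends on the server's concurrent-task limit and the task's single-instance flag.

    import java.net.URI;
    import java.util.Collections;

    import org.springframework.cloud.dataflow.rest.client.DataFlowTemplate;

    public class LaunchTwice {

        public static void main(String[] args) {
            // Points at the Data Flow server; URL and task name are placeholders.
            DataFlowTemplate dataFlow = new DataFlowTemplate(URI.create("http://localhost:9393"));

            // Two launch requests for the same task definition.
            dataFlow.taskOperations().launch("my-task", Collections.emptyMap(), Collections.emptyList());
            dataFlow.taskOperations().launch("my-task", Collections.emptyMap(), Collections.emptyList());
        }
    }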
0
votes
0 answers

SCDF + Spring Batch : JobInstanceAlreadyCompleteException: A job instance already exists and is complete for parameters={run.id=1}

SCDF is launching the job as: Command to be executed: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.252.b09-2.el7_8.x86_64/jre/bin/java -jar /opt/scdf-batches/foo/foo-1.0.26.jar --spring.cloud.task.executionid=23 Error: java.lang.IllegalStateException: Failed…
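
The parameters={run.id=1} in the error means every launch reuses the same identifying job parameters. One common remedy, sketched below with placeholder job and step names, is a RunIdIncrementer so each launch creates a fresh JobInstance; passing a unique job argument from SCDF on every launch achieves the same thing.

    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
    import org.springframework.batch.core.launch.support.RunIdIncrementer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class JobConfig {

        @Bean
        public Job fooJob(JobBuilderFactory jobs, Step fooStep) {
            return jobs.get("fooJob")
                    .incrementer(new RunIdIncrementer())   // new run.id per launch -> new JobInstance
                    .start(fooStep)
                    .build();
        }
    }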
0
votes
1 answer

Non-option arguments with Spring Cloud Data Flow

I'm trying to run an application on Spring Cloud Data Flow that requires a non-option argument (e.g. the command would be java -jar task.jar non-option-argument --optionArgument=option etc.) The dashboard validation seems to prevent this from being…
bmarcj • 51 • 2 • 6
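
Inside the task itself, Boot exposes both kinds of arguments through ApplicationArguments, so one workaround sometimes suggested is to accept an option argument from the dashboard and treat it in the app as the non-option value. A sketch with made-up argument names:

    import org.springframework.boot.ApplicationArguments;
    import org.springframework.boot.ApplicationRunner;
    import org.springframework.stereotype.Component;

    @Component
    public class ArgsInspector implements ApplicationRunner {

        @Override
        public void run(ApplicationArguments args) {
            // Anything passed without a leading "--", e.g. "non-option-argument"
            System.out.println("non-option args: " + args.getNonOptionArgs());

            // "--optionArgument=option" shows up here instead
            System.out.println("optionArgument: " + args.getOptionValues("optionArgument"));
        }
    }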
0
votes
1 answer

How to use a static Spring Cloud Stream URL for launching Spring Cloud Tasks?

Platform used: Kubernetes. I have an issue with the Spring Cloud Stream URL. I am launching my Spring Cloud Tasks using Spring Cloud Stream. The streams are deployed on Kubernetes. The stream contains http-kafka as the source and taskLauncerKafka as the sink.…
0
votes
0 answers

Consider defining a bean of type 'org.springframework.transaction.jta.JtaTransactionManager' in your configuration

I am developing a Spring Cloud Task application using the Spring Cloud Task and Spring Batch libraries. I also created a Spring Boot library containing the Spring Cloud Task and Spring Batch configurations and added this library as a dependency of the Spring…
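
The message usually means some injection point in the shared configuration asks specifically for a JtaTransactionManager that nothing auto-configures. Assuming a single JDBC DataSource and no real JTA requirement, one sketch of a fix is to type the injection points against PlatformTransactionManager and supply a plain JDBC transaction manager yourself:

    import javax.sql.DataSource;

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.jdbc.datasource.DataSourceTransactionManager;
    import org.springframework.transaction.PlatformTransactionManager;

    @Configuration
    public class TransactionConfig {

        // Plain JDBC transaction manager for the single DataSource; no JTA involved.
        @Bean
        public PlatformTransactionManager transactionManager(DataSource dataSource) {
            return new DataSourceTransactionManager(dataSource);
        }
    }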
0
votes
1 answer

Automatic job restart

I have a job that can take up to several hours. It is possible that for some reason (like out of memory, or cluster rebalance) it just fails. The problem is that the job is usually run overnight, and someone needs to check on it in the morning, and…
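
One pattern sometimes used for this, sketched below with a hypothetical job name "myJob", is a small runner in the same task that looks up the most recent FAILED execution via JobExplorer and hands it to JobOperator.restart(...), so simply re-launching the task resumes the failed run without manual intervention:

    import java.util.List;

    import org.springframework.batch.core.BatchStatus;
    import org.springframework.batch.core.JobExecution;
    import org.springframework.batch.core.JobInstance;
    import org.springframework.batch.core.explore.JobExplorer;
    import org.springframework.batch.core.launch.JobOperator;
    import org.springframework.boot.ApplicationArguments;
    import org.springframework.boot.ApplicationRunner;
    import org.springframework.stereotype.Component;

    @Component
    public class FailedJobRestarter implements ApplicationRunner {

        private final JobExplorer jobExplorer;
        private final JobOperator jobOperator;

        public FailedJobRestarter(JobExplorer jobExplorer, JobOperator jobOperator) {
            this.jobExplorer = jobExplorer;
            this.jobOperator = jobOperator;
        }

        @Override
        public void run(ApplicationArguments args) throws Exception {
            // "myJob" is a placeholder for the real job name.
            List<JobInstance> instances = jobExplorer.getJobInstances("myJob", 0, 1);
            if (instances.isEmpty()) {
                return; // nothing to restart
            }
            for (JobExecution execution : jobExplorer.getJobExecutions(instances.get(0))) {
                if (execution.getStatus() == BatchStatus.FAILED) {
                    jobOperator.restart(execution.getId());
                    break;
                }
            }
        }
    }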
0
votes
0 answers

Step ExecutionContext not promoted using Spring Cloud Task on Spring Cloud Data Flow

I successfully deployed a remote partitioned job using Spring Cloud Data Flow and Spring Cloud Task; the installation is based on Kubernetes, so I added the Kubernetes implementation of Spring Cloud Deployer to the project. But it seems that it's…
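
For comparison, the standard single-JVM promotion setup is the ExecutionContextPromotionListener shown below (the key and step names are placeholders); with remote partitioning each worker persists its own step ExecutionContext, and whether those values ever surface in the manager's job ExecutionContext is exactly what this question is about.

    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
    import org.springframework.batch.core.listener.ExecutionContextPromotionListener;
    import org.springframework.batch.repeat.RepeatStatus;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class PromotionConfig {

        // Copies the listed keys from the step ExecutionContext to the job ExecutionContext
        // when the step finishes.
        @Bean
        public ExecutionContextPromotionListener promotionListener() {
            ExecutionContextPromotionListener listener = new ExecutionContextPromotionListener();
            listener.setKeys(new String[] {"partitionResult"});   // placeholder key
            return listener;
        }

        @Bean
        public Step workerStep(StepBuilderFactory steps) {
            return steps.get("workerStep")
                    .tasklet((contribution, chunkContext) -> {
                        chunkContext.getStepContext().getStepExecution()
                                .getExecutionContext().put("partitionResult", "done");
                        return RepeatStatus.FINISHED;
                    })
                    .listener(promotionListener())
                    .build();
        }
    }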
0
votes
1 answer

Forbidden error using partitioned job with Spring Cloud Data Flow on Kubernetes

I want to implement a remote partitioned job using Spring Cloud Data Flow on Kubernetes. The Skipper server is not installed because I just need to run tasks and jobs. I modified the partitioned batch job sample project using…
0
votes
1 answer

Getting a NullPointerException in launchWorker for a Spring Batch job run through Spring Cloud Task using DeployerPartitionHandler

I'm using DeployerPartitionHandler (local variant) to partition my Spring Batch job. When I run my job, I'm getting a null pointer exception in the launch step of the worker, as below, at…
TVB • 1 • 1
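
With the deployer-based partition handler, a frequent source of worker-side NPEs is missing worker wiring. In the sample projects the worker runs a DeployerStepExecutionHandler roughly as sketched here (the "worker" profile name is an assumption), so the handler is only active in the launched worker and never in the manager JVM:

    import org.springframework.batch.core.explore.JobExplorer;
    import org.springframework.batch.core.repository.JobRepository;
    import org.springframework.cloud.task.batch.partition.DeployerStepExecutionHandler;
    import org.springframework.context.ConfigurableApplicationContext;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.context.annotation.Profile;

    @Configuration
    public class WorkerConfig {

        // Only active in the launched worker task, not in the manager.
        @Bean
        @Profile("worker")
        public DeployerStepExecutionHandler stepExecutionHandler(ConfigurableApplicationContext context,
                                                                 JobExplorer jobExplorer,
                                                                 JobRepository jobRepository) {
            return new DeployerStepExecutionHandler(context, jobExplorer, jobRepository);
        }
    }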
0
votes
0 answers

Spring Cloud Data Flow - Task execution fails with error ErrImagePull

Added the Task Application using a Docker image, following the syntax docker://url_docker_img_for_task_app_from_private_repo. This was successful. Next, created a task and executed it. Looking at the logs for the app pod created by SCDF, the…
0
votes
1 answer

Dynamically change maxWorkers with DeployerPartitionHandler

Based on the number of partitions returned, can maxWorkers be changed dynamically (at runtime) when using DeployerPartitionHandler? Regards, Balu. UPDATE: Please find my use case. The batch execution starts for a normal business day with maxWorkers as "4" and…
Balu R • 87 • 1 • 1 • 10
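
Since maxWorkers is just a setter on the handler, one sketch of making it dynamic, assuming DeployerPartitionHandler and a hypothetical job parameter named maxWorkers, is to declare the handler @StepScope and resolve the value at launch time; the Docker image and step name below are placeholders:

    import org.springframework.batch.core.configuration.annotation.StepScope;
    import org.springframework.batch.core.explore.JobExplorer;
    import org.springframework.batch.core.partition.PartitionHandler;
    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.cloud.deployer.resource.docker.DockerResource;
    import org.springframework.cloud.deployer.spi.task.TaskLauncher;
    import org.springframework.cloud.task.batch.partition.DeployerPartitionHandler;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.core.io.Resource;

    @Configuration
    public class DynamicWorkersConfig {

        @Bean
        @StepScope  // resolved per step execution, so job parameters are available
        public PartitionHandler partitionHandler(TaskLauncher taskLauncher,
                                                 JobExplorer jobExplorer,
                                                 @Value("#{jobParameters['maxWorkers']}") Integer maxWorkers) {
            Resource workerResource = new DockerResource("my-registry/partitioned-job:latest"); // placeholder
            DeployerPartitionHandler handler =
                    new DeployerPartitionHandler(taskLauncher, jobExplorer, workerResource, "workerStep");
            // e.g. 4 on a normal business day, a larger value passed in for a backlog run
            handler.setMaxWorkers(maxWorkers != null ? maxWorkers : 4);
            return handler;
        }
    }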
0
votes
1 answer

How to post a message to a destination from a spring cloud data flow task?

Is there a correct/preferred way to send a message from a Task to a Destination using Spring Cloud Data Flow? We have an existing stream with destinations, and would like a scheduled task to also feed messages into the stream via one of the…
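
One way this is often done, sketched below on the assumption that spring-cloud-stream 3.x and a binder are on the task's classpath and that the destination binding is called task-events, is StreamBridge, which lets a task publish to a named destination without being a full stream application:

    import org.springframework.boot.CommandLineRunner;
    import org.springframework.cloud.stream.function.StreamBridge;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class TaskMessagingConfig {

        // Publishes one message to the named destination when the task runs.
        @Bean
        public CommandLineRunner publish(StreamBridge streamBridge) {
            return args -> streamBridge.send("task-events", "task finished at " + System.currentTimeMillis());
        }
    }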
0
votes
1 answer

Spring Boot + Batch + Cloud Task: @EnableTask annotation with a single datasource causes a "Sequence does not exist" issue

Currently I'm using Spring Boot (2.3.3), Spring Batch (4.2.4) and spring-cloud-starter-task (2.2.3) with a single datasource (Oracle). My BatchConfiguration extends DefaultBatchConfigurer and calls setDataSource. Now…
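
A "Sequence does not exist" on Oracle usually points at the TASK_SEQ (or the Batch) sequence from the framework DDL scripts not existing in that schema. Beyond running the schema scripts, pointing both frameworks explicitly at the single DataSource can be sketched like this (class name is illustrative):

    import javax.sql.DataSource;

    import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
    import org.springframework.cloud.task.configuration.DefaultTaskConfigurer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class SingleDataSourceConfig {

        // Spring Batch metadata (BATCH_*) on the one Oracle DataSource.
        @Bean
        public DefaultBatchConfigurer batchConfigurer(DataSource dataSource) {
            return new DefaultBatchConfigurer(dataSource);
        }

        // Spring Cloud Task metadata (TASK_*, including TASK_SEQ) on the same DataSource.
        @Bean
        public DefaultTaskConfigurer taskConfigurer(DataSource dataSource) {
            return new DefaultTaskConfigurer(dataSource);
        }
    }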
0
votes
1 answer

How does a Spring Boot app exit after it runs a Cloud Task?

Related to this article: https://www.baeldung.com/spring-cloud-task and this example: https://github.com/spring-cloud/spring-cloud-task/blob/master/spring-cloud-task-samples/timestamp How does the Spring Boot app exit after it runs the task? Where…
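
In the timestamp sample nothing calls exit explicitly: the CommandLineRunner completes, main returns, and the JVM stops once no non-daemon threads remain. If something (a thread pool, a scheduler) keeps the JVM alive, the Boot-documented pattern below forces a clean shutdown with a mapped exit code; this is a general Boot idiom rather than anything the sample itself requires.

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;

    @SpringBootApplication
    public class ExitingTaskApplication {

        public static void main(String[] args) {
            // Close the context, map it to an exit code, and terminate the JVM explicitly.
            System.exit(SpringApplication.exit(SpringApplication.run(ExitingTaskApplication.class, args)));
        }
    }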