Questions tagged [spring-cloud-dataflow]

Spring Cloud Data Flow is a toolkit for building data integration and real-time data processing pipelines. Pipelines consist of Spring Boot apps, built using the Spring Cloud Stream or Spring Cloud Task microservice frameworks. This makes Spring Cloud Data Flow suitable for a range of data processing use cases, from import/export to event streaming and predictive analytics.

Use this tag for questions about the Spring Cloud Data Flow project. It is not intended for general questions about integrating other Spring projects with other technologies.

Spring Cloud Data Flow's Official Project Site

Spring Cloud Data Flow Website

Spring Cloud Data Flow's GitHub Repo

How to contribute

How to report issues


1276 questions
3 votes, 2 answers

Does spring-cloud-dataflow provide support for scheduling applications defined as tasks?

I have been looking at using projects built using spring-cloud-task within spring-cloud-dataflow. Having looked at the example projects and the documentation, the indication seems to be that tasks are launched manually through the dashboard or the…
Nigel R • 71 • 1 • 4
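Later SCDF releases answered this by adding scheduled task launches on platforms that provide a scheduler (Kubernetes, Cloud Foundry). A sketch of the SCDF shell commands, assuming a registered `timestamp` task app; the task name, schedule name, and cron expression are illustrative:

```
dataflow:> task create my-task --definition "timestamp"
dataflow:> task schedule create --name my-schedule --definitionName my-task --expression "*/5 * * * *"
```

On platforms without a scheduler, launches still have to be triggered manually or by an external scheduler calling the SCDF REST API.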
3 votes, 1 answer

Spring Cloud Data Flow - microservices deployment

Team, currently I am working on Spring XD and using it as a runtime container for data analytics and YARN jobs. My questions are: 1) Can I leverage the same environment setup that I used for Spring XD? 2) From the documentation, I read that it can be…
3 votes, 1 answer

Spring Cloud Data Flow support of Swarm

Currently I can see that Spring Cloud Data Flow has these servers: Local, YARN, Cloud Foundry, Mesos, and Kubernetes; is there any plan for Swarm support?
cten • 41 • 3
2 votes, 0 answers

Exception while launching a composed task with Spring Cloud Data Flow

I am trying to launch a composed task with Spring Cloud Data Flow. The tasks are billrun and billsetup mentioned in the docs. While launching, I get the following exception: Caused by: org.eclipse.aether.resolution.ArtifactResolutionException: Could…
Aritra Roy • 23 • 3
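An ArtifactResolutionException like the one above usually means the SCDF server cannot resolve a registered app's Maven artifact. One hedged fix is to point the server at a reachable remote repository via its `maven.remote-repositories` properties when starting it (the repository name and URL below are illustrative):

```
java -jar spring-cloud-dataflow-server-local.jar \
  --maven.remote-repositories.repo1.url=https://repo.maven.apache.org/maven2
```

Behind a corporate proxy, the corresponding `maven.proxy.*` properties from the SCDF reference guide would also need to be set.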
2 votes, 0 answers

Spring Cloud Dataflow - Set max-connection-pool for Composed Task Runner

I've encountered an issue on Spring Cloud Dataflow when running multiple composed tasks at once. Hikari DataSource takes 10 connections from the connection pool by default. When running for example 10 composed tasks at once, this means 100…
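One common mitigation (a sketch, not an official recommendation) is to cap Hikari's pool size for the composed-task-runner via task launch properties; the task name below is illustrative:

```
# Task launch property: limit Hikari to 2 connections per composed task,
# so ten concurrent composed tasks need at most 20 instead of 100.
app.my-composed-task.spring.datasource.hikari.maximum-pool-size=2
```

`spring.datasource.hikari.maximum-pool-size` is the standard Spring Boot property for HikariCP's pool size; sizing it too low can stall batch steps that run in parallel, so the value needs testing per workload.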
2 votes, 0 answers

How to get the process id of a Spring Cloud Task triggered from Spring Cloud Data Flow

How to get the process id of a Spring Cloud Task triggered from Spring Cloud Data Flow? If we trigger a job from Spring Cloud Data Flow, an execution id is assigned to every task, but I need the process id of the triggered job. When we triggered…
Abhi Ram • 41 • 1
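SCDF itself only exposes the task execution id, not the OS pid; one workaround (a sketch, assuming Java 9+) is for the task to log its own pid at startup via `ProcessHandle`, so the pid can be correlated with the execution id in the logs:

```java
// Sketch: a task logs its own OS process id so it can be correlated
// with the SCDF task execution id. Class name is illustrative.
public class PidReporter {
    public static void main(String[] args) {
        long pid = ProcessHandle.current().pid();
        System.out.println("task pid=" + pid);
    }
}
```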
2 votes, 1 answer

Spring Cloud Data Flow: complex parallel processors scenario?

Is it possible to implement a complex parallel-processor flow in Spring Cloud Data Flow using the preset modules? For example: processors 1, 2, and 3 are all preset modules (httpclient etc.). Processors 1 and 3 will get the same message from the source at…
user3908406 • 1,416 • 1 • 18 • 32
2 votes, 0 answers

Spring Cloud Router Sink 3.0.2 requires Kafka Avro (De)Serializer

I'm using SCDF (2.5.3.RELEASE) + Apache Kafka + Avro for messages (de)serialization. SCDF stream definitions involve standard spring cloud app starter router-sink (version 2.1.5.RELEASE). Router-sink configuration is pretty simple: router: …
2 votes, 0 answers

Automatically delete pods with status 'completed' periodically and across namespaces

I have Spring Cloud Data Flow deployed in multiple namespaces of my Kubernetes cluster. Additionally, a task is registered there which is executed from time to time. Executing a task in SCDF on Kubernetes will create a pod for each execution, and…
Manu • 284 • 2 • 20
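Absent built-in cleanup, one hedged operational sketch is a periodic job (e.g. a Kubernetes CronJob with suitable cluster-wide RBAC) that deletes completed pods in every namespace:

```
# Sketch: delete all Succeeded pods across namespaces.
for ns in $(kubectl get namespaces -o jsonpath='{.items[*].metadata.name}'); do
  kubectl delete pods -n "$ns" --field-selector=status.phase=Succeeded
done
```

`status.phase=Succeeded` is a standard pod field selector; a real deployment would likely scope this to the namespaces SCDF launches tasks into rather than the whole cluster.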
2 votes, 0 answers

SpringCloud Dataflow Keycloak Angular 8 integration - 401 Unauthorized (sometimes(?))

I am working on a project with the Spring Boot framework at the backend and Angular 8 at the frontend. The connection between front and back is made with an nginx proxy server. I wanted to integrate with Keycloak and I used the keycloak-angular v6.1.0 library…
2 votes, 1 answer

Task execution is not working after launching the task in Spring Cloud Data Flow

I have created a Spring Boot application with the @EnableTask annotation and tried to print the arguments in the log. package com.custom.samplejob; import org.springframework.boot.CommandLineRunner; import…
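For reference, a minimal Spring Cloud Task sketch of what such an app looks like (the annotation is @EnableTask; package and class names are illustrative, and the Spring Boot and Spring Cloud Task dependencies are assumed on the classpath):

```java
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

@EnableTask
@SpringBootApplication
public class SampleTaskApplication implements CommandLineRunner {

    public static void main(String[] args) {
        SpringApplication.run(SampleTaskApplication.class, args);
    }

    @Override
    public void run(String... args) {
        // Log each launch argument passed in by SCDF.
        for (String arg : args) {
            System.out.println("arg=" + arg);
        }
    }
}
```

@EnableTask is what records the run in the task repository, so SCDF can show the execution; without it the app runs but no task execution appears.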
2 votes, 1 answer

Violation of Foreign Key constraint in "task_metadata_fk" when launching a Task in SCDF Dashboard

I am trying to launch a task using the SCDF Dashboard. I am using CockroachDB as the underlying persistence layer with Hibernate ORM version 5.4.22.Final which is supported by the 2.7.0 release of SCDF as well as 2.8.0-SNAPSHOT. I am able to create…
2 votes, 1 answer

Spring Cloud Data Flow Pod Cleanup

We are repeatedly seeing resource-quota limitation issues in the logs, and task jobs fail on SCDF running on Kubernetes. The problem is that there are so many pods in "running" status even after they completed. I understand SCDF does not delete the pods and…
2 votes, 2 answers

How does one update the version of Hibernate that SCDF uses?

I am trying to get Spring Cloud Data Flow to work with CockroachDB as its persistence layer. The problem I have is that CockroachDB does not support the PostgreSQL large-object server-side functions (e.g. lo_create) which the default PostgreSQL…
Louis • 51 • 5
2 votes, 0 answers

NullPointerException When Launching Task in Spring Cloud Dataflow - in Kubernetes Environment

I am setting up Spring Cloud Data Flow in a Kubernetes cluster environment. I faced this problem only on the Kubernetes deployment. FYI, I am using server 2.7.0 (https://dataflow.spring.io/docs/2.7.0.SNAPSHOT/recipes/batch/batch-only-mode/) without…