Can I use spring-cloud-dataflow (Kubernetes) for recursive tasks? I need to create a new instance of a flow depending on the current results (recursively), and I need distributed processing of tasks.
What you're looking for is the "composed task" feature that we had in Spring XD; porting it over to Spring Cloud Data Flow is on the roadmap. In the meantime, you could programmatically invoke a Task, based on an upstream task's result, via the REST APIs or the DataFlowTemplate.
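As a sketch of that interim approach (the function names and the stopping callback below are illustrative, not part of SCDF's API): launch the task, inspect the upstream execution's result, and launch again until the result says you are done. The `launch` callable is whatever actually starts the task, e.g. a `POST` to the Data Flow server's `/tasks/executions` REST endpoint or a call through the Java `DataFlowTemplate`:

```python
# Sketch of a recursive task-launch loop. `launch` is whatever starts
# one SCDF task execution (REST call or DataFlowTemplate); it is
# injected here so the control flow can be shown without a running
# Data Flow server. All names are assumptions for illustration.

def run_recursively(launch, needs_another_pass, max_depth=10):
    """Launch the task, inspect its result, and re-launch until the
    result says we are done (or a safety limit is hit)."""
    for depth in range(max_depth):
        result = launch()                    # start one task execution
        if not needs_another_pass(result):   # inspect the upstream result
            return depth + 1                 # total passes performed
    raise RuntimeError("recursion limit reached without converging")
```

In practice, `launch()` would POST to `/tasks/executions` on the Data Flow server (or call the task operations on a `DataFlowTemplate`), and `needs_another_pass` would poll the finished execution's status or exit code to decide whether another pass is required.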
1) Do I need to use a message source? As I understand it, Data Flow uses its own message broker (RabbitMQ, Kafka).
SCDF doesn't have its own message broker. What we provide is a binder abstraction, with implementations for Kafka, RabbitMQ, Google Pub/Sub, and others. You can choose the binder implementation that works best for your requirements.
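Concretely, an individual stream app selects its binder by having the binder dependency on the classpath and, when several binders are present, naming one in its configuration. A minimal sketch for the Rabbit binder (the connection values are assumptions for a local broker):

```properties
# application.properties of one stream app (not of the SCDF server).
# Pick the Rabbit binder when more than one binder is on the classpath:
spring.cloud.stream.defaultBinder=rabbit
# Connection settings for the chosen middleware (assumed local defaults):
spring.rabbitmq.host=localhost
spring.rabbitmq.port=5672
```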
Do I need to install another broker, or can I create a queue in Data Flow's broker?
SCDF itself doesn't require a broker; rather, it is the "apps" included in a streaming pipeline that need one. Please watch this recording from the SpringOne conference earlier this year; it covers the basics and the architecture specifics.
2) Can I do that without queue?
Stream processing in SCDF requires messaging middleware, and you have the option to choose the middleware you prefer. By default, we release the stream apps for both RabbitMQ and Kafka.
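For instance, once the Rabbit-based (or Kafka-based) stream apps are registered, a pipeline is defined with the stream DSL, and SCDF wires the apps together over the middleware it provisions between them. A sketch from the Data Flow shell (the stream name and definition here are made up):

```
stream create --name ingest-pipe --definition "http | transform | log" --deploy
```

Here the `http`, `transform`, and `log` apps each connect to the configured broker; the queues/topics between the pipe symbols are created for you rather than by hand.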
Stream vs task?
It is unclear what you mean by this.