
I am trying to store the last read record of my source in a separate Kafka topic for a stream source. How can I achieve this with a Spring Cloud Data Flow stream app? Any suggestion would be of great help.

shelzz

1 Answer


Spring Cloud Stream applications can support multiple destinations.

You can add a second output destination and send a message to it.
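Using the annotation-based binding model, that looks something like the sketch below; the OffsetBindings interface, the offsetStore binding name, and the string payload are illustrative assumptions, not part of any app starter.

    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.annotation.Output;
    import org.springframework.messaging.MessageChannel;
    import org.springframework.messaging.support.MessageBuilder;

    // Declares a second output channel alongside the source's normal output.
    interface OffsetBindings {

        String OFFSET_STORE = "offsetStore";

        @Output(OFFSET_STORE)
        MessageChannel offsetStore();
    }

    @EnableBinding(OffsetBindings.class)
    public class SourceWithOffsetStore {

        private final OffsetBindings bindings;

        public SourceWithOffsetStore(OffsetBindings bindings) {
            this.bindings = bindings;
        }

        // Publish the last-read marker to the extra destination.
        void recordLastRead(long timestamp) {
            bindings.offsetStore().send(
                    MessageBuilder.withPayload(Long.toString(timestamp)).build());
        }
    }

The extra binding is then mapped to a topic in application properties, e.g. spring.cloud.stream.bindings.offsetStore.destination=last-read-offsets.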

You commented: "I want to use an RDBMS as the source. The current JDBC app starter needs an extra flag column in the source table to mark a row as read, but in most scenarios that won't be possible, so I am trying to build it based on a timestamp. I will store the last read timestamp in a separate topic and use that timestamp each time to continue reading from the RDBMS (incremental load)."
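A minimal sketch of that incremental load, assuming a JdbcTemplate and hypothetical source_table/updated_at names (the actual query depends on your schema):

    import java.sql.Timestamp;
    import java.util.List;
    import java.util.Map;

    import org.springframework.jdbc.core.JdbcTemplate;

    public class TimestampPoller {

        private final JdbcTemplate jdbc;
        private volatile Timestamp lastRead;

        public TimestampPoller(JdbcTemplate jdbc, Timestamp initial) {
            this.jdbc = jdbc;
            this.lastRead = initial;
        }

        // Select only rows modified since the last successful read,
        // then advance the watermark to the newest row seen.
        List<Map<String, Object>> pollNewRows() {
            List<Map<String, Object>> rows = jdbc.queryForList(
                    "SELECT * FROM source_table WHERE updated_at > ? ORDER BY updated_at",
                    lastRead);
            if (!rows.isEmpty()) {
                lastRead = (Timestamp) rows.get(rows.size() - 1).get("updated_at");
            }
            return rows;
        }
    }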

You can consume the last record from that topic during startup to get the initial starting value.
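For example, here is a sketch of recovering the starting value with a plain KafkaConsumer, assuming the timestamps were published as string payloads to a single-partition topic named last-read-offsets, and a kafka-clients version that has poll(Duration):

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.TopicPartition;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class LastTimestampLoader {

        public static String loadLastTimestamp(String bootstrapServers) {
            Properties props = new Properties();
            props.put("bootstrap.servers", bootstrapServers);
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());
            // No group.id: the partition is assigned manually and offsets are
            // never committed; only the latest record matters.
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                TopicPartition tp = new TopicPartition("last-read-offsets", 0);
                consumer.assign(Collections.singletonList(tp));
                consumer.seekToEnd(Collections.singletonList(tp));
                long end = consumer.position(tp);
                if (end == 0) {
                    return null; // topic is empty; start the load from scratch
                }
                consumer.seek(tp, end - 1); // rewind to the last record only
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                String last = null;
                for (ConsumerRecord<String, String> record : records) {
                    last = record.value(); // most recently stored timestamp
                }
                return last;
            }
        }
    }

If the loader returns null, the source would fall back to some configured initial timestamp (for example, the epoch) for the first run.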

Gary Russell
  • Thanks for the response. The purpose is to continue from the point where it stopped each time, so if I use a second output destination, as per my understanding, I won't be able to read from it to continue the source stream. Please correct me if I'm wrong. – shelzz Apr 04 '17 at 13:06
  • The binder already takes care of committing the offsets, so you start from where you left off. – Gary Russell Apr 04 '17 at 13:56
  • I want to use an RDBMS as the source. The current JDBC app starter needs an extra flag in the source table to mark the row as read, but in most scenarios that won't be possible, so I am trying to build it based on a timestamp. I will store the last read timestamp in a separate topic and use it each time to continue reading from the RDBMS (incremental load). – shelzz Apr 04 '17 at 15:13
  • Perhaps you don't understand the software; this is indeed an answer, if a bit terse; I'll enhance it. – Gary Russell Apr 05 '17 at 17:08