
I have a Spring Batch Boot app that takes a flat file as input. I converted the app into a Spring Cloud Task and deployed it to a local Spring Cloud Data Flow server. Next, I created a stream: File Source -> tasklaunchrequest-transform -> task-launcher-local, which starts my batch cloud task app.

It looks like the file does not reach the batch app, and I do not see anything in the logs to indicate that it does.

I checked the docs at https://github.com/spring-cloud-stream-app-starters/tasklaunchrequest-transform/tree/master/spring-cloud-starter-stream-processor-tasklaunchrequest-transform

It says:

Any input type. (payload and header are discarded)

My question is: how do I pass the file as the payload from the File source to the batch app? This seems like a very basic feature.

Any help is very much appreciated.


1 Answer


You'll need to write your own transformer that takes the data from the source and packages it up so your task can consume it.
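A minimal sketch of the core of such a transformer, under the assumption (from the comments below) that what the task actually needs is the file's location rather than its bytes. The class and the `localFilePath` argument name are illustrative, not from any starter; the real processor would wrap this into a TaskLaunchRequest message.

```java
import java.util.Collections;
import java.util.List;

// Illustrative core of a custom transformer: instead of forwarding the
// file's contents, forward its location as a command-line argument that
// the launched batch job can read as a job parameter.
public class FilePathToTaskArgs {

    // Build the argument list a task launch request would carry.
    // The property name "localFilePath" is an assumption for this sketch.
    public static List<String> toTaskArgs(String absolutePath) {
        return Collections.singletonList("localFilePath=" + absolutePath);
    }

    public static void main(String[] args) {
        System.out.println(toTaskArgs("/tmp/input/data.csv"));
    }
}
```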

Michael Minella
  • Thanks Michael, I wish the Spring team would add an option to send the payload. For now, I will create my own transformer. – shiva Mar 02 '18 at 05:51
  • After much investigation, I found that TaskLaunchRequest supports only a String. Since my batch app is wrapped as a cloud task, and there is no way to send a file as a parameter when starting the batch, is there a way to copy the file in Cloud Foundry and pass the file location to the batch app, which is also deployed in CF? – shiva Mar 02 '18 at 10:07
  • My use case is to create a Spring Cloud Stream as follows: File/Ftp/Sftp/Http Source -> Batch App [wrapped as Cloud Task]. – shiva Mar 02 '18 at 10:08
  • When running on CF, you actually want the batch job to download the file itself (so that it's there on a restart if needed). So in your case, you'd want the stream source to send the file's location to the batch job, then have the batch job do the actual download. I have an example of downloading from an S3 bucket in my S3JDBC batch job, found here: https://github.com/mminella/S3JDBC – Michael Minella Mar 02 '18 at 16:00
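The pattern in the comment above, sketched with plain JDK I/O (URIs and paths here are illustrative, and a real job would get the source URI as a job parameter): the stream tells the job where the file is, and an early step fetches it, so a restarted job can simply re-fetch.

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

// Sketch of a "download first" step: the batch job receives only the
// file's location and copies it to local storage before processing,
// making the step repeatable on restart.
public class DownloadStep {

    public static Path download(URI source, Path target) throws IOException {
        try (InputStream in = source.toURL().openStream()) {
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        }
        return target;
    }

    public static void main(String[] args) throws IOException {
        // Self-contained demo: create a sample "remote" file, then fetch it.
        Path remote = Files.createTempFile("input", ".csv");
        Files.write(remote, "id,name\n1,a\n".getBytes());
        Path local = download(remote.toUri(), Files.createTempFile("local", ".csv"));
        System.out.println("downloaded " + Files.size(local) + " bytes to " + local);
    }
}
```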
  • Thanks Michael, I will try this out. – shiva Mar 03 '18 at 11:00
  • Hi Michael, I am now trying the same with Spring Integration instead of Cloud Stream. I created an integration flow with an SFTP stream inbound adapter; the flow saves the file contents into a table. I start this adapter from batch job step 1, and step 2 reads the file contents from the table and processes them. The problem I have is how to pause step 1 until the integration flow is complete. The whole app is Spring Boot with Spring Batch and Spring Integration. The reason for not starting the job from the integration flow is job restartability. Thanks. – shiva Mar 15 '18 at 14:50
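One common way to make a step wait for an external flow, sketched with a plain `CountDownLatch` (the class and method names here are illustrative, not Spring API): the batch step blocks on the latch, and the integration flow's completion handler counts it down.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

// Sketch of gating a batch step on an integration flow: step 1 starts
// the adapter, then awaits the latch; the flow releases it when done.
public class FlowGate {
    private final CountDownLatch done = new CountDownLatch(1);

    // Called from the integration flow when the transfer is complete.
    public void flowFinished() {
        done.countDown();
    }

    // Called from the batch step (e.g. inside a Tasklet) to pause until
    // the flow signals completion, or until the timeout elapses.
    public boolean awaitFlow(long timeout, TimeUnit unit) throws InterruptedException {
        return done.await(timeout, unit);
    }
}
```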