
My requirement is to get a file from SFTP and make it available for a client to download. I am required to do this using Spring Cloud Data Flow.

In the documentation, I saw that there is an SFTP to JDBC File Ingest tutorial (https://dataflow.spring.io/docs/recipes/batch/sftp-to-jdbc/).

So my question is: can we transfer a file through Spring Cloud Data Flow, rather than reading the file and inserting it into the database?

Thanks, Dasun.

Dasun

1 Answer


Yes, you can. It’s similar to the sftp-to-jdbc example, which downloads the file to a shared file system from which the batch job reads it. You can create a simple pipeline like sftp | s3, sftp | file, or sftp | sftp, depending on your specific use case.
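As a minimal sketch, such a pipeline can be created from the Data Flow shell, assuming the out-of-the-box sftp source and s3 sink applications are registered. The host, credentials, directory, and bucket values below are illustrative, and exact property names vary between stream-application releases, so check the docs for your version:

```shell
# Create an SFTP-to-S3 stream; every property value here is a placeholder
stream create --name sftp-to-s3 \
  --definition "sftp --host=sftp.example.com --username=demo --password=secret --remote-dir=/outbound | s3 --s3.bucket=my-bucket"

# Deploy it on the configured platform (Local, Kubernetes, Cloud Foundry)
stream deploy --name sftp-to-s3
```

Swapping the sink for `file` or another `sftp` app changes only the right-hand side of the definition.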

dturanski
  • Thanks for the comment. I am very new to this; can you please suggest some good reading material? My use case: when a client requests a file, it should be fetched from SFTP and made available for the client to download, and the pipeline should be created with Spring Cloud Data Flow. – Dasun Jul 27 '21 at 18:55
  • How will the client request the file? Does it already exist? Does the client know the file path on the SFTP server, and from where will they download the file? Can they just log in to the SFTP server and get it? – dturanski Jul 28 '21 at 19:56
  • Yes, the requested file already exists. The request is made through a form submission: once the client submits the form with the file name, and if the file is available on SFTP, that file needs to be made available for the client to download. What is the best approach here? The requirement is to use Spring Cloud Data Flow. – Dasun Jul 29 '21 at 04:10
  • Using Spring Cloud Data Flow is not a requirement; it’s an implementation choice, and maybe not the best one here. Your requirement is to make the file available. When the file is on SFTP, it is already available if the client can log in. Do you need to email the client when it is ready? – dturanski Jul 30 '21 at 11:00
  • Can we read the file and get the byte array when the client requests data, using Spring Cloud Data Flow? This is an R&D project, and I don't yet have a clear idea of where I should use Spring Cloud Data Flow. – Dasun Aug 01 '21 at 19:06
  • Suppose the files are stored in S3: when a client requests a data file, we could create a pipeline that transfers the file to the client's specific S3 bucket, i.e. S3 | S3 using Spring Cloud Data Flow. – Dasun Aug 01 '21 at 19:11
  • The S3 source is triggered by new files appearing in a bucket. This sounds like you want an HTTP source: the client UI would post an HTTP request to the source with the file name, etc., then you need to write a custom processor to determine the destination bucket and send a payload to the s3 sink. The sink is configured with a `bucket-expression` property that takes a SpEL expression to dynamically evaluate the bucket for each request. This can be something like `headers[bucket_name]`. See https://github.com/spring-cloud/stream-applications/tree/main/applications/sink/s3-sink for more details. – dturanski Aug 02 '21 at 22:06
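The stream described in the comment above could be sketched roughly as follows. The `bucket-router` processor name is hypothetical (it stands in for the custom processor you would write to set a `bucket_name` header), and the exact property prefixes should be verified against the s3-sink README for your release:

```shell
# HTTP source receives the client's request; a custom processor (here called
# bucket-router, which you would implement yourself) adds a bucket_name header;
# the s3 sink resolves the target bucket per message via SpEL.
stream create --name http-to-s3 \
  --definition "http --port=9000 | bucket-router | s3 --s3.bucket-expression=headers['bucket_name']"

stream deploy --name http-to-s3
```

The key idea is that the bucket is not fixed at deployment time: the SpEL expression is evaluated against each message, so one stream can serve many clients.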
  • I do not clearly understand what you are saying. Do I have to implement ETL with Data Flow? That is, once the client requests data, it is extracted from a file stored in S3, transformed, and then sunk into a client-side folder so the client can consume it. Is that what you mean here? – Dasun Aug 05 '21 at 17:42