I want to know the best practice for FTPing a batch of new documents. I am working on a service that will take a batch of documents and FTP them to a remote server. I will maintain a transaction log for each document that arrives, for reporting purposes, for example: how many docs arrived on a particular day? How many couldn't be transferred due to an FTP connection failure? If the app fails for some reason, it should be able to resume from where it left off.
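Roughly, I'm imagining each document getting a status entry like this (the type and field names below are just placeholders I made up, not a final design):

```java
import java.time.Instant;

// RECEIVED -> TRANSFERRED on success, RECEIVED -> FAILED on an FTP error.
enum TransferStatus { RECEIVED, TRANSFERRED, FAILED }

// One entry per document; the status drives both the daily report
// and knowing where to resume after a crash.
record TransferRecord(
        String documentId,
        Instant arrivedAt,
        TransferStatus status,
        String failureReason) { }   // e.g. "connection refused", null on success
```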
The possibilities I can think of are below:
Load the documents into the database along with the transaction details for each doc, so that on failure recovery it can start from where it left off.
Store the docs in the local filesystem and write a file watcher on this folder that FTPs whatever arrives there (rough sketch after the list).
FTP each doc as soon as it arrives. In that case, if lots of docs arrive at once, I think it might result in an OutOfMemoryError.
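For option 2, this is the kind of watcher I have in mind; just a rough sketch assuming Apache Commons Net's FTPClient and java.nio.file.WatchService, with the folder, host, and credentials as placeholders. As far as I understand, storeFile copies from an InputStream, so the whole document is never held in memory, which is also my worry in option 3:

```java
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.*;

public class FtpFolderWatcher {

    public static void main(String[] args) throws IOException, InterruptedException {
        Path inbox = Paths.get("/var/docs/inbox");          // placeholder folder
        WatchService watcher = FileSystems.getDefault().newWatchService();
        inbox.register(watcher, StandardWatchEventKinds.ENTRY_CREATE);

        while (true) {
            WatchKey key = watcher.take();                   // blocks until something lands in the folder
            for (WatchEvent<?> event : key.pollEvents()) {
                Path file = inbox.resolve((Path) event.context());
                try {
                    upload(file);
                    // on success: record the doc as TRANSFERRED in the transaction log
                } catch (IOException e) {
                    // on failure: record it as FAILED so it can be retried after a restart
                }
            }
            key.reset();
        }
    }

    // Streams the file to the server so the document is never fully loaded into memory.
    private static void upload(Path file) throws IOException {
        FTPClient ftp = new FTPClient();
        try {
            ftp.connect("ftp.example.com");                  // placeholder host
            ftp.login("user", "password");                   // placeholder credentials
            ftp.enterLocalPassiveMode();
            ftp.setFileType(FTP.BINARY_FILE_TYPE);
            try (InputStream in = Files.newInputStream(file)) {
                if (!ftp.storeFile(file.getFileName().toString(), in)) {
                    throw new IOException("Transfer failed: " + ftp.getReplyString());
                }
            }
            ftp.logout();
        } finally {
            ftp.disconnect();
        }
    }
}
```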
Which approach is best in terms of error recovery without being memory intensive?