I'm monitoring a local folder where various files are written. When a file matches a pattern, I send it to a remote server via a second script (transfer_file.sh); once the transfer completes successfully (via SCP), I use rsync with the --remove-source-files flag to move the processed file into a local backup folder, which also removes it from the monitored folder. When a huge number of files is being transferred, rsync may not have finished by the time the sleep expires, so the next cycle of the loop can pick up files that were already handed off in the previous iteration. How can I unlist, or avoid reprocessing, files already passed to transfer_file.sh? I have increased the sleep time, but I need a cleaner solution in case I suddenly have to process hundreds of files.
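For context, the success-gated sequence described above (SCP first, then rsync --remove-source-files only if the copy succeeded) can be sketched as a function. This is a hypothetical reconstruction of what transfer_file.sh might do; REMOTE and BACKUP_FOLDER are placeholder names, not from the question:

```shell
#!/bin/sh
# Hypothetical sketch of the transfer step; $1 is the file to send.
# REMOTE and BACKUP_FOLDER are placeholders, not from the question.

transfer_one() {
    LOG_FILE=$1
    # scp returns non-zero on failure, so the backup/removal step
    # only runs after a successful remote copy.
    if scp "$LOG_FILE" "$REMOTE"; then
        # --remove-source-files deletes the file from the monitored
        # folder once it is safely copied into the local backup.
        rsync --remove-source-files "$LOG_FILE" "$BACKUP_FOLDER"
    fi
}
```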
if [ -d "$MONITOR_FOLDER" ]; then
    while true; do
        echo "$(date +%c) monitor() |Main| Monitoring local repository"
        # Expand the glob directly instead of parsing ls output;
        # skip the iteration when nothing matches.
        for LOG_FILE in "$MONITOR_FOLDER"$LOG_PATTERN; do
            [ -e "$LOG_FILE" ] || continue
            sh transfer_file.sh "$LOG_FILE" &
        done
        sleep 10
    done
fi
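One way to avoid reprocessing is to "claim" each file before handing it off: move it out of the monitored folder into a separate staging directory, so the next polling cycle can no longer list it. If the staging directory is on the same filesystem, mv is a plain rename and effectively atomic. A minimal sketch of one polling pass, assuming a hypothetical $STAGING_FOLDER (both folder variables ending in a trailing slash, as in the question):

```shell
#!/bin/sh
# Sketch: claim-then-transfer. Assumes a hypothetical STAGING_FOLDER on the
# same filesystem as MONITOR_FOLDER, so mv is an atomic rename.
# transfer_file.sh is the existing transfer script from the question.

process_once() {
    for LOG_FILE in "$MONITOR_FOLDER"$LOG_PATTERN; do
        [ -e "$LOG_FILE" ] || continue          # glob matched nothing
        # Move the file out of the monitored folder FIRST, so the next
        # polling cycle cannot see it again.
        mv "$LOG_FILE" "$STAGING_FOLDER" || continue
        sh transfer_file.sh "$STAGING_FOLDER$(basename "$LOG_FILE")" &
    done
    wait    # let the transfers started in this pass finish
}
```

transfer_file.sh would then rsync from the staging directory to the backup folder, and any file still sitting in staging after a crash is easy to detect and retry. An inotify-based watcher (e.g. inotifywait from inotify-tools) would remove the polling loop entirely, but the staging move is still worth keeping so each file is processed exactly once.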