
I have some files on a remote server, and I need to copy those files from the remote server to my local server.

Source path : pABCDWPPP@170.20.100.10:/SRC_path (dummy server name)
Target path : /data/TGT_path

I have tried this:

infa@vm-ppp-50:/data/TGT_path$ sftp -oPort=10000 pABCDWPPP@170.20.100.10:/SRC_path
SSH Server supporting SFTP and SCP
Connected to 170.20.100.10.
Changing to: /SRC_path
sftp> ls -l
-rw-------   1 200      100            17 Dec 04 15:15 test-20191204-1572921093125.csv
-rw-------   1 200      100           592 Dec 02 10:59 test-20191125-1574678916536.csv
-rw-------   1 200      100             9 Dec 04 15:15 t-20191204-1575253807720.csv
-rw-------   1 200      100            15 Dec 04 15:15 test-20191204-1575253807720.csv
-rw-------   1 200      100            17 Dec 04 15:16 test-20191204-1575426603349.csv
sftp> bye

I have to create a batch script that copies those files from the source path to the target path.

The file names can differ, but they follow a pattern that I can use as a variable.

This is the syntax that can be used in the script:

sftp -oPort=10000 ${USERID}@170.20.100.10:${MAILBOXPATH} <<EOF
mget ${FILE}
bye
EOF

Can anyone help me create a script that copies those file(s) in batch mode every day? I have to execute the script on Unix.
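
To make it concrete, this is roughly the shape of script I have in mind (a rough sketch only; test-*.csv is just my guess at the pattern from the listing above, and the paths are the dummy ones from the question):

#!/bin/bash
# copy_files.sh - rough sketch, not tested
USERID="pABCDWPPP"
MAILBOXPATH="/SRC_path"
FILE="test-*.csv"           # guessed pattern based on the listing above
TARGETPATH="/data/TGT_path"

# mget downloads into the current local directory, so move there first
cd "${TARGETPATH}" || exit 1

sftp -oPort=10000 ${USERID}@170.20.100.10:${MAILBOXPATH} <<EOF
mget ${FILE}
bye
EOF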


2 Answers


Schedule a cron job to execute your script; see scheduling-cron-jobs-with-crontab.
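
For example, something along these lines (the script path and the 2am schedule below are only placeholders):

# open your crontab for editing
crontab -e

# add a line like this to run your copy script every day at 2am
0 2 * * * /path/to/copy_files.sh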


Instead of SFTP, I would use rsync, which is well suited to this task:

#!/bin/bash
# backup.sh
# -rltzuv: recurse, keep symlinks and times, compress, skip files newer locally, verbose
# the source in the question uses SSH port 10000, hence -e 'ssh -p 10000'
rsync --modify-window=5 -rltzuv --delete -e 'ssh -p 10000' [USER]@[REMOTE_HOST]:[REMOTE_DIR]/ [LOCAL_DIR]/

Then you can run a cron job as follows:

crontab crontab.sh

where:

# content of crontab.sh (scheduled to run every day at 2am)
0 2 * * * /bin/sh /path/to/backup.sh