
I want to upload the tar file simultaneously to a remote FTP server, but this command doesn't work:

tar cvzf - /backup | openssl aes-256-cbc -salt -k "password" | split -b 100m | curl -u user:password ftp.site.com/backup.tar -T -

tara123

1 Answer


Try walking before you run, by which I mean: understand each individual command before chaining them into a pipeline.

The first problem I see is the use of split: it doesn't produce any output on stdout, because its job is to split its input into files. So it's writing those chunks to your current working directory, and nothing reaches curl. The resulting files need to be handled differently.
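
You can see this for yourself with a tiny chunk size and an arbitrary prefix of "part":

# split consumes stdin and creates partaa, partab, partac; stdout stays empty
printf 'hello' | split -b 2 - part
ls part*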

So your one-line command of:

tar cvzf - /backup | openssl aes-256-cbc -salt -k "password" | split -b 100m | curl -u user:password ftp.site.com/backup.tar -T -

needs to be translated into something with a loop, like this:

# split the encrypted stream into 100 MB files named bkupaa, bkupab, ...
tar cvzf - /backup | openssl aes-256-cbc -salt -k "password" | split -b 100m - bkup

# then upload each chunk as its own file
for file in bkup*
do
    curl -u user:password "ftp.site.com/$file" -T "$file"
done
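
To restore on the other side, the pipeline runs in reverse. A sketch, assuming the chunks have been fetched back into one directory and the same password is used:

# shell glob order (bkupaa, bkupab, ...) matches the order split created them
cat bkup* | openssl aes-256-cbc -d -k "password" | tar xvzf -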
R Perrin
  • thanks! Does it write data to the local HDD? I have an SSD and I don't want that. – tara123 Apr 07 '14 at 21:37
  • Yes, it will write the split files (starting with the specified prefix "bkup") to the local disk. If you don't want to write to the local disk at all, then you cannot use the split program this way; see the sketch after these comments for a possible workaround. – R Perrin Apr 08 '14 at 14:11
  • I'm not really sure what the underlying problem is that you're trying to solve. Do you want smaller files on the remote end? Does the remote FTP server not accept the large single file? Do you have options other than FTP for transferring to the remote host? (Could you just rsync the tree you want to back up, rather than tarring it up?) – R Perrin Apr 08 '14 at 17:18
  • Yes, the remote FTP server does not accept the large single file, and my backup directory is 200 GB+. – tara123 Apr 08 '14 at 17:23
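
If avoiding local disk writes entirely is the goal (per the comments above), GNU coreutils split (8.13 or newer) offers a --filter option that pipes each chunk into a command instead of creating a file; within that command, $FILE holds the name the chunk would have been given (xaa, xab, ...). A sketch along those lines, reusing the placeholder credentials and host from the question:

# each 100 MB chunk is streamed straight to curl; nothing is written locally
tar cvzf - /backup | openssl aes-256-cbc -salt -k "password" \
    | split -b 100m --filter='curl -u user:password -T - "ftp.site.com/backup.$FILE"' -

Each chunk then arrives on the server as backup.xaa, backup.xab, and so on.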