
I have a dedicated server in Germany with 120 GB of data. Since I have bought a new dedicated server in the U.S., I would like to transfer all the files to the new server over an FTP connection. To do this, I'm running the following wget command on the new server:

nohup wget -m --user=myusername --password=mypassword ftp://ftp.mysite.xyz > wget_backup_transfer_log.log 2>&1 &

I would like to know: is there a better way to do this, and is the above command reliable for transferring this much data?

Thanks in Advance...

P.S. Both servers are running CentOS 6.5.


1 Answer


Using tmux or screen would be preferable to nohup: you can always reattach to the session if you lose the connection.
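As a minimal sketch (the session name "transfer" is just an illustrative placeholder):

tmux new -s transfer          # start a named session and run the transfer command inside it
# detach with Ctrl-b d; later, reattach with:
tmux attach -t transfer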

For the file transfer itself I would recommend rsync over SSH. Rsync can resume interrupted transfers, and the traffic is encrypted as well.

Try something like this, run from the new server (rsync cannot copy directly between two remote hosts): rsync -av --partial server1:/my/dir /this/dir
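A slightly fuller invocation might look like the following; the user name, paths, and log file name are placeholders, not taken from the question. Re-running the same command after an interruption resumes where it left off.

# run inside the tmux/screen session on the U.S. server
rsync -avz --partial --progress -e ssh myusername@server1:/path/to/data/ /path/to/data/ > rsync_transfer.log 2>&1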

  • Agreed. As a poor-man's approach ("I don't want to comprehend the `rsync` manual page") a simple `ssh server1 'tar -cf - -C /path/to/my/site/there . | gzip -c9' | gunzip -c | tar -xf - -C /path/to/my/site/here` would work just fine. One could also plug `pv` after `gunzip -c` to get a live view of the data transfer. – kostix Sep 28 '15 at 18:30
  • @iSun, to be more specific, you can use the oft-overlooked feature of SSH which actually provides you with two streams, input and output, connected to a process (or process pipeline) being run remotely. So my incantation makes SSH spawn a tar+gzip combo remotely and connects its output to the input of the gunzip+untar combo running locally -- streaming the compressed data over the network, safe and encrypted. – kostix Sep 28 '15 at 18:33
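Putting the comments together, the pipeline with `pv` added might look like this when run on the new server (a sketch; `server1`, the user name, and both paths are placeholders):

# remote side produces a gzipped tar stream; local side unpacks it; pv shows the decompressed data rate
ssh myusername@server1 'tar -cf - -C /path/to/my/site/there . | gzip -c9' | gunzip -c | pv | tar -xf - -C /path/to/my/site/here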