
Normally I can securely copy files from one machine to another using

> scp -oProxyJump=user@login.node.org user@main.node.org:/home/user/my_files/* .

which is very slow for large data sets.

I was told that the machines I am using have a very fast link that can be accessed with wget. How do I perform the same file transfer using wget instead?

2 Answers


If you have a fast, secure and stable network link between these two machines, you can use a combination of netcat and tar, like this.

On the destination machine, run:

nc -l 10000 | tar -C /destination/directory -xzf -

On the source machine, run:

tar -czf - /source/directory | nc dst-machine-ip-address 10000
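
Since netcat gives no feedback while it runs, you can pipe the stream through pv to watch throughput, assuming pv happens to be installed on the source machine (a sketch, not part of the original recipe):

# same transfer, with pv printing a live byte count and rate to stderr
tar -czf - /source/directory | pv | nc dst-machine-ip-address 10000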

Be aware that no encryption will be used, so the traffic can be sniffed, and if for any reason the connection is dropped, you will have to start the transfer over from the beginning.

Personally, I would stick with rsync.
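
For reference, a resumable rsync invocation might look like this (a sketch, assuming direct ssh access between the two machines; --partial keeps partially transferred files so an interrupted run can pick up where it left off, and --progress shows per-file progress):

rsync -av --partial --progress /source/directory/ user@dst-machine-ip-address:/destination/directory/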

Stefano Martins

The poor performance of scp may be caused by two factors:

  • The cost of encryption, which may be too much for slow processors (see the cipher sketch after this list),
  • The number of round trips needed for each individual file.
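
If the first factor dominates, one common workaround is to request a cheaper cipher. A sketch, assuming your OpenSSH build offers aes128-gcm@openssh.com (check with ssh -Q cipher) and reusing the question's paths with this answer's example hosts:

    # list the ciphers your local ssh supports, then pick a cheap one
    ssh -Q cipher
    scp -c aes128-gcm@openssh.com -o ProxyJump=user@login.example.net user@node.example.net:/home/user/my_files/* .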

For a large number of small files, the second factor dominates. My first idea was to suggest sftp, since it uses a different program on the remote host, but in a test it gave similar performance.

The solution that works is to use a program installed on both the client and the server that sends back a single stream of data (instead of thousands of ssh channels):

  • If you have rsync on the remote server (and install it on the client), use:

    rsync -av -e "ssh -oProxyJump=user@login.example.net" user@node.example.net:/home/user/my_files .
    
  • Otherwise you can use tar:

    ssh -o ProxyJump=user@login.example.net -e none user@node.example.net \
    tar -cf - /home/user/my_files | tar -xf -
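
If the link rather than the CPU is the bottleneck, the same pipe can compress the stream in flight (a sketch; -z assumes gzip is available on both ends, and ssh's own -C flag is an alternative):

    ssh -o ProxyJump=user@login.example.net -e none user@node.example.net \
    tar -czf - /home/user/my_files | tar -xzf -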
    
Piotr P. Karwasz