I'm trying to wget an entire site to transfer from Server A to Server B. It works, but incredibly slowly.
I tested this by uploading a 100 MB PSD file to my server and grabbing it with wget over FTP. Transfer speed was 26.76 MB/sec.
But when I try to grab the whole site (minus the PSD file), it takes 5 minutes to transfer 92 MB of HTML files. It's basically a WordPress install with a theme and images.
Is this because it opens a new connection for each file? I tried doing this with PHP's FTP library, but that took just as long, if not longer.
scp and ssh are not an option because 98% of the time I don't own the server, so I have to work with some sort of FTP. The command I'm using is:

wget -r ftp://user:pass@domain.com/dir