I was using wget to download a file, like this: `wget link/file.zip`. The file.zip should be about 100 MB, but I only received 5552 bytes.

Whatever I downloaded (larger than 5552 B, even from other hosts), I received only 5552 bytes! The HTTP response header Content-Length was 5552 too!
I am using Ubuntu 14.04. Are there any network configurations that could solve this problem?

Thank you very much!

SunnyMarkLiu
  • There are a lot of characters in your URL that are interpreted by the shell (such as the & character). Place your entire URL inside single quotes: `wget 'http://sdlc..'` and try again. – nos Dec 14 '16 at 08:58
  • Thanks, but that doesn't work for me. Actually, I am using Java code to download the file, like this: `URLConnection urlConnection = new URL(this.remoteFileUrl).openConnection(); System.out.println("contentlength=" + urlConnection.getContentLength());` The content length is still 5552! – SunnyMarkLiu Dec 14 '16 at 09:06
  • Then look inside the downloaded file, it might be an HTML page telling you that you forgot to accept the Terms of Use, or a captcha or anything else preventing an automated download. – nos Dec 14 '16 at 09:18
  • I moved to another machine, also running Ubuntu 14.04, and downloaded the same link; `wget` works perfectly there! So the server isn't blocking automated downloads. Thanks a lot! – SunnyMarkLiu Dec 14 '16 at 09:27
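
The two diagnostics suggested in the comments can be sketched as shell commands. This is a minimal illustration, not the asker's actual session: the URL is hypothetical, and the short "download" is simulated with `printf` so the inspection step can be shown without network access.

```shell
#!/bin/sh
# 1. Quote the URL. Unquoted, the shell would interpret '&' as "run in
#    background" and split the command, so wget never sees the full URL.
url='http://example.com/file.zip?token=abc&session=123'   # hypothetical URL
echo "wget '$url'"

# 2. Inspect what was actually downloaded. A short, constant-size result is
#    often an HTML error page (terms-of-use notice, captcha, redirect) rather
#    than the real file. Simulate such a response:
printf '<html><body>error</body></html>' > file.zip       # simulated short download
head -c 15 file.zip   # → <html><body>err
```

If `head` (or `file file.zip`) shows HTML instead of ZIP data, the server returned an error page, and the 5552-byte Content-Length is the size of that page, not of the real file.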

0 Answers