15

I'm using wget to mirror some files from one server to another, with the following command:

wget -x -N -i http://domain.com/filelist.txt

-x = Because I want to keep the directory structure

-N = Timestamping to only get new files

-i = To download a list of files from an external file, one on each line.

Small files, such as the 326 KB one I'm testing with, download just fine.

But another that is 5 GB downloads only 203 MB and then stops (it is always 203 MB, give or take a few kilobytes).

The error message shown is:

Cannot write to ‘path/to/file.zip’

(There were strange characters before and after the path. I am using PuTTY on Windows, so they appear to be wget's curly quote marks mis-rendered by PuTTY's character encoding; I've cleaned them up as regular quotes here. I presume they're not related to the problem.)

The full response is as follows: (I have replaced paths, ip and domain name)

--2012-08-31 12:41:19--  http://domain.com/filelist.txt
Resolving domain.com... MY_IP
Connecting to domain.com|MY_IP|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 161 [text/plain]
Server file no newer than local file ‘domain.com/filelist.txt’ -- not retrieving.

--2012-08-31 12:41:19--  http://domain.com/path/to/file.zip
Connecting to domain.com|MY_IP|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 5502192869 (5.1G) [application/zip]
The sizes do not match (local 213004288) -- retrieving.

--2012-08-31 12:41:19--  http://domain.com/path/to/file.zip
Connecting to domain.com|MY_IP|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 5502192869 (5.1G) [application/zip]
Saving to: ‘domain.com/path/to/file.zip’

 3% [====>                                    ] 213,003,412  8.74M/s   in 24s

Cannot write to ‘domain.com/path/to/file.zip’

It doesn't seem to make any difference whether the target directory already exists or is created on the fly.

Does anyone have any idea why it is stopping and how I can fix it?

Any help would be most appreciated.

EDIT: I have also tried a plain wget with no input file, renaming the output file. This time it downloads a little over 3 GB and then gives the same cannot-write error:

wget -x -N http://domain.com/path/to/file.zip -O files/bigfile.zip
John Mellor
  • Do you have any special characters in your path? – JMeterX Aug 31 '12 at 12:21
  • Does it work as expected if you type "cd /tmp && " before the command? –  Aug 31 '12 at 12:24
  • Is your disk full? – Jenny D Aug 31 '12 at 12:26
  • The disk is definitely not full and there are no special characters. The length of the path is 87 characters, and some Googling has shown problems with long names (the file name is only 29 characters, though). It fails in just the same way in /tmp. – John Mellor Aug 31 '12 at 12:34
  • @FreezeDriedPop Since your file name is relatively long you could change using the `-O` option so `wget -O test.zip http://link` – JMeterX Aug 31 '12 at 12:37
  • I have tried doing this and without the file input and it downloads around 3gb, but gives the same "cannot write" error. – John Mellor Aug 31 '12 at 12:49
  • @FreezeDriedPop What filesystem are you transferring to? – JMeterX Aug 31 '12 at 14:30
  • It looks like there are UTF-8 characters in the filename. This probably won't work on many filesystems. – meawoppl May 18 '15 at 03:35
  • This for me was a red herring. My wget client on my Mac was (apparently) normalizing the URL to end with a '/', whereas the one on the virt I was working with apparently didn't. – Christian Bongiorno Feb 09 '19 at 01:14

9 Answers

9

You will get this error if you are out of disk space. Run df and you will see whether the filesystem you're writing to is at 100%.
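
For example (a minimal sketch; the path and the output shown are illustrative):

df -h domain.com/path/to/
# Filesystem      Size  Used Avail Use% Mounted on
# /dev/sda1        20G   20G     0 100% /

If Use% is 100% on the target mount, free some space or download to a filesystem with room for the full 5.1 GB file.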

Sun
5

It is a problem with a long URL. I faced it too, so I used bit.ly and shortened the URL. Works like a charm!

Namchester
  • Are you sure? It seems unlikely that the download will start and break off at some point when the issue is related to the URL, which is used only at the very start of the transaction. – Felix Frank Jun 21 '14 at 12:57
  • Yes. I had the same problem. Try it. – Namchester Jun 22 '14 at 10:42
  • I think the problem is Linux not being able to recognize the long URL. – Namchester Jun 24 '14 at 18:59
  • I suppose you mean the shell? Because the kernel is most definitely innocent of this failure. Even for the shell it's not likely, but if it was the case, again, the download could not even start - the shell would error out before even forking the prospective `wget` process. – Felix Frank Jun 24 '14 at 20:58
  • For me it was the query string on the URL – `wget http://dltr.org/skin/frontend/lowes/default/css/custom.css?001` – Damodar Bashyal Feb 17 '17 at 05:10
  • Felix, sorry about that, you are correct. – Namchester Feb 21 '17 at 10:44
  • @DamodarBashyal the url works for me, check again? – Namchester Feb 21 '17 at 10:44
  • @Namchester It was not working on Centos7. After reading your original answer, I removed query string and it worked. so, up voted you :) FYI, I am using vagrant's Centos7 image. – Damodar Bashyal Feb 21 '17 at 22:54
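
As the comment thread above suggests, the culprit is often the query string or other special characters rather than the URL's length as such. A sketch of the workaround from the comment, dropping the query string so wget writes a plain local file name:

wget http://dltr.org/skin/frontend/lowes/default/css/custom.css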
2

I was getting the same error on this command:

sudo wget -O - https://nightly.odoo.com/odoo.key | apt-key add -

The problem was that sudo applied only to wget, not to the second command after the pipe. I solved it with:

sudo su
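
An alternative sketch that avoids a full root shell is to put sudo on the command that actually needs it (note that apt-key is deprecated on current Debian/Ubuntu releases):

wget -O - https://nightly.odoo.com/odoo.key | sudo apt-key add -

Here wget runs as the normal user and only the key import runs as root.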
Bheid
1

First try:

cd ~

to get into your home directory, where you are sure to have write permission, before you download with the wget command, as in the sketch below.
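
A minimal sketch using the command from the question:

cd ~ && wget -x -N -i http://domain.com/filelist.txt

Starting from your home directory rules out a current working directory you cannot write to.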

stu
1

I was doing something similar to:

wget -x -N -i http://domain.com/filelist.txt

I was receiving:

--2016-12-09 07:44:23--  https://www.example.com/dir/details?abc=123&def=456
Resolving www.example.com (www.example.com)... 1.2.3.4
Connecting to www.example.com (www.example.com)|1.2.3.4|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
details?abc=123&def=456: No such file or directory

Cannot write to ‘details?abc=123&def=456’ (Success).

In my equivalent filelist.txt file I had a URL like:

https://www.example.com/dir/details?abc=123&def=456

So to debug I tried creating the same file wget was trying to create:

touch "details?abc=123&def=456"
touch: cannot touch ‘details?abc=123&def=456’: No such file or directory

Voila! It looks like the ? was the problem, but good practice would be to remove all special characters from file names; imagine what the & would do if not escaped.
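
One way around it (a sketch; the local file name is arbitrary) is to quote the URL so the shell passes it through intact, and use -O to choose a local name without special characters:

wget "https://www.example.com/dir/details?abc=123&def=456" -O details_abc123_def456.html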

Sooth
1

I just added a - (which tells tar to read the archive from standard input) to the tar command after the pipe from wget.

I had

wget https://example.com/path/to/file.tar.gz -O -|tar -xzf -C /path/to/file

then changed it to

wget https://example.com/path/to/file.tar.gz -O - | tar -xzvf - -C /path/to/file
Shadi
0

I also got this error when accidentally trying to wget a file into a folder in Mac OS Trash. Moving the folder out fixed it, since Trash is read-only.

fizzybear
0

If it starts saving a large file and writes 203 MB of it, I would suspect that either you have a full filesystem on the receiving end or the network connection is timing out.

You can use df -h on the receiving server to see if the filesystem is full.

Check out this answer for timeout issues with wget:

https://stackoverflow.com/questions/2291524/does-wget-timeout

Also, re-try the transfer that failed and omit the -N timestamp option.

Also, run ulimit -a to see if there is a file size limit on the receiving server.
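
A quick diagnostic pass along those lines (a sketch; the download path is a placeholder):

df -h /path/to/download/dir      # is the target filesystem full?
ulimit -a | grep 'file size'     # should report "unlimited"
wget --timeout=60 -x http://domain.com/path/to/file.zip    # retry without -N

--timeout sets wget's network timeouts in seconds; raise it if the connection is flaky.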

DisgruntledUser
  • I'm no expert but I think it is running CentOS 6. Also I'm not sure how to check the character representation. It does start downloading, and it downloads other smaller files just fine, so I don't think that sounds like the problem. – John Mellor Aug 31 '12 at 12:48
  • Do the files it succeeds in downloading have ANY funny characters in their names? – DisgruntledUser Aug 31 '12 at 13:17
  • No, no strange characters at all, unless a dot is a strange character, e.g. "megapack_4.11.zip". But again, I've tried it just with the name "bigfile.zip" and the same problem occurs. – John Mellor Aug 31 '12 at 13:24
  • Perhaps it is only Putty that is set to a different char representation than UTF-8 – DisgruntledUser Aug 31 '12 at 13:27
  • Yeah, I really don't think that's the problem; I just mentioned it as I was copying and pasting from PuTTY. The real problem is the "cannot write to" error. – John Mellor Aug 31 '12 at 13:37
  • Also make sure each server is set to the right time. I suggest putting both servers on the UTC timezone and running NTP. You might want to re-try the failed download to a different location without the -N timestamp option. – DisgruntledUser Aug 31 '12 at 13:43
0

Finally figured it out: it was a space issue. It's a problem with 1and1 Cloud Servers; more about it here: http://www.mojowill.com/geek/1and1-dynamic-cloud-server-disk-allocation/

John Mellor