
I have a text file that contains a list of URLs for files I want to download.

For example:

http://domain.com/file1.zip
http://domain.com/file2.zip
http://domain.com/file3.zip
http://domain.com/file4.zip

...etc

How can I batch-download all of these files into a folder automatically from the Linux shell?

Thanks for the help.

– EzzDev

4 Answers


From man wget:

You have a file that contains the URLs you want to download? Use the -i switch:

wget -i <file>
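
Since the question also asks for the files to land in a particular folder, -i combines with wget's -P (--directory-prefix) option; a minimal sketch, assuming the list is saved as urls.txt and the target folder is downloads/:

wget -i urls.txt -P downloads/
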

– James Polley

for i in `cat /file/list`
do
    wget $i
done

(Those are backticks around "cat /file/list", on the same key as the tilde.)
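
A while-read loop is a common alternative that avoids relying on word splitting of the cat output and skips blank lines; a rough sketch, assuming the same /file/list path as above:

while read -r url
do
    # download each non-empty line from the list
    [ -n "$url" ] && wget "$url"
done < /file/list
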

– aspitzer

You could also use xargs:

$ cat /path/to/list | xargs -n1 wget 

Or, using seq to download file1.zip to file10.zip:

$ seq 1 10 | xargs -I{} wget http://domain.com/file{}.zip

[edit] Or, as another poster pointed out: $ wget domain.com/file{1..10}.zip

which is nicer than the seq method, given that certain OSes (Mac OS X, Solaris) don't ship seq by default.
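
As a further variation, GNU xargs can run several downloads at once via -P; a hedged sketch reusing the /path/to/list placeholder from above, with up to four wget processes in parallel:

$ xargs -n1 -P4 wget < /path/to/list
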

– user30579

This command downloads file1.zip to file10.zip:

for i in $(seq 1 10);  do wget -nv http://domain.com/file${i}.zip ; done
– SamK
  • If they were sequential, you could just do wget http://domain.com/file{1..10}.zip. curl also has good support for file ranges, but I always forget the syntax. – Justin Jan 02 '10 at 19:23
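
For reference, the curl range syntax mentioned in the comment uses URL globbing; a rough sketch (behaviour can vary slightly between curl versions), where -O saves each file under its remote name:

curl -O "http://domain.com/file[1-10].zip"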