I have a file that contains all the URLs I need to download. However, I need to limit it to one download at a time, i.e. the next download should begin only once the previous one has finished. Is this possible using curl? Or should I use something else?
Hello and welcome to Server Fault. When asking questions on this site, please always remember that we aren't in your place and cannot guess what environment you're using. In this case, you didn't specify what OS you're running, which will make answering you properly hard. – Stephane Sep 20 '13 at 09:18
4 Answers
xargs -n 1 curl -O < your_files.txt
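Here xargs takes one argument (-n 1) from the file at a time and runs a separate curl process for it, waiting for each to finish before starting the next, so the downloads are strictly sequential. A minimal sketch, assuming your_files.txt holds one URL per line (the file name is just an example):

# your_files.txt contains, e.g.:
#   http://example.com/file1.zip
#   http://example.com/file2.zip
xargs -n 1 curl -O < your_files.txt    # curl -O saves each response under its remote file name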

Grumdrig
This is the best answer. Although the asker didn't specify, it's probably safe to assume the responses for all the URLs should be written to individual files. Use the `-O` option with cURL to do that. `xargs -n 1 curl -O < your_file.txt` – Mr. Lance E Sloan Jun 01 '16 at 15:51
wget(1) works sequentially by default, and has this option built in:
-i file
--input-file=file
    Read URLs from a local or external file. If - is specified as file, URLs are read from the standard input. (Use ./- to read from a file literally named -.)

    If this function is used, no URLs need be present on the command line. If there are URLs both on the command line and in an input file, those on the command lines will be the first ones to be retrieved. If --force-html is not specified, then file should consist of a series of URLs, one per line.

    However, if you specify --force-html, the document will be regarded as html. In that case you may have problems with relative links, which you can solve either by adding "<base href="url">" to the documents or by specifying --base=url on the command line.

    If the file is an external one, the document will be automatically treated as html if the Content-Type matches text/html. Furthermore, the file's location will be implicitly used as base href if none was specified.
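For this question, that makes the whole job a single command; a sketch assuming the URL list is in a file named urls.txt (the file name is an assumption):

wget -i urls.txt    # fetches each URL from the file, one after another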

dawud
Since the asker wanted to know how to do this using cURL, you should at least include a solution that attempts to use it. – Mr. Lance E Sloan Jun 01 '16 at 15:48
This is possible using curl within a shell script, something like the following, but you'll need to research the appropriate curl options for yourself:

while read -r URL; do
    curl -O "$URL"                 # substitute whatever curl options you need
    if [ $? -ne 0 ]; then          # check the exit status if required
        echo "failed: $URL" >&2    # ... and take appropriate action
    fi
done < file_containing_urls

user9517
I understand this is half pseudocode but I think that while loop should still have a "do". – nwk Sep 19 '14 at 08:55
What if a URL contains ampersands? Will they be escaped? Without escaping, the shell will think that the command should be run in the background. – Jagger Dec 15 '14 at 11:34
Based on @iain's answer, but using proper shell scripting:
while read url; do
    echo "== $url =="
    curl -sL -O "$url"
done < list_of_urls.txt
It will also work with weird characters like ampersands, etc. You can replace the -O with a redirect into a file instead, or whatever is suitable.
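If the list might contain leading whitespace or backslashes, a slightly more defensive variant of the same loop (IFS= and read -r are standard shell safeguards, not something the original answer requires):

while IFS= read -r url; do
    echo "== $url =="
    curl -sL -O "$url"
done < list_of_urls.txt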

Evgeny