This works for me:
$ xargs -n 1 curl -O < urls.txt
I'm on FreeBSD; your xargs may work differently.
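To see how this dispatches one curl invocation per line without touching the network, you can substitute echo for the real command (the URLs and the urls.txt contents below are made up for illustration):

```shell
# Hypothetical URL list (contents made up for illustration).
printf '%s\n' \
  'https://example.com/a.txt' \
  'https://example.com/b.txt' > urls.txt

# -n 1 passes one line per invocation; echo prints the curl command
# that would run, instead of executing it.
xargs -n 1 echo curl -O < urls.txt
```

Dropping the echo runs the real downloads, one curl process per URL.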
Note that this runs sequential curls, which you may view as unnecessarily heavy. If you'd like to save some of that overhead, the following may work in bash:
$ mapfile -t urls < urls.txt
$ curl ${urls[@]/#/-O }
This saves your URL list to an array, then expands the array with options to curl to cause targets to be downloaded. The curl command can take multiple URLs and fetch all of them, recycling the existing connection (HTTP/1.1), but it needs the -O option before each one in order to download and save each target. Note that characters within some URLs may need to be escaped to avoid interacting with your shell.
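To see exactly what that expansion hands to curl, substitute echo for the real command (hypothetical URLs; wrapped in bash -c since mapfile is a bashism):

```shell
bash -c '
  # Hypothetical two-line URL list.
  printf "%s\n" "https://example.com/a.txt" "https://example.com/b.txt" > urls.txt
  mapfile -t urls < urls.txt
  # ${urls[@]/#/-O } prepends "-O " to each element; the unquoted
  # expansion then word-splits into alternating -O/URL arguments.
  echo curl ${urls[@]/#/-O }
'
```

The word splitting here is deliberate: quoting the expansion would pass each "-O url" pair as a single argument, which curl would reject.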
Or if you are using a POSIX shell rather than bash:
$ curl $(printf ' -O %s' $(cat urls.txt))
This relies on printf's behaviour of repeating the format pattern to exhaust the list of data arguments; not all stand-alone printfs will do this. If yours has problems, you might use another tool:
$ curl $(sed 's/^/-O /' < urls.txt)
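As a quick sanity check, both forms generate the same alternating -O/URL arguments; here is a comparison on a made-up two-line list:

```shell
# Hypothetical URL list.
printf '%s\n' 'https://example.com/a.txt' 'https://example.com/b.txt' > urls.txt

# printf recycles its format string until the arguments are exhausted...
printf ' -O %s' $(cat urls.txt); echo

# ...while sed prefixes each line; either way curl sees -O/URL pairs
# after the shell word-splits the command substitution.
sed 's/^/-O /' < urls.txt
```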
Note that these non-xargs methods may also bump up against system limits for very large lists of URLs. Research ARG_MAX and MAX_ARG_STRLEN if this is a concern.
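If you want to check the limit on your system, getconf reports it (POSIX; the value varies by OS):

```shell
# ARG_MAX is the maximum combined size of the argument list and
# environment that exec() accepts; an expanded URL list passed to a
# single curl invocation must fit under it.
getconf ARG_MAX
```

xargs sidesteps this by batching arguments to stay under the limit automatically.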