I want to download some pages from a website, and I did it successfully using `curl`. But I was wondering: can `curl` download multiple pages at a time, the way most download managers do? That would speed things up a bit. Is this possible with the `curl` command-line utility?
The current command I am using is

    curl 'http://www...../?page=[1-10]' 2>&1 > 1.html

Here I am downloading pages 1 to 10 and storing them all in a single file named `1.html`.
Also, is it possible for `curl` to write the output of each URL to a separate file, say `URL.html`, where `URL` is the actual URL of the page being processed?
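To make the second part concrete, here is a sketch of the kind of per-URL output I have in mind, using `curl`'s output variable `#1`, which expands to the current value of the `[1-2]` glob. It runs against local `file://` URLs as stand-ins, since my real URLs are elided above; for the actual HTTP download I imagine also adding `curl`'s `--parallel` flag (available since curl 7.66.0), though I'm not sure that's the right approach:

```shell
# Create two stand-in "pages" locally (my real URLs are HTTP).
dir=$(mktemp -d)
echo 'page one' > "$dir/p1.txt"
echo 'page two' > "$dir/p2.txt"

# '#1' in -o expands to the current value of the [1-2] glob,
# so each URL's output lands in its own file.
curl -s "file://$dir/p[1-2].txt" -o "$dir/out_#1.html"

cat "$dir/out_1.html"   # page one
cat "$dir/out_2.html"   # page two

# For the real site, my (untested) guess would be something like:
#   curl --parallel 'http://www...../?page=[1-10]' -o '#1.html'
```

If the numbered files are enough, `-o '#1.html'` alone seems to cover the "separate file per URL" part; I don't know whether the full URL can be used as the filename directly.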