
I've learned that we can send multiple HTTP requests with CURL by doing this:

curl -I http://linuxbyexample.co.nr http://lne.blogdns.com/lbe

or this:

xargs curl -I < url-list.txt

How can we save all of the responses we get, each one to a different file?

quanta
mrjames

2 Answers


You can use the -o command line option to write the output to a file instead of stdout. You can use multiple -o options, e.g.

curl -I http://linuxbyexample.co.nr -o lbe.co.nr.txt http://lne.blogdns.com/lbe -o lne.txt

If you format your urls-list.txt like so:

http://serverfault.com -o serverfault.com.txt
http://example.com -o example.com.txt

then the xargs invocation from your question should work as you want, as sketched below.
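
For example, here is a minimal sketch assuming urls-list.txt contains exactly the two lines above; for a list this short, xargs passes everything to a single curl invocation, so each URL is paired with its own -o file:

$ xargs curl -I < urls-list.txt
$ ls
example.com.txt  serverfault.com.txt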

user9517
$ cat urls-list.txt 
http://linuxbyexample.co.nr 
http://lne.blogdns.com/lbe

$ while read u; do \
    curl -I $u -o $(echo $u | sed 's/http:\/\///' | tr '/' '_').header; \
done < urls-list.txt

$ cat linuxbyexample.co.nr.header 
HTTP/1.1 200 OK
Date: Thu, 24 Nov 2011 03:15:19 GMT
Server: LiteSpeed
Connection: close
X-Powered-By: PHP/5.2.10
Content-Type: text/html
X-Powered-By: PleskLin
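
A slightly more defensive variant of the same loop (just a sketch, assuming the same urls-list.txt) quotes the URL, skips blank lines, and silences curl's progress meter; the filename derivation is unchanged:

$ while read -r u; do
    [ -z "$u" ] && continue                           # skip blank lines
    f=$(echo "$u" | sed 's|^http://||' | tr '/' '_')  # strip the scheme, flatten slashes
    curl -sI "$u" -o "$f.header"
  done < urls-list.txt

$ ls *.header
linuxbyexample.co.nr.header  lne.blogdns.com_lbe.header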
quanta