
I'm trying to make a program that can fetch multiple URLs at once. I'm using this libcurl + libuv example code: http://curl.haxx.se/libcurl/c/multi-uv.html

When I compile it and pass the program a few URLs such as

./curl_fetch google.com yahoo.com facebook.com

it works fine and I get results instantly. However, when I pass more arguments, for instance 100 URLs, nothing is returned at all for several minutes. Is there a reason it locks up when fetching many pages in parallel?

cwings

1 Answer


For transfers to be truly parallel you need a libcurl that was built to support asynchronous name resolves, i.e. built with the threaded resolver or with c-ares. The stock resolver is synchronous, so a slow DNS resolve will block all the simultaneous transfers, and if you add hundreds of transfers the chances are that a few of them will have slow name resolves.

Daniel Stenberg