
One logical part of my C++ program has to fetch text files by invoking curl (the Linux command-line program, not the library) and reading its output through a pipe. There may be many URLs to download, and the download time may be considerable. The program is meant to be "scalable" and efficient.

So the question is: which variant is preferable?

  1. Run a single instance of curl, giving it a list of URLs.
  2. Create a pool of threads, each running a curl process for a single URL, then aggregate the thread outputs.

In other words, is curl asynchronous when downloading several URLs, or do I have to implement the concurrency manually?

DimG

1 Answer


If you are calling cURL through pipes, then the preferred solution would be to create a pool of threads. Since the threads are independent of each other, they can fetch multiple URLs at once.
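Here is a minimal sketch of that approach, assuming a POSIX system with the curl binary on the PATH (compile with -pthread). The URLs, the fetch_with_curl helper, and the thread-per-URL layout are illustrative only; a real pool would cap the number of concurrent workers:

    // Thread-per-URL sketch: each worker runs `curl -s <url>` via popen()
    // and collects its stdout; results are aggregated after all threads join.
    #include <array>
    #include <cstdio>
    #include <memory>
    #include <string>
    #include <thread>
    #include <vector>

    // Run `curl -s <url>` and return everything it writes to stdout.
    std::string fetch_with_curl(const std::string& url) {
        std::string cmd = "curl -s '" + url + "'";  // assumes the URL contains no single quotes
        std::unique_ptr<FILE, decltype(&pclose)> pipe(popen(cmd.c_str(), "r"), pclose);
        std::string out;
        if (!pipe) return out;
        std::array<char, 4096> buf;
        size_t n;
        while ((n = fread(buf.data(), 1, buf.size(), pipe.get())) > 0)
            out.append(buf.data(), n);
        return out;
    }

    int main() {
        std::vector<std::string> urls = {
            "https://example.com/a.txt",  // placeholder URLs
            "https://example.com/b.txt",
        };
        std::vector<std::string> results(urls.size());
        std::vector<std::thread> workers;
        for (size_t i = 0; i < urls.size(); ++i)
            workers.emplace_back([&, i] { results[i] = fetch_with_curl(urls[i]); });
        for (auto& t : workers) t.join();
        // results[i] now holds the body downloaded from urls[i]
    }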

However, a more efficient solution would be to use the cURL library (libcurl) directly instead of spawning processes and reading their pipes. The libcurl documentation ships with example programs showing how to do this.
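For reference, here is a minimal sketch using libcurl's multi interface, which drives several transfers concurrently from a single thread. The URLs and the write_cb callback name are placeholders, and error checking is omitted; link with -lcurl:

    // Single-threaded sketch using the libcurl multi interface: all transfers
    // run concurrently, each response body appended to its own string.
    #include <curl/curl.h>
    #include <string>
    #include <vector>

    // libcurl hands body data to this callback; append it to the caller's string.
    static size_t write_cb(char* data, size_t size, size_t nmemb, void* userp) {
        static_cast<std::string*>(userp)->append(data, size * nmemb);
        return size * nmemb;
    }

    int main() {
        std::vector<std::string> urls = {
            "https://example.com/a.txt",  // placeholder URLs
            "https://example.com/b.txt",
        };

        curl_global_init(CURL_GLOBAL_DEFAULT);
        CURLM* multi = curl_multi_init();

        std::vector<std::string> bodies(urls.size());
        std::vector<CURL*> handles;
        for (size_t i = 0; i < urls.size(); ++i) {
            CURL* easy = curl_easy_init();
            curl_easy_setopt(easy, CURLOPT_URL, urls[i].c_str());
            curl_easy_setopt(easy, CURLOPT_WRITEFUNCTION, write_cb);
            curl_easy_setopt(easy, CURLOPT_WRITEDATA, &bodies[i]);
            curl_multi_add_handle(multi, easy);
            handles.push_back(easy);
        }

        // Drive all transfers until none are still running.
        int still_running = 0;
        curl_multi_perform(multi, &still_running);
        while (still_running > 0) {
            curl_multi_wait(multi, nullptr, 0, 1000, nullptr);  // block until activity or timeout
            curl_multi_perform(multi, &still_running);
        }

        for (CURL* easy : handles) {
            curl_multi_remove_handle(multi, easy);
            curl_easy_cleanup(easy);
        }
        curl_multi_cleanup(multi);
        curl_global_cleanup();
        // bodies[i] now holds the content downloaded from urls[i]
    }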