I'm using cURL to get rank data for over 20,000 domain names that I've got stored in a database.
The class I'm using is from http://semlabs.co.uk/journal/object-oriented-curl-class-with-multi-threading.
The array $competeRequests holds 20,000 request URLs for the compete.com API, which returns website ranks.
This is an example request: http://apps.compete.com/sites/stackoverflow.com/trended/rank/?apikey=xxxx&start_date=201207&end_date=201208&jsonp=
Since there are 20,000 of these requests I want to break them up into chunks so I'm using the following code to accomplish that:
foreach (array_chunk($competeRequests, 1000) as $requests) {
    foreach ($requests as $request) {
        $curl->addSession($request, $opts);
    }
}
This works great for sending the requests in batches of 1,000; however, the script takes too long to execute. I've already increased max_execution_time to over 10 minutes.
Is there a way to send 1,000 requests from my array, parse the results, output a status update, and then continue with the next 1,000 until the array is empty? As it stands, the screen just stays white for the entire time the script is executing, which can be over 10 minutes.
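For reference, here is a minimal sketch of the chunk-then-report pattern using PHP's built-in curl_multi API directly rather than the SEM Labs class. It assumes $competeRequests holds the URLs, and the parseResult() call is a hypothetical stand-in for whatever parsing/database update you already do per response. The key to seeing progress is flushing output to the browser after each chunk completes.

```php
<?php
$chunks = array_chunk($competeRequests, 1000);
$total  = count($chunks);

foreach ($chunks as $i => $urls) {
    $mh      = curl_multi_init();
    $handles = array();

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    // Drive this chunk's transfers to completion before moving on.
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh); // block until there is activity instead of busy-waiting
    } while ($running > 0);

    foreach ($handles as $ch) {
        $body = curl_multi_getcontent($ch);
        // parseResult($body); // hypothetical: your existing parsing/DB update
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    // Push a status line to the browser immediately so the page
    // isn't blank for the whole run.
    echo 'Finished chunk ' . ($i + 1) . ' of ' . $total . "<br>\n";
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
}
```

Note that flush() only helps if nothing between PHP and the browser (output buffering, gzip compression, a reverse proxy) is holding the output back; you may also need to disable zlib.output_compression for the status lines to appear incrementally.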