I have a PHP script that uses multi-curl.
A typical multi-curl batch consists of 50 individual curl handles. Each request takes some time to return, so my script simply has to wait until processing is finished on the remote server.
Each of those 50 requests returns quite a lot of data (content) which I don't need, so it would be more efficient to ignore the returned data. However, I do need to know when processing on the remote server is finished, which is when data is returned.
The reason I don't need the data, but do need to make the request, is that the remote server puts data in a database, which I subsequently fetch from my own server. So essentially I only need to trigger this request and know when the script on the remote server has finished.
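To illustrate the "ignore the returned data" idea: a minimal sketch using CURLOPT_WRITEFUNCTION to throw each chunk away as it arrives, instead of buffering the whole body with CURLOPT_RETURNTRANSFER (the URL and POST fields below are placeholders standing in for the real values):

```php
<?php
// The write callback receives each chunk of the response and must return
// the number of bytes it handled; returning the chunk length tells curl
// to keep going while nothing is stored in memory.
$discard = function ($ch, $chunk) {
    return strlen($chunk); // acknowledge the bytes, keep nothing
};

$ch = curl_init('http://www.domain.com/script'); // placeholder URL
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'variable1=a&variable2=b');
// With a write callback set, CURLOPT_RETURNTRANSFER is not needed.
curl_setopt($ch, CURLOPT_WRITEFUNCTION, $discard);
```

The transfer still completes normally, so curl still reports when the remote script has finished; the response body just never accumulates.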
My question: this request consumes a lot of CPU. How can I make it more efficient?
The code:
$nrofrequests = count($variable1);

// Build multi-curl to scrape all sites at once:
for ($i = 0; $i < $nrofrequests; $i++) {
    $post = 'variable1='.$variable1[$i].'&variable2='.$variable2[$i];
    $url  = 'http://www.domain.com/'.$scriptnumber[$i];

    $ch[$i] = curl_init($url);
    curl_setopt($ch[$i], CURLOPT_POST, 1);
    curl_setopt($ch[$i], CURLOPT_POSTFIELDS, $post);
    curl_setopt($ch[$i], CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch[$i], CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch[$i], CURLOPT_TIMEOUT, 35);
    set_time_limit(35);
}

// Build the multi-curl handle, adding every $ch
$mh = curl_multi_init();
for ($i = 0; $i < $nrofrequests; $i++) {
    curl_multi_add_handle($mh, $ch[$i]);
}

// Execute all queries simultaneously, and continue when all are complete
$running = null;
do {
    curl_multi_exec($mh, $running);
} while ($running);

for ($i = 0; $i < $nrofrequests; $i++) {
    curl_multi_remove_handle($mh, $ch[$i]);
}
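One suspicion about the CPU usage: the bare do/while around curl_multi_exec() spins at full speed while the transfers are in flight. A sketch of a variant that blocks on curl_multi_select() between iterations, written as a helper function (drain_multi is a name I made up, not from the original code):

```php
<?php
// Run a multi-curl handle to completion without busy-polling:
// curl_multi_select() sleeps until there is socket activity (or the
// timeout expires), so the loop wakes only when curl has work to do.
function drain_multi($mh): int
{
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            // Wait up to 1 second for activity instead of spinning.
            curl_multi_select($mh, 1.0);
        }
    } while ($running && $status === CURLM_OK);

    return $status;
}
```

The same curl_multi_remove_handle() cleanup would follow once drain_multi($mh) returns.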