
I have used a multi-curl library for PHP that facilitates fetching multiple pages in parallel (basically an easy-to-use API).

My scenario: fetch user data from an API, process it, and store the results. All the users whose data has to be fetched are placed in a queue. The whole fetching, processing and storing of results takes almost 8-10 minutes, and it is really costly if I process it synchronously. So I have used the PHP curl library for multi-threading. It works fine if I run it in the browser, but since it is a cron job I have to run the same script from the command line. When I do so, it does not work. Can anybody help me? Thanks in advance.

Pseudo code:

$query = " Fetch users based on certain criteria LIMIT 200" ; 
$result = execute-query ;

$curl_handle = curl_multi_init();
$i = 0;
$curl = array();

while ($row = mysql_fetch_assoc($result)) {    
    $curl[$i] = add_handle($curl_handle, API_CALL);
}

exec_handle($curl_handle);

for ($j = 0; $j < count($curl); $j++)//remove the handles
    curl_multi_remove_handle($curl_handle, $curl[$i]);

curl_multi_close($curl_handle);

// Reference url http://codestips.com/php-multithreading-using-curl/
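
For reference, here is a minimal sketch of the same pattern written against the standard curl_multi_* functions only, with no wrapper library, so it can be run unchanged from the command line. The endpoint URL and the $userIds array are placeholder assumptions; in the real script they would come from the database query above.

<?php
// Sketch only: $userIds stands in for the rows returned by the LIMIT 200 query.
$userIds = [1, 2, 3];

$mh = curl_multi_init();
$handles = [];

foreach ($userIds as $id) {
    $ch = curl_init('https://api.example.com/users/' . $id); // hypothetical API URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);           // return the body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_multi_add_handle($mh, $ch);
    $handles[$id] = $ch;
}

// Drive all handles until every transfer has finished.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for network activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);

// Collect responses, then detach and free each handle.
foreach ($handles as $id => $ch) {
    $response = curl_multi_getcontent($ch);
    // ... process and store $response for user $id ...
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}

curl_multi_close($mh);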

