
I have an API that I call at least 10 times concurrently, each time with different information.

This is the code I am currently using.

$mh = curl_multi_init();
$arr = array();
$rows = array();

while ($row = mysqli_fetch_array($query)) {
    array_push($arr, initiate_curl($row, $mh));
    array_push($rows, $row);
}
$running = null;
for(;;){
    curl_multi_exec($mh, $running);
    if(!$running){
        break;
    }
    curl_multi_select($mh);
    usleep(1);
}

sleep(1);

foreach($arr as $curl) {curl_multi_remove_handle($mh, $curl);}
curl_multi_close($mh);
foreach($arr as $key=>$curl) {
    $result = curl_multi_getcontent($curl);
    $dat = simplexml_load_string($result);
    check_time($dat, $rows[$key], $fp);
}

It works fine when the number of requests is small, but when it grows, some of the curl handles do not bring back the appropriate data, i.e. they return null. I am guessing it is because the script moves on before those requests have finished.

What can I do to make this work? I am inexperienced with PHP and servers, and I am having a hard time going through the documentation.

If I create another PHP file that curls the API and does the work with the data, and then multi_curl that PHP file instead, would it work better? (In that case it would matter less that some of the calls do not return data.) Or would that constantly overload my server?

  • I think it's not a curl problem, it's a server problem. Is your server capable of handling a large number of requests at a time? Check your server load during the requests and figure out what's wrong. If the server can't keep up, use throttling (a rough sketch follows this comment thread). – Shanteshwar Inde Dec 07 '18 at 04:38
  • I have a cheap server from Hostinger with 1 unit of processing power and a 10 GB SSD. Do you think I should upgrade it? – Seong Min Choo Dec 07 '18 at 05:34
  • First analyse the problem: check the server load during a burst of requests. If it maxes out, upgrade the server; otherwise this is some other issue and upgrading is not the solution. – Shanteshwar Inde Dec 07 '18 at 05:39
  • I do not know much about servers, so could you please explain whether this is bad? My CPU usage is usually around 10~13%, but it occasionally reaches 100%, and my process count constantly sits at 20 (I'm not sure whether that is the limit, but it is shown in red, so probably). – Seong Min Choo Dec 07 '18 at 05:46
  • Yes, then this is a server max-out problem. But when I look at your code I see a for(;;) [infinite loop]; is this necessary? – Shanteshwar Inde Dec 07 '18 at 06:22
  • Well, it basically runs until there are no more curl handles left to execute in the multi handle. – Seong Min Choo Dec 07 '18 at 06:45
  • Remove the usleep in the download loop; that just makes your download slower, it does not reduce CPU usage (and the CPU usage is ~0% during the download loop anyway). The sketch below drops it. – hanshenrik Dec 09 '18 at 16:03
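
A rough sketch of the throttling idea from the comments, under a few assumptions: the query results have already been fetched into $rows, initiate_curl() works as in the question (it creates an easy handle, adds it to the given multi handle, and returns it), and the batch size of 10 is just a placeholder to tune. It also drops the usleep(), as the last comment suggests, and lets curl_multi_select() do the waiting:

$batchSize = 10; // assumed limit; tune to what the API and your server tolerate

foreach (array_chunk($rows, $batchSize, true) as $batch) {
    $mh = curl_multi_init();
    $handles = array();
    foreach ($batch as $key => $row) {
        $handles[$key] = initiate_curl($row, $mh); // same helper as in the question
    }

    // Drive this batch to completion; curl_multi_select() blocks until
    // there is network activity, so no usleep() is needed.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);
        }
    } while ($running && $status === CURLM_OK);

    foreach ($handles as $key => $ch) {
        $dat = simplexml_load_string(curl_multi_getcontent($ch));
        check_time($dat, $rows[$key], $fp); // same post-processing as the question
        curl_multi_remove_handle($mh, $ch);
    }
    curl_multi_close($mh);
}

Running each batch to completion before starting the next keeps the number of simultaneous connections, and the load they put on the server, bounded.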

1 Answer


This is not a curl issue; it is a server utilisation issue, so try upgrading the server.

Useful links: link1, link2

Note: Be careful when using an infinite loop [for(;;)], and regularly monitor CPU utilisation.
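
One rough way to follow that advice from inside the script itself is sketched below; it assumes a Unix-like host where sys_getloadavg() is available:

// Log the 1-minute load average before and after the multi-curl run so you
// can see whether the machine actually maxes out during the burst.
$load = sys_getloadavg();
error_log(sprintf('Load before requests: %.2f', $load[0]));

// ... run the curl_multi loop here ...

$load = sys_getloadavg();
error_log(sprintf('Load after requests: %.2f', $load[0]));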

Shanteshwar Inde
  • That infinite loop hardly uses any CPU at all; the curl_multi_select() call sleeps while waiting for network IO. CPU utilization is not the issue here. – hanshenrik Dec 08 '18 at 11:13
  • If you read the code carefully you will see usleep in the for loop, not sleep; sleep is outside the loop. I believe you know the difference between sleep and usleep. – Shanteshwar Inde Dec 09 '18 at 15:26
  • `curl_multi_select($mh);` does the sleeping in that loop. It sleeps until data from the network has been downloaded, which makes the CPU utilization very low (practically 0%). Try this code if you don't believe me and check its CPU utilization: … – hanshenrik Dec 09 '18 at 15:59
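
The snippet referred to in the last comment was cut off; purely as an illustration of that point (this is not the commenter's original code, and the URLs are placeholders), a self-contained select-driven download loop looks like this, and its CPU usage stays near zero while it runs:

$mh = curl_multi_init();
$handles = array();
foreach (array('https://example.com/', 'https://example.org/') as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // sleeps here until network data arrives
    }
} while ($running && $status === CURLM_OK);

foreach ($handles as $ch) {
    var_dump(strlen(curl_multi_getcontent($ch)));
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);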