
I extract a set of almost 1,500 records from a database table, and for each record I have to call an endpoint through cURL like this:

for ($i = 0; $i < 1500; $i++) {
    $headers = [
        'Host: www.hostname.it',
        'Accept: application/json, text/javascript, */*; q=0.01',
        'X-Requested-With: XMLHttpRequest',
        'Accept-Language: it-it',
        'Content-Type: application/x-www-form-urlencoded; charset=UTF-8',
        'Origin: https://www.desiderimagazine.it',
        'User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_4) AppleWebKit/603.1.30 (KHTML, like Gecko) Version/10.1 Safari/603.1.30',
        'Connection: close',
        'Referer: https://www.hostname.it/page/registrazione?utm_source=gate2000&utm_medium=display&utm_campaign=ghh_gen18_regist&utm_content=leadcampaign2',
        'Content-Length: ' . mb_strlen($post_fields, '8bit')
    ];

    $ch = curl_init($endpoint);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_ENCODING, "");
    // was $cu (undefined); note this is overridden anyway by the User-Agent header above
    curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_5) AppleWebKit/601.6.17 (KHTML, like Gecko) Version/9.1.1 Safari/601.6.17");
    curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 24000);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 24000); // CURLOPT_CONNECTIONTIMEOUT does not exist
    curl_setopt($ch, CURLOPT_POSTFIELDS, $post_fields); // POST body

    $result = curl_exec($ch);
    curl_close($ch); // free the handle before the next iteration

    // .. here I save the result to the database
}

I run the script by entering its URL in the browser, and it works fine (I correctly see the results in the database and the endpoint responses) for the first 20-30 records, more or less. After that I systematically get a

504 Gateway Timeout error

I suspect it could be the way I execute it, but there must be some configuration I can change in my code to fix it.

Thanks

  • It's possible that the server is interpreting the multiple calls as a denial-of-service (DoS) attack, or simply getting bogged down with too many simultaneous requests. It would be preferable for the server to accept ALL the data in a single large POST rather than 1,500 separate requests. If that's not possible, you might try pausing for a short time (milliseconds) between individual requests. – rmirabelle Jan 26 '18 at 16:37
  • Why not use curl_multi_exec? (see the sketch below) – coder Jan 26 '18 at 16:38
  • @coder yes, that was all I needed, apparently... thanks! – frabis Jan 26 '18 at 16:41
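
A minimal sketch of the curl_multi approach from the comment above, assuming the same $endpoint as the question; $records and buildPostFields() are hypothetical stand-ins for the database rows and whatever code builds $post_fields, and the batch size of 10 is an arbitrary choice:

foreach (array_chunk($records, 10) as $batch) { // small batches, so the server isn't flooded
    $mh = curl_multi_init();
    $handles = [];

    foreach ($batch as $record) {
        $post_fields = buildPostFields($record); // hypothetical helper
        $ch = curl_init($endpoint);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_POST, 1);
        curl_setopt($ch, CURLOPT_POSTFIELDS, $post_fields);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30); // fail fast instead of hanging
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    // Drive all handles in the batch until every transfer has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    foreach ($handles as $ch) {
        $result = curl_multi_getcontent($ch);
        // ... save $result to the database, as in the original loop
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
}

Running the requests in parallel batches should cut the total wall-clock time by roughly the batch size, which is what matters if the 504 comes from a gateway timing out the long-running script rather than from the remote endpoint itself.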

0 Answers