
I need to run around 20K cURL requests, fetch the data from each response, and save it to an Excel file using Spout. Importantly, the requests must run in sequence and the data must be saved in the same order.

I have set the PHP timeout to 0, but it still fails: the script ends on an empty page with no warning, no success message, and no products saved to the xlsx file. If I run it for around 1000 products it works fine. How can I fix this?
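
For reference, this is roughly the setup at the top of the script. The memory_limit value is just a placeholder, and the error-display lines are only there while debugging, since a blank page with nothing logged usually hides a fatal error:

set_time_limit(0);                // same as max_execution_time = 0: no time limit
ini_set('memory_limit', '1024M'); // placeholder value; 20K parsed responses add up
ini_set('display_errors', '1');   // debugging only: surface the error behind the blank page
error_reporting(E_ALL);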

After getting the XML from the cURL response, I loop over it with a foreach to extract the required data:

use Box\Spout\Writer\Common\Creator\WriterEntityFactory;

$writer = WriterEntityFactory::createXLSXWriter();
$writer->openToFile('products.xlsx');

foreach ($xml->Body->product as $products) {
   // Cast to string: Spout expects scalar cell values, not SimpleXMLElement objects
   $values = [(string) $products];
   $rowFromValues = WriterEntityFactory::createRowFromArray($values);
   $writer->addRow($rowFromValues);
}

$writer->close();
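
To show how the pieces fit together, the overall flow is roughly this; $productIds, $url and buildSoapBody() are simplified stand-ins for my actual input, and getCurl() is the helper defined below:

$productIds = [ /* ~20K ids loaded elsewhere */ ];

$writer = WriterEntityFactory::createXLSXWriter();
$writer->openToFile('products.xlsx');

foreach ($productIds as $id) {
    $response = getCurl($url, buildSoapBody($id)); // buildSoapBody() is a placeholder
    if ($response === false) {
        continue; // skip a failed request rather than aborting the whole run
    }
    // simplexml_load_string() returns false on parse errors; check omitted for brevity
    $xml = simplexml_load_string($response);
    foreach ($xml->Body->product as $products) {
        $writer->addRow(WriterEntityFactory::createRowFromArray([(string) $products]));
    }
    unset($xml, $response); // release memory before the next iteration
}

$writer->close();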

function getCurl($url, $soapBody) {
    $curl = curl_init();
    curl_setopt_array($curl, array(
        CURLOPT_URL => $url,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_ENCODING => "",
        CURLOPT_MAXREDIRS => 10,
        CURLOPT_TIMEOUT => 30,
        CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
        CURLOPT_CUSTOMREQUEST => "POST",
        CURLOPT_POSTFIELDS => $soapBody,
        CURLOPT_HTTPHEADER => array(
            "Content-Type: text/xml",
            "cache-control: no-cache"
        ),
    ));
    $response = curl_exec($curl);
    $err = curl_error($curl);
    curl_close($curl); // always release the handle, even when the request failed
    if ($err) {
        echo "cURL Error #: " . $err;
        return false;
    }
    return $response;
}
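
As suggested in the comments below, it may help to check the HTTP status code of the empty responses. A sketch of a variant of getCurl() using curl_getinfo() (getCurlWithStatus is just a name I made up):

function getCurlWithStatus($url, $soapBody) {
    $curl = curl_init();
    curl_setopt_array($curl, array(
        CURLOPT_URL => $url,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_CUSTOMREQUEST => "POST",
        CURLOPT_POSTFIELDS => $soapBody,
        CURLOPT_TIMEOUT => 30,
        CURLOPT_HTTPHEADER => array("Content-Type: text/xml"),
    ));
    $response = curl_exec($curl);
    $status = curl_getinfo($curl, CURLINFO_HTTP_CODE); // 0 if the transfer itself failed
    curl_close($curl);
    if ($status !== 200) {
        error_log("HTTP $status from $url"); // a 503 here would mean the server is throttling
        return false;
    }
    return $response;
}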
– kannu
  • Can you run your program 20 times with 1000 requests each, then join the results together? Perhaps look at [array_chunk](https://www.php.net/manual/en/function.array-chunk.php) to split up your requests. – mark_b Mar 19 '20 at 09:41
  • Aren't you overloading your server with 20k requests sent all (almost) at once? Check the response status code when you get an empty page - isn't it a 503 or something? – pstryk Mar 19 '20 at 09:44
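
Following mark_b's array_chunk suggestion above, a minimal batching sketch (the chunk size and log calls are arbitrary):

$batches = array_chunk($productIds, 1000); // 20 batches of 1000 requests each
foreach ($batches as $i => $batch) {
    foreach ($batch as $id) {
        // ...fetch with getCurl() and write rows as above...
    }
    error_log("finished batch $i"); // shows how far the run got before dying
}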

0 Answers