
I have a very large JS object that I fetch with cURL and need to break apart for further writing to MySQL. But when I try to walk through it with a foreach loop, I run out of memory. Are there any options to process the array with less load on the server, without increasing the memory limit in PHP? Here is my code, but it does not work:

Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 20480 bytes)

<?php
#ini_set('memory_limit', '-1');
if ($curl = curl_init()) {
    curl_setopt($curl, CURLOPT_AUTOREFERER, TRUE);
    curl_setopt($curl, CURLOPT_HEADER, 0);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, TRUE);
    curl_setopt($curl, CURLOPT_URL, 'http://img.combats.com/auction/dump.js');
    $out_shortinfo = curl_exec($curl);
    $json = json_decode(str_replace('var auction_dump=', '', $out_shortinfo), true);
    foreach ($json as $key => $value) {
        if ($key > 0) {
            echo $value['name'].' ';
            echo $value['txt'].' ';
            echo $value['_auc']['id'].'<br>';
        }
    }
    curl_close($curl);
}
?>
  • how large is "very large" exactly? Have you considered writing it to a file first and then reading it a line at a time? – ADyson Nov 12 '18 at 19:29
  • just open the link http://img.combats.com/auction/dump.js – Влад Мазур Nov 12 '18 at 19:31
  • I'm not going to open some random link, sorry. I was just interested in the file size. I don't need to see the data for you to tell me that. – ADyson Nov 12 '18 at 19:32
  • Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 20480 bytes) – Влад Мазур Nov 12 '18 at 19:33
  • that's the error, good, thank you - please add it to your question, not comments. – ADyson Nov 12 '18 at 19:34
  • Now again, how big is the file? – ADyson Nov 12 '18 at 19:34
  • I don't know; the file isn't on my server – Влад Мазур Nov 12 '18 at 19:37
  • so? the cURL response code can tell you the size of the response. Or you can just download it yourself and see. – ADyson Nov 12 '18 at 19:39
  • If you cannot increase the memory size, if it were me, I would do it in batches. – Kenny Grage Nov 12 '18 at 19:40
  • The problem with large JSON files is that json_decode can't process only part of it. It reads the whole thing into memory or fails. So you need something else besides json_decode to deal with it. Some suggestions for handling it with event-based/pull parsers here: https://stackoverflow.com/questions/4049428/processing-large-json-files-in-php – Don't Panic Nov 12 '18 at 19:43
  • Its about 7.26 MB – Влад Мазур Nov 12 '18 at 19:45
  • does the script do anything else? you have a memory limit of approx 32MB, so a 7MB file shouldn't really cause much of a problem. Exactly which line of code is throwing the error? You could try closing the cURL request as soon as you've allocated the response into a variable and see if that helps. – ADyson Nov 12 '18 at 19:48
  • The output buffer will also use memory. Do you still run out of memory if you don't echo anything? (Assuming you're just doing that for testing purposes at this point since you mentioned writing to db.) – Don't Panic Nov 12 '18 at 19:58
  • @ADyson closing the cURL request after $out_shortinfo = curl_exec($curl); does not fix the problem. It happens when I try the foreach loop – Влад Мазур Nov 12 '18 at 20:04
  • @Don't Panic That's right; for starters, I'm just trying to display everything on the screen. Is it worth trying to write the data to the database right away? ADD: without echo I have the same problem – Влад Мазур Nov 12 '18 at 20:06
  • Well, just for a quick check you can try just iterating through the data without echoing anything to see if you still run out of memory. But it's still possible that just decoding the JSON file consumes all the memory. 7.26 MB is the file size, but expanding that into a PHP object/array structure will consume much more memory. – Don't Panic Nov 12 '18 at 20:10
  • @ADyson Excuse me, a correction: the memory problem occurs at $json = json_decode(str_replace('var auction_dump=', '', $out_shortinfo), true); – Влад Мазур Nov 12 '18 at 20:13
  • @ADyson do you really think this guy has some $10,000+ 0-day chrome/firefox/whatever-browser-you're-using exploit to burn on infecting random people on stackoverflow.com? – hanshenrik Nov 13 '18 at 07:42
  • @hanshenrik No, probably not (but maybe he uploaded it to some 3rd party site which has some sort of malware) but actually my point was that providing the file itself is unnecessary in order to tell us how big it is - that was the question I had asked. – ADyson Nov 13 '18 at 09:07
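The "write it to a file first" suggestion from the comments above can be sketched roughly as follows. This is a sketch only, not a tested solution: the temp-file path is an assumption, and actually parsing the file piece by piece still requires an event-based/pull JSON parser like the ones linked in Don't Panic's comment, since json_decode() always needs the whole document in memory.

```php
<?php
// Sketch: stream the cURL response straight to a file with CURLOPT_FILE
// instead of buffering it in a PHP string via CURLOPT_RETURNTRANSFER.
$path = '/tmp/auction_dump.js';          // assumed path, adjust as needed
$fp = fopen($path, 'w');
$curl = curl_init('http://img.combats.com/auction/dump.js');
curl_setopt($curl, CURLOPT_FILE, $fp);           // write the body directly to $fp
curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
curl_exec($curl);
curl_close($curl);
fclose($fp);

// The size can now be checked without loading the file into memory:
echo filesize($path) . " bytes\n";

// Decoding it still needs a streaming parser to stay within the memory
// limit; json_decode(file_get_contents($path)) would hit the same wall.
?>
```

This only moves the download out of PHP memory; the decoding step is the part that actually needs a different parser.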

1 Answer


Try this one, bro - flush the output buffers inside the loop so the echoed output doesn't pile up in memory:

    foreach ($json as $key => $value) {
        if ($key > 0) {
            echo $value['name'].' ';
            echo $value['txt'].' ';
            echo $value['_auc']['id'].'<br>';
            @ob_flush();
            @flush();
        }
    }
j3thamz
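Since the asker narrowed the failure down to the json_decode() line, a complementary sketch (reusing the variable names from the question; a fragment, not a complete script, and untested against that endpoint) is to release each intermediate copy of the data as soon as the next stage exists:

```php
<?php
// Sketch: free intermediate copies as early as possible.
// str_replace() produces a second ~7 MB string, and the decoded
// array is several times larger again, so dropping each stage
// promptly lowers the peak memory use.
$out_shortinfo = curl_exec($curl);
curl_close($curl);                  // the handle is no longer needed
$raw = str_replace('var auction_dump=', '', $out_shortinfo);
unset($out_shortinfo);              // drop the original response string
$json = json_decode($raw, true);
unset($raw);                        // drop the stripped string
?>
```

This may be enough to fit a 7 MB payload inside a 32 MB limit, but if the decoded array itself is too large, only a streaming parser (or a bigger memory_limit) will help.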