
Recently I started running into problems with memory management in PHP. This is new to me because I had never needed to run PHP scripts for a long time in background processes. So now I'm studying the subject, and I came up with the simple script that follows:

<?php

function w()
{
    // Open a file handle, close it, and drop the variable
    $f = fopen('test1.txt', 'w+');
    fclose($f);
    unset($f);
}

$i = 0;
$max = 5;

echo 'Memory usage:<br><br>';
echo $i . ' - ' . memory_get_usage() . '<br>';
touch('test1.txt');

while(++$i < $max)
{
    w();
    echo $i . ' - ' . memory_get_usage() . '<br>';
}

It only opens and closes the same file multiple times, and after each close it displays the memory used.

As you can see, even after closing the file and unset()'ing the handle, the memory doesn't drop. It seems that internally the pointer is still holding memory. I know it's only a few bytes, but even a few bytes can break the script if it runs long enough in the background (which is my real use case).

I tried setting $f = null, but that consumes even more memory (I'm not crazy, check for yourself)! And gc_collect_cycles() didn't work either.
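
For reference, here is a minimal sketch for comparing the approaches yourself; the delta() helper is just something I made up for illustration:

<?php
// Hypothetical helper, for illustration only: report the memory delta caused by a callback
function delta(callable $fn)
{
    $before = memory_get_usage();
    $fn();
    return memory_get_usage() - $before;
}

touch('test1.txt');

echo 'unset($f):  ' . delta(function () {
    $f = fopen('test1.txt', 'w+');
    fclose($f);
    unset($f);
}) . " bytes\n";

echo '$f = null:  ' . delta(function () {
    $f = fopen('test1.txt', 'w+');
    fclose($f);
    $f = null;
}) . " bytes\n";

gc_collect_cycles();
echo 'after gc_collect_cycles(): ' . memory_get_usage() . " bytes\n";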

So my question is: is there a way to completely free the memory used by a file handle?

Diego Andrade
  • I just retried your script above and the memory didn't keep changing. It changed on the first iteration, then again after 261 iterations, and then stayed static up to 5000 cycles. – Robbie Mar 11 '15 at 01:26
  • Yes, I also noticed this, but in a real case I have many files being opened on each run, so the memory usage grows until the script breaks. What is strange to me is that even after closing the handle the memory doesn't drop. – Diego Andrade Mar 11 '15 at 01:39
  • You likely weren't seeing memory drop because PHP is garbage collected, and the garbage collector only runs when you have idle CPU time (e.g. if you call `sleep()`) or if you run out of memory - whichever comes first. As you aren't out of memory and you aren't giving the CPU a break, the garbage collector isn't running so you aren't freeing the memory. – stevendesu Sep 04 '20 at 18:59

2 Answers


You can fork a child process and let the child open the file.

That way, when the child process ends, all of its resources will be released along with it.

Keep in mind, however, that the child will have access to any resources created by the parent, and it will also close them when it finishes.
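
A rough sketch of that approach (it assumes the pcntl extension, which is normally only available from the command line):

<?php
// Sketch only: requires the pcntl extension (CLI SAPI); not available on most shared hosts
$pid = pcntl_fork();

if ($pid === -1) {
    die('Could not fork');
} elseif ($pid === 0) {
    // Child process: do the file work here.
    // When the child exits, the OS releases everything it allocated.
    $f = fopen('test1.txt', 'w+');
    fwrite($f, 'work done in the child');
    fclose($f);
    exit(0);
} else {
    // Parent process: wait for the child so it doesn't become a zombie
    pcntl_waitpid($pid, $status);
}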

Update:

Actually, after trying your script via the command line (I just modified it to iterate 100 times instead of 5), here is what I get:

Memory usage:
0 - 119384
1 - 119564
2 - 119564
3 - 119564
4 - 119564
.....
97 - 119564
98 - 119564
99 - 119564
alfallouji
  • Well, this is not currently an option, because in my real application I do several file operations, and I cannot open child processes just for this job; I would end up creating too many processes. Aside from that, I want something that I can use even on shared web hosting, and we know that many hosts are very restrictive about system resources. – Diego Andrade Mar 11 '15 at 01:45
  • But to tell the truth @alfallouji, I'm already doing this. To make the main process run longer, I spawn a child process to do the main work. But I still have a memory leak to fix. – Diego Andrade Mar 11 '15 at 01:46

There is no problem with memory and file handles: it must be elsewhere in your script. I ran the following version of your code. It builds an array of 5000 .jpg files, clears the memory used by the iterators so that only the array remains, and then runs down that list of files using the same open/close pattern as your script above.

<?php

$i = 0;
$max = 5000;

$aFiles = array();

// Build a list of up to $max .jpg files found anywhere under d:/
$it = new RecursiveDirectoryIterator("d:/");
foreach (new RecursiveIteratorIterator($it) as $file) {
    if (strcmp(substr($file, -4), '.jpg') == 0) {
        if ($i++ > $max) {
            break;
        }
        $aFiles[] = $file;
    }
}

// Free the iterators so that only the file list remains in memory
unset($it);
unset($max);

gc_collect_cycles();

$i = 0;
echo 'Memory usage:<br><br>';
echo $i . ' - ' . number_format(memory_get_usage()) . '<br>';

// Open and close every file, reporting memory after each close
foreach ($aFiles as $file) {
    $f = fopen($file, 'r');
    fclose($f);
    unset($f);
    echo ++$i . ' - ' . number_format(memory_get_usage()) . '<br>';
    flush(); // Trying to ensure that the output buffer doesn't add to memory
}

?>

Here are the (truncated) results:

Memory usage:

0 - 3,553,000
1 - 3,552,608
2 - 3,552,728
3 - 3,552,728
4 - 3,552,728
5 - 3,552,728
.....
4997 - 3,552,728
4998 - 3,552,728
4999 - 3,552,728
5000 - 3,552,728
5001 - 3,552,728

Conclusion: the problem lies elsewhere. For example, without the flush(), memory increases after about 1K of output because the output buffer grows a bit each time. Can you make sure that nothing else is accounting for the memory being held, e.g. are you calling the open function recursively? That would add to the stack on each call.
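
To see the output-buffer effect for yourself, here is a rough sketch (the exact numbers will vary with your setup):

<?php
// Rough demonstration: buffered output is held in PHP's memory until it is sent
ob_start();
$base = memory_get_usage();

for ($i = 0; $i < 1000; $i++) {
    echo str_repeat('x', 100); // roughly 100 KB of output accumulates in the buffer
}

$held = memory_get_usage() - $base;
ob_end_clean(); // discard the buffer; a real script would flush it instead

echo 'Held by the output buffer: ' . number_format($held) . " bytes\n";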

Robbie