
I'm trying to run PHPUnit tests on my new machine and I get this error:

PHP Fatal error: Uncaught exception 'UnexpectedValueException' with message 'RecursiveDirectoryIterator::__construct(/usr/lib/php/pear/File/Iterator): failed to open dir: Too many open files' in /usr/lib/php/pear/File/Iterator/Factory.php:114

The same code runs fine on the old machine...

New machine environment: PHP 5.3.21 (cli). Old machine: PHP 5.3.14.

PHPUnit outputs this every time:

................EEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEE 65 / 66 ( 98%)
E

Time: 34 seconds, Memory: 438.50Mb

There were 50 errors:

1) XXXXXXXXXXX
PHP Fatal error:  Uncaught exception 'UnexpectedValueException' with message 'RecursiveDirectoryIterator::__construct(/usr/lib/php/pear/File/Iterator): failed to open dir: Too many open files' in /usr/lib/php/pear/File/Iterator/Factory.php:114
Mauro

10 Answers


This can be a limitation of the server where the code is running. Every operating system only allows a certain number of open files/handles/sockets, and this limit is usually reduced further when the server is virtualized. On a Linux server you can check the current limit with ulimit -n; if you have root access you can increase it with the same command. There should be an equivalent for Windows servers as well. Otherwise there is not much you can do about it, except ask your hosting provider or administrator to increase it.

More configurable limits:

Change /etc/security/limits.conf (each line is <domain> <type> <item> <value>):

* soft nofile 1024
* hard nofile 65535

Increase the per-shell limit with ulimit -n 65535, or raise the system-wide limit with echo 65535 > /proc/sys/fs/file-max (as root) or in /etc/sysctl.conf:

fs.file-max=65535
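
To apply the /etc/sysctl.conf change without rebooting, reload it (standard sysctl usage, not spelled out in the original answer):

sudo sysctl -p

or set the value for the running kernel directly:

sudo sysctl -w fs.file-max=65535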
nikos.svnk
Gerald Schneider
    $ ulimit -n > 256. I'm on OSX – Mauro Feb 07 '13 at 10:41
    @Mauro I get 4864 on Mavericks (OSX 10.9) – JamesHalsall Jul 11 '14 at 15:25
  • @JamesHalsall I'm using 1024 now – Mauro Jul 18 '14 at 11:45
    In my case, this occurred no matter how high I set `ulimit -n` (I even tried 1,000,000). The problem turned-out to be bad logic in the PHP classes I was testing. For what it's worth, I was using Laravel and had a Factory State that returned a non-scalar value without using the closure syntax; I fixed the issue by using the closure syntax, and then a `ulimit -n` of `2048` was plenty. – Ben Johnson May 31 '19 at 12:48

How to raise the open file limit (Linux or macOS):

ulimit -n 10000

This solves the problem with phpunit and/or phpdbg, and with errors like Warning: Uncaught ErrorException: require([..file]): failed to open stream: Too many open files in [...]
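
Note that this only affects the current shell session and the processes started from it, so run it in the same terminal you use to launch phpunit; you can check the current value with:

ulimit -n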

pablorsk

In PHP, before execution, try this:

exec('ulimit -S -n 2048');
    This doesn't work for me. Tried to add this line in both `setUp()` and `setUpBeforeClass()` methods. Executing it manually on the terminal before launching unit tests does work however. – Koen May 23 '17 at 11:57
  • @Koen Same for me. I guess the PHP process doesn't have permission to change this on the OS level. – Pelmered Nov 01 '22 at 11:05
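
As the comments note, exec() runs ulimit in a separate shell, so the raised limit never applies to the PHP process itself. If the POSIX extension is loaded, raising the soft limit from inside PHP may work instead (a sketch, assuming PHP >= 7.0 with ext-posix; the values are examples and the call returns false if they exceed what the process is allowed to set):

<?php
if (function_exists('posix_setrlimit')) {
    // Raise the open-file (RLIMIT_NOFILE) soft limit for this PHP process.
    // Without extra privileges the soft limit can only go up to the
    // current hard limit, and the hard limit cannot be increased.
    posix_setrlimit(POSIX_RLIMIT_NOFILE, 2048, 4096);
}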

After 'waking' my computer from sleep mode I ran into this problem.

Restarting php-fpm like so fixed it. A classic "turn it off and back on again" solution.

sudo /etc/init.d/php-fpm restart

I think this may be related to Xdebug, which I recently added to PHP.

Lando

Don't store DirectoryIterator objects for later; you will get an error saying "too many open files" when you store more than the operating system limit (usually 256 or 1024).

For example, this will yield an error if the directory has too many files:

<?php
$files = array();
foreach (new DirectoryIterator('myDir') as $file) {
    // Keeping these objects around for later use is what triggers
    // the "too many open files" error.
    $files[] = $file;
}
?>

Presumably, this approach is memory intensive as well.
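
If the list really is needed later, a common alternative (a sketch, not from the original answer) is to store plain pathnames rather than the iterator objects, so no directory handles are retained:

<?php
$files = array();
foreach (new DirectoryIterator('myDir') as $file) {
    if ($file->isDot()) {
        continue;
    }
    // Keep a string path (or $file->getFileInfo() for a detached SplFileInfo)
    // instead of the iterator itself.
    $files[] = $file->getPathname();
}
?>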

source: http://php.net/manual/pt_BR/directoryiterator.construct.php#87425

user1666651

On a Debian server you can also set the limit in the php-fpm pool configuration:

/etc/php/7.x/fpm/pool.d/www.conf

rlimit_files = 10000

then restart the service:

/etc/init.d/php7.x-fpm restart
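
To check that the raised limit actually reached the pool's worker processes (a quick verification, not part of the original answer), list the workers and inspect one of them under /proc:

pgrep php-fpm
grep "Max open files" /proc/<worker-pid>/limits

(replace <worker-pid> with one of the PIDs printed by pgrep).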
Hovercraft Full Of Eels
Kamil Dąbrowski

I've noticed this occur in PHP when you forget to wrap something in a closure. Carefully look at your recent diffs and you might be able to get to the bottom of it (in my case, I referenced $faker in a Laravel PHPUnit factory without having a closure).
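
A rough sketch of the fixed shape using the legacy Laravel factory API (App\User and the 'with-nickname' state are made up for illustration; newer class-based factories have an equivalent closure form):

use Faker\Generator as Faker;

// The mistake described above is doing the $faker work directly in an
// array-style state, where it runs once when the factory file is loaded.
// Wrapping the state in a closure defers it until each model is built:
$factory->state(App\User::class, 'with-nickname', function (Faker $faker) {
    return [
        'nickname' => $faker->userName,
    ];
});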

Jack Kinsella

I experienced this error with an HTTP pool where I added too many URLs (around 2000) to the pool.

I had to chunk the URLs into smaller batches, and the error stopped.

I think it's how the Guzzle pool works: it doesn't close the cURL connections until the entire pool is done.

Example:

$responses = Http::pool(function (Pool $pool) use ($chunk) {
    return collect($chunk)->map(fn($url) => $pool->get($url));
});

Becomes:

use Illuminate\Http\Client\Pool;
use Illuminate\Support\Facades\Http;

collect($urls)
    ->chunk(25)
    ->each(function ($chunk) {
        $responses = Http::pool(function (Pool $pool) use ($chunk) {
            return collect($chunk)->map(fn($url) => $pool->get($url));
        });
    });

The Http facade is Laravel's wrapper around the Guzzle HTTP client: https://laravel.com/docs/9.x/http-client

Unicco

Maybe you have some error with the /etc/init.d/phpx.x-fpm service. Restart it:

sudo /etc/init.d/php7.2-fpm restart
Tzar

I got this error every time about a Redis library PHP was trying to load, but it was caused by something I didn't really think of at first. I kept getting the error once my program had been running for a while, doing a repetitive process. It turned out I had opened a cURL session ($ch = curl_init(...)) that was supposed to be closed in the destructor of a class, but that destructor was never called. I fixed that problem and the too-many-open-files error disappeared.
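
A minimal sketch of that pattern and its fix (a hypothetical class, not the code from the answer): close the handle explicitly, and remember that a destructor only runs when the last reference to the object is released.

<?php
class Fetcher
{
    private $ch;

    public function __construct($url)
    {
        $this->ch = curl_init($url);
        curl_setopt($this->ch, CURLOPT_RETURNTRANSFER, true);
    }

    public function fetch()
    {
        return curl_exec($this->ch);
    }

    public function close()
    {
        if ($this->ch) {
            curl_close($this->ch); // releases the underlying file descriptor
            $this->ch = null;
        }
    }

    public function __destruct()
    {
        // Only runs when the object is actually released; a lingering
        // reference (e.g. kept in a static array) keeps the handle open.
        $this->close();
    }
}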

patrick