A hypothetical question for you all to chew on...
I recently answered another question on SO where a PHP script was segfaulting, and it reminded me of something I have always wondered, so let's see if anyone can shed any light on it.
Consider the following:
<?php
function segfault ($i = 1) {
    echo "$i\n";
    segfault($i + 1);
}
segfault();
?>
Obviously, this (useless) function recurses infinitely, and eventually it will run out of memory because each call to the function starts executing before the previous one has finished. Sort of like a fork bomb without the forking.
But... eventually, on POSIX platforms, the script dies with SIGSEGV (it also dies on Windows, but more gracefully - as far as my extremely limited low-level debugging skills can tell). The number of iterations varies depending on the system configuration (memory allocated to PHP, 32-bit/64-bit, and so on) and the OS, but my real question is: why does it die with a segfault?
- Is this simply how PHP handles "out-of-memory" errors? Surely there must be a more graceful way of handling this?
- Is this a bug in the Zend engine?
- Is there any way this can be controlled or handled more gracefully from within a PHP script? (For example, something along the lines of the depth-guard sketch after this list.)
- Is there any setting that generally controls the maximum number of recursive calls that can be made in a function?
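To make the "handled more gracefully" part more concrete, here's a rough sketch of the kind of guard I imagine writing by hand: track the depth explicitly and bail out with a catchable exception before the stack blows up. MAX_DEPTH is just an arbitrary value I picked for the example, and the exception is only a placeholder for whatever real handling might look like:

<?php
// Hypothetical depth guard: stop with a catchable exception instead of
// letting the engine recurse until it hits SIGSEGV.
define('MAX_DEPTH', 10000); // arbitrary limit picked for illustration

function guarded ($i = 1) {
    if ($i > MAX_DEPTH) {
        throw new RuntimeException("Recursion exceeded " . MAX_DEPTH . " levels");
    }
    echo "$i\n";
    guarded($i + 1);
}

try {
    guarded();
} catch (RuntimeException $e) {
    // Runaway recursion gets handled here rather than crashing the process.
    echo "Stopped: " . $e->getMessage() . "\n";
}
?>

Obviously that only helps when I control the recursion myself, so part of what I'm asking is whether the engine (or some ini setting) offers anything equivalent.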