I have a Python script which converts a PDF file. It is called via PHP in my Laravel app:
$command = escapeshellcmd("python /home/forge/default/pdf.py " . $id);
$output = shell_exec($command);
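One thing I can try is redirecting stderr into the captured output so shell_exec at least returns any error message from the Python side (just a sketch of what I mean, using the same path and $id as above):

// append the redirect after escaping, so the "2>&1" isn't escaped away
$command = escapeshellcmd("python /home/forge/default/pdf.py " . $id) . " 2>&1";
$output = shell_exec($command);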
This works fine with any PDF up to 250MB, but fails with a larger PDF, for example 500MB.
If I call the Python script directly from the command line, it works fine and completes in around 5 minutes. It is only when called by shell_exec that it fails.
This is happening in a Laravel queued job, so as far as I know it is not going through HTTP/PHP-FPM but the command line, which should have no timeout?
The Laravel queue worker is running with its timeout set to 0 (no timeout).
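For context, the worker is started along these lines (a sketch; the exact options in my supervisor config may differ slightly):

php artisan queue:work --timeout=0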
Is there anything else in the PHP CLI settings which could be causing this to fail? Does anyone know where errors would be recorded? There's nothing in the failed_jobs table, nothing in laravel.log, and nothing is caught by my Bugsnag integration.
As it runs OK from the command line, I'm guessing it's not a Python issue but something to do with calling it from PHP.
The server has 60GB of RAM, and watching the process via htop, it never gets above 3% of RAM usage. Could there be some other hard-coded RAM limit?
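To rule out a PHP-side cap, I can check what the CLI SAPI is actually configured with (a sketch; run as the same forge user the worker runs as):

php -r "echo ini_get('memory_limit'), PHP_EOL;"
php -r "echo ini_get('max_execution_time'), PHP_EOL;"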
I'm using Laravel 5.4, Ubuntu server, Python 2.7.12.