
I have a Python script which converts a PDF file. It is called via PHP in my Laravel app:

$command = escapeshellcmd("python /home/forge/default/pdf.py " . $id);
$output = shell_exec($command);

This works fine with any PDF up to 250MB, but fails with larger PDFs, for example a 500MB file.

If I call the Python script directly from the command line, it works fine and completes in around 5 minutes. It is only when called by shell_exec that it fails.

This is happening in a Laravel queued job, so as far as I know it is not going through HTTP/PHP-FPM but through the command line, which should have no timeout?

The Laravel queue worker is running with timeout set to 0 (no timeout).

Is there anything else in the PHP CLI settings which could be causing this to fail? Does anyone know where errors would be recorded? There's nothing in the failed_jobs table, nothing in laravel.log, and nothing is caught by my Bugsnag integration.
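One reason nothing may show up anywhere: `shell_exec()` returns only the child's stdout, so a Python traceback written to stderr is silently dropped. A minimal sketch of the effect, using a stand-in command in place of the real pdf.py (appending `2>&1` to the PHP command string would apply the same redirect there):

```shell
# shell_exec() in PHP captures only stdout; a crashing child's traceback
# goes to stderr and is silently lost. Redirecting stderr into stdout
# ("2>&1") inside the command string makes the failure visible.

# Stand-in for the failing pdf.py: writes its error to stderr and fails.
fail() { echo "Traceback: simulated error" >&2; return 1; }

without=$(fail 2>/dev/null)   # stderr discarded: nothing captured
with=$(fail 2>&1)             # stderr merged: the error text is captured

echo "without redirect: [$without]"
echo "with redirect:    [$with]"
```

In the question's code that would mean `shell_exec($command . " 2>&1")`, so any Python error ends up in `$output`.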

As it runs OK from the command line, I'm guessing it's not a Python issue but something to do with calling it from PHP.

The server has 60GB of RAM, and watching the process via htop, it never gets above 3% RAM usage. Could there be some other hard-coded RAM limit?
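On the RAM question: when a process vanishes with no trace in application logs, two places worth checking are the kernel log (the OOM killer logs every process it kills) and the resource limits the worker's environment inherits. A hedged sketch of the checks (the grep patterns are typical for Ubuntu kernel messages, not taken from the question):

```shell
# Did the kernel's OOM killer terminate the process? It logs each kill.
# (dmesg may need elevated privileges; errors are suppressed here.)
dmesg 2>/dev/null | grep -iE 'out of memory|killed process' | tail -n 5

# What per-process limits does this shell (and anything it spawns,
# such as a queue worker's children) inherit?
limits=$(ulimit -a)
echo "$limits"
```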

I'm using Laravel 5.4, Ubuntu server, Python 2.7.12.

1 Answer


It's because there is an execution-time limit and a memory limit for the script; you can check them in php.ini. The usual defaults are max_execution_time = 30 seconds and memory_limit = 128M.

; Maximum execution time of each script, in seconds
; http://php.net/max-execution-time
; Note: This directive is hardcoded to 0 for the CLI SAPI
max_execution_time = 300

and memory limit

; Maximum amount of memory a script may consume (128MB)
; http://php.net/memory-limit
memory_limit = 1280M
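The CLI and FPM SAPIs read different php.ini files, so it's worth confirming which file the worker's PHP actually loads and what the effective values are. A sketch of the check, guarded in case `php` is not on the PATH:

```shell
# The CLI SAPI loads its own php.ini; verify the effective limits there
# rather than editing a file the worker never reads.
if command -v php >/dev/null 2>&1; then
    php --ini | head -n 3     # shows which ini file the CLI loaded
    effective=$(php -r 'echo ini_get("max_execution_time"), " / ", ini_get("memory_limit");')
else
    effective="php not installed"
fi
echo "max_execution_time / memory_limit: $effective"
```

Note the stock ini comment above: for the CLI SAPI, max_execution_time is hardcoded to 0 regardless of the ini value.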
  • Hmm. I already had the memory set at 25000. Execution time was 30 - but I have many queue jobs taking 3-5 minutes without issue? I have just tried setting execution time to 900 (15 minutes) and it failed after around 5 minutes... – samiles May 18 '17 at 08:40
  • I have tried setting both `php-cli` and `php-fpm` .ini files to `memory_limit = 50000M` and `max_execution_time = 3600` but the script still dies at around 5 minutes :\ There's no errors, it just disappears from top and the rest of the queue job fails to run. – samiles May 18 '17 at 09:03