
I have a PHP script that is called by a cron job on my server to run uploaded video conversions. It works fine for some videos, but when the video is a bit larger (21 MB, for example) I get a 500 Internal Server Error and no other output.

I thought the problem might be a timeout, so I added set_time_limit(9000) and also ini_set('max_execution_time', 9000) to prevent it, but that does not solve anything.

I execute ffmpeg using:

$cmdOut = shell_exec('ffmpeg -y -i [....] 2>&1'); // [....] is the rest of the command; it works fine with other videos, so I assume the command itself is OK
print_r($cmdOut); // dump ffmpeg's combined stdout/stderr

However, there is no output and the following lines are never executed; the script is aborted right after the shell_exec call.

Looking at the Apache error_log, I can see this line:

[Wed Jan 12 00:12:46 2011] [error] [client xx.xxx.xxx.xxx] Premature end of script headers: index.php

But there are no other clues. Can anyone help me?

For testing purposes I've created this PHP script:

<?php
set_time_limit(300); // allow up to 5 minutes
sleep(120);          // simulate a long-running task (2 minutes)
echo "SLEEP OUT";
?>

This script causes a "500 Internal Server Error" when I call it from my web browser, so I suppose that set_time_limit is not working. If I use sleep(30) instead, it works and returns the text SLEEP OUT. So the question is: how can I avoid the timeout error so that a PHP script taking 5 or 10 minutes can run to completion?

NOTE: The server is CentOS running Apache with PHP as a FastCGI module.
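For context: set_time_limit() and max_execution_time only govern PHP itself; under FastCGI the gateway enforces its own I/O timeout and kills the request regardless, which matches the "Premature end of script headers" error above. Assuming the module is mod_fcgid (an assumption; mod_fastcgi uses different directives), the relevant httpd.conf settings would look roughly like this. Notably, FcgidIOTimeout defaults to 40 seconds, which lines up with the roughly 45 seconds reported in the comments below:

# Hypothetical mod_fcgid settings in httpd.conf; the values are illustrative.
# Older mod_fcgid releases name the first directive IPCCommTimeout instead.
FcgidIOTimeout   600   # max seconds to wait for I/O from the PHP process (default 40)
FcgidBusyTimeout 600   # max seconds a request may run before being terminated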

FidoBoy
    Why do you call this script from a webserver? Can't you just use CLI? It would be more appropriate for video conversion. – netcoder Jan 11 '11 at 23:56
  • @Yehonatan - now now, we don't speak to other users like that. Be nice, be civil. – Kev Jan 12 '11 at 00:03
  • @Kev: Yes, but he's getting a 500 HTTP response code, which means it was executed from a web server. – netcoder Jan 12 '11 at 00:06
  • I call the script from the web server because it works that way. The script works nicely when I upload smaller video files. The script processes the video file queue. – FidoBoy Jan 12 '11 at 00:06
  • Maybe there's not enough memory, what is memory_limit set to? – Dr.Molle Jan 12 '11 at 00:08
  • memory_limit is 128M. Also, the 500 HTTP response code is because I'm calling the script from my browser to test it; this way I don't need to wait for the cron job to be executed. – FidoBoy Jan 12 '11 at 00:09
  • @netcode - apologies, that hit my blindspot. – Kev Jan 12 '11 at 00:10
  • @FidoBoy - so what happens if you just let the cron job run? – Kev Jan 12 '11 at 00:11
  • The strange issue is that the output file is being created and it works (I've downloaded it with FTP and it plays fine in my player), but after the shell_exec line the PHP script is stopped; the echo on the next lines is never processed, so I don't get any output in the browser... – FidoBoy Jan 12 '11 at 00:14
  • @Kev - If I let the cron job run, I only get the "Premature end of script headers: index.php" message in the Apache error log. – FidoBoy Jan 12 '11 at 00:15

2 Answers


Finally I've solved this on my own. I've developed a workaround to bypass the timeout limitation in PHP: execute the script using the php-cli command from a scheduled cron job. Run this way, the script has no time limit, and it works nicely.
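For completeness, a minimal sketch of the setup (the schedule, paths, and script name are hypothetical, not from the original post). The crontab entry runs the converter through the CLI binary, where max_execution_time defaults to 0, i.e. unlimited:

# Hypothetical crontab entry: process the video queue every 5 minutes.
*/5 * * * * /usr/bin/php /path/to/convert_queue.php >> /var/log/convert.log 2>&1

An optional guard at the top of the script keeps it from ever being run through the web server by mistake:

<?php
// Refuse to run under Apache/FastCGI; this script is CLI-only.
if (php_sapi_name() !== 'cli') {
    die('This script must be run from the command line.');
}
// ... process the queue and run shell_exec('ffmpeg ...') as before ...
?>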

Thanks to all, especially to Phoenix for his time and ideas about this issue.

FidoBoy

Do you really need the console output for anything? I ran into a similar issue once: even though I modified the primary php.ini itself to extend the execution time limit, the script would still drop randomly when running ffmpeg through exec. I wound up having to `> /dev/null &` the command to stop it from dropping execution; then it worked fine regardless of what was thrown at it.
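A minimal sketch of what that looks like (the ffmpeg arguments and file names are placeholders; the point is only the redirection and the trailing &):

<?php
// Route stdout and stderr to /dev/null and background the process with &,
// so exec() returns immediately instead of blocking until ffmpeg finishes.
exec('ffmpeg -y -i input.avi output.mp4 > /dev/null 2>&1 &');
echo 'conversion started';
?>

The trade-off, as the comments below work out, is that the script no longer knows when ffmpeg has finished.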

Phoenix
  • Do you mean that I must replace '2>&1' at the end of my command with '> /dev/null &'? I don't need the console output; it's just to know what's wrong with the command... I've also noticed that even when I get the 500 error in my web browser (it takes around 45 seconds to get that error), the process keeps running, because I can see that the file size of the output file being created grows each time... – FidoBoy Jan 12 '11 at 00:25
  • I've added '> /dev/null &' to my command and the result is the same, a 500 HTTP error response :( – FidoBoy Jan 12 '11 at 00:29
  • Yeah, `> /dev/null 2>&1 &` might be better. It basically tells the shell to route all error/standard output (2>&1) to /dev/null (the black hole) and run the command as a background task (&). Some people parse the console output to get various information about the file being processed; I'd use http://ffmpeg-php.sourceforge.net/index.php for that, but it can't do conversions, just information and screen caps, so you still have to run the conversion through exec. – Phoenix Jan 12 '11 at 00:31
  • Also, try not setting it to a variable; just do the exec() on its own, and try `> /dev/null 2>&1 &` at the end instead of just `2>&1`. – Phoenix Jan 12 '11 at 00:33
  • You can echo something else after the exec() in the loop to make sure it's starting up for each file. – Phoenix Jan 12 '11 at 00:38
  • I need to wait until the ffmpeg task is completed, because after that I need to execute other commands... – FidoBoy Jan 12 '11 at 00:42
  • Right now I've created a simple PHP script just for testing purposes; it's like this: `<?php $tmp = shell_exec('ffmpeg [....] > /dev/null 2>&1 &'); echo "\n\n"; print_r($tmp); echo "\n\n"; ?>` – FidoBoy Jan 12 '11 at 00:46
  • Oops, I forgot to say that this code doesn't work... the echo is not being executed... – FidoBoy Jan 12 '11 at 00:49
  • You should be able to execute other commands even if the script isn't waiting for ffmpeg to end, even if you're trying to execute commands on the file being created by ffmpeg (though I wouldn't recommend that). If you have to gather information about the movie, try gathering it from the original file. If you're worried about creating too many instances of ffmpeg, try using sleep(). When I was doing it, it was part of an upload script that converted files as they were uploaded, so I had time in between each file successfully uploading and ffmpeg triggering for it. – Phoenix Jan 12 '11 at 00:54
  • Try using exec() rather than shell_exec() – Phoenix Jan 12 '11 at 00:55
  • Yes, and the command is being executed; as I've said before, I get the output file and it works fine. The problem is that the PHP script is aborted. I'm now thinking the problem could be that I'm running PHP as a FastCGI module. Could that be it? Can I change the timeout for FastCGI from PHP, or do I need to change the timeout for all scripts (I hope not)? – FidoBoy Jan 12 '11 at 00:57
  • I doubt that's the problem. I even tried changing httpd.conf, thinking that the various settings dealing with timeouts there might help, but no luck; it would still give me the 500 error. – Phoenix Jan 12 '11 at 00:59
  • Try this: `<?php exec('ffmpeg [....] > /dev/null 2>&1 &'); echo 'Started'; ?>` – Phoenix Jan 12 '11 at 01:03
  • same result... 500 Internal Server Error – FidoBoy Jan 12 '11 at 01:11
  • Wait... it works now, but ffmpeg is being executed in the background and I need to wait until it has finished to execute the rest of my script... – FidoBoy Jan 12 '11 at 01:22
  • Hmm. Well, you could do `> /dev/null 2>&1 & echo $!`, and then call exec with a second argument, like `exec('/usr/bin/ffmpeg etc etc etc', $pid);`, which will return the pid of ffmpeg in $pid[0] thanks to the `echo $!`. Then you can do a loop with something like `exec('ps -p ' . $pid[0], $running);` in it and check $running for the output of ps to see whether the process is still running (a sketch of this approach follows these comments). – Phoenix Jan 12 '11 at 02:01
  • However, I've tried changing every ini, conf, etc. setting, going through .htaccess and everything else available in an effort to get it to work, and none of it stopped the script from just dying on files bigger than about 15 MB, sometimes even smaller. I spent days troubleshooting, searching Google, and even went to Yahoo to see if they had anything Google missed. Eventually I just had to make it so the script didn't care whether or not ffmpeg actually finished. Grabbing the pid and looping to check whether or not it's still running might work. – Phoenix Jan 12 '11 at 02:06
  • There's actually a rather nice class posted by someone on the exec manual page, http://www.php.net/manual/en/function.exec.php#88704, that does the `> /dev/null 2>&1 & echo $!` for you, grabs the pid, can check the running status, etc. – Phoenix Jan 12 '11 at 02:15
  • Downvote for not showing an example of how to use `> /dev/null &`; I'm sick of these incomplete answers! – Mar 22 '22 at 07:49
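Putting Phoenix's pid suggestion from the comments into runnable form, a minimal sketch (the ffmpeg command and the 5-second poll interval are illustrative):

<?php
// Start ffmpeg in the background; `echo $!` prints the background job's pid,
// which exec() captures as the only output line (ffmpeg's own output has
// already been routed to /dev/null).
exec('ffmpeg -y -i input.avi output.mp4 > /dev/null 2>&1 & echo $!', $out);
$pid = (int) $out[0];

// Poll ps until the process disappears. `ps -p` prints a header line plus
// one line per live pid, so more than one output line means ffmpeg is still
// running.
do {
    sleep(5);
    $running = array(); // exec() appends to an existing array, so reset it
    exec('ps -p ' . $pid, $running);
} while (count($running) > 1);

echo 'ffmpeg finished, running the follow-up commands...';
?>

Note that the same FastCGI limits can still kill this loop if it runs long enough, so it fits best with the cron/CLI approach from the accepted answer.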