
In a PHP file I am reading the input stream, which contains an image.

$incomingData = file_get_contents('php://input');
$fh = fopen($uploadPath, 'w');
fwrite($fh, $incomingData);
fclose($fh);

For small images this works just fine, but for bigger ones that take longer than 15 seconds or so I get a 502 Bad Gateway response.

The apache error log is saying:

child pid 1492 exit signal Segmentation fault (11)

I tried this, but it did not work:

 ini_set('default_socket_timeout', 120);

But I am not sure if it's a timeout at all.

edit// CODE:

$uploadFilename = time();
$uploadPath = '/path/melvin.jpg';

$fhSrc = fopen('php://input', 'r');

// Valid data?
if($fhSrc) {

    $fhDst = fopen($uploadPath, 'w');

    // fread() returns '' at EOF and false on error, so check for both
    while (($data = fread($fhSrc, 1024)) !== false && $data !== '') {
        fwrite($fhDst, $data);
    }

    fclose($fhSrc);
    fclose($fhDst);

}

echo 'ok';

RAW HEADERS:

POST /test.php HTTP/1.1
Host: hi.com
User-Agent: secret/1.0 (unknown, iPhone OS 5.0.1, iPhone, Scale/2.000000)
Accept: */*
Accept-Language: nl, en, fr, de, ja, it, es, pt, pt-PT, da, fi, nb, sv, ko, zh-Hans, zh-Hant, ru, pl, tr, uk, ar, hr, cs, el, he, ro, sk, th, id, ms, en-GB, ca, hu, vi, en-us;q=0.8
Accept-Encoding: gzip
Settings: {SOMEJSON}
Content-Type: application/x-www-form-urlencoded
Cookie: CAKEPHP=2b82f748fb3a64063b2e3be9bdec5c11
Connection: keep-alive
Transfer-Encoding: Chunked
Pragma: no-cache
Cache-Control: no-cache

and here in the body the big image.
  • Does [`set_time_limit()`](http://php.net/manual/en/function.set-time-limit.php) help? – Salman A Feb 07 '12 at 14:42
  • check the php error log. I'm thinking you're hitting a memory limit because you buffer the image into a string. Regardless, A more efficient way would be to just `copy('php://input', $destination);` – goat Feb 07 '12 at 14:44
  • no, neither does max_input_time() or max_execution_time() – Melvin Feb 07 '12 at 14:47
  • @chris copy() does work but gives exact the same problem. Any other solutions? In the mean while im trying to read my php error log. – Melvin Feb 07 '12 at 15:05
  • the main point of my comment was that you need to check the php error log. Find its location via [phpinfo](http://www.php.net/phpinfo)(); you may need to specify it via [ini_set](http://www.php.net/ini_set)(), setting `error_log` to the location, `log_errors` to On, and `error_reporting` to E_ALL – goat Feb 07 '12 at 15:12
  • also, I know IIS has a configurable cgi response time directive, and if it takes too long it issues the same message. Maybe apache has the same, and yours is set to 15 seconds. – goat Feb 07 '12 at 15:14

1 Answer


If you are getting a segfault, the issue here has nothing to do with timeouts and everything to do with memory usage. As has been previously observed here, the way that PHP often deals with OOM errors on *nix is with a segfault. If you have a file that is taking 15 seconds to read, you have a seriously large file anyway, so it's not really surprising!

There are a couple of approaches to sorting this out. The first I will suggest is the simplest and does not involve messing around with any config. You can change your code to this and it should solve the problem:

$fhSrc = fopen('php://input', 'r');
$fhDst = fopen($uploadPath, 'w');
stream_copy_to_stream($fhSrc, $fhDst);
fclose($fhSrc);
fclose($fhDst);

If for whatever reason stream_copy_to_stream() is not available or gives you the same error, the quick and dirty alternative is:

// fread() returns '' at EOF and false on error, so check for both
while (($data = fread($fhSrc, 1024)) !== false && $data !== '') {
  fwrite($fhDst, $data);
}

This approach avoids having to read the entire file's data into PHP's memory space, and transfers the data directly from the web server buffer to disk - since the read length is 1024 bytes, PHP should never need more than about 1 KB of working memory to perform the operation.

Alternatively, you could alter the memory_limit directive in php.ini. I don't recommend this as an approach because, amongst other reasons, it will make your code less portable.
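If you do go down the configuration route, there are a few directives that interact here, not just memory_limit. A sketch of the relevant php.ini settings - the values below are illustrative, not recommendations:

```ini
; php.ini - illustrative values, tune to your workload
memory_limit = 128M        ; maximum memory a single script may allocate
post_max_size = 16M        ; maximum size of the whole POST body
upload_max_filesize = 16M  ; cap for individual uploads via $_FILES
```

Note that post_max_size applies to the entire request body, so it can silently truncate or reject a large raw upload even when memory_limit is generous.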

DaveRandom
  • Both options work with smaller files, but I keep getting the same error with bigger files. The limit is around 600/700 KB. I will edit my first post with the code I have now. I hope you can help me out. – Melvin Feb 14 '12 at 22:37
  • It definitely is not a time limit; I guess it is some PHP ini limit or so, but I can't find which one. – Melvin Feb 14 '12 at 22:43
  • I guess there is no way you can change the format of the incoming message? If you can, I would be interested to see if you still get a problem with `multipart/form-data` file uploads and using the `$_FILES` array... Is there anything useful in syslog (as in the OS log files)? – DaveRandom Feb 15 '12 at 09:31
  • Also I notice that your message has a `Content-Type` of `application/x-www-form-urlencoded` and you say you are sending the raw image as the message body - this make no semantic sense, the `Content-Type:` of the request should be that of the image, e.g. `image/jpeg` – DaveRandom Feb 15 '12 at 09:33
  • Setting Content-Type to image/jpeg worked. I can now send data larger than 1 MB. But it's weird though. Why would I get the Segmentation fault in DirectAdmin for this? Thanks btw, you saved my life :P – Melvin Feb 15 '12 at 10:19
  • Honestly I have no idea why this causes a segfault problem - there is an argument that it could be considered a bug in the Zend engine - but it does make sense that PHP would not like it in general. With `application/x-www-form-urlencoded` PHP will try to parse the body to populate the `$_POST` array, and if the data is not parsable I can see that it may cause a problem. If you want to know exactly *what* the problem is you would have to dig through the PHP source code - at a guess I'd say it's running out of memory trying to fill an array key in `$_POST`. – DaveRandom Feb 15 '12 at 10:24
  • Personally, when dealing with POST file uploads, even in the context of an API over which I have control of both ends, I prefer to use `multipart/form-data` messages to keep it as standards compliant as possible. There is nothing that says you can't do what you are doing, and sending multipart messages would make the data larger, but doing it that way makes it much easier to port between platforms - if you rewrite server, client or both in another language at some point, it will be easier because most languages have built-in handlers for this format of message (i.e. `$_FILES` in PHP). – DaveRandom Feb 15 '12 at 10:26
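For reference, per the comment thread the fix was simply to declare the real media type of the body. The working request would differ from the raw headers shown in the question only in the Content-Type line - a sketch, with the other headers and body elided:

```
POST /test.php HTTP/1.1
Host: hi.com
Content-Type: image/jpeg
Transfer-Encoding: chunked

<raw JPEG bytes>
```

With a non-form Content-Type, PHP leaves the body alone rather than attempting to parse it into `$_POST`, so the raw bytes can be read from `php://input` as intended.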