
In my PHP application, I receive mails using Postmark's inbound hook. This service receives the mail and sends it JSON-encoded to a URL on my server, which works fine.

The issue I have is when the mail has attachments of more than 10 MB.

This results in:

PHP Fatal error: Allowed memory size of 104857600 bytes exhausted (tried to allocate 1821693 bytes)

What I'm doing in this line is:

$in = json_decode(file_get_contents("php://input"));

I have two questions:

  1. Is there a way to do this in a more memory-efficient way?
  2. Why is it failing for 10 MB mails when the memory limit is actually 100 MB? Do Base64 and JSON encoding produce an overhead that is 10 times the original size?

Edit after debugging with memory_get_usage():

Script start
47 MB memory usage.
$in = file_get_contents("php://input");
63 MB memory usage.
json_decode($in);
PHP terminates, because the allowed memory size is exhausted.

It is interesting that the script already starts with 47 MB of memory usage, before issuing any command. I guess this is due to the large input data? Maybe because PHP also stores it in $HTTP_RAW_POST_DATA?

Is there any php.ini directive I could use to make PHP create fewer of these variables?

JochenJung

1 Answer


E-mail attachments are stored as base64, so the actual e-mail body will be up to about 2 times bigger than the original files; so we have roughly 20 MB.

json_encode (on the sender's side) can also add base64 overhead, so a single file_get_contents call can end up holding about 40 MB; json_decode then needs roughly another 20 MB for the decoded structure. Add some local variables and at least one loop, and the 100 MB limit is exhausted.

I suggest you read about memory_get_usage - use it to trace where PHP allocates memory.
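For example, a minimal tracing sketch (the labels and the error_log destination are just placeholders) could look like this:

// Hypothetical tracing helper: log current and peak memory at each step.
function log_mem($label) {
    error_log(sprintf("%s: %.1f MB (peak %.1f MB)",
        $label,
        memory_get_usage(true) / 1048576,
        memory_get_peak_usage(true) / 1048576));
}

log_mem('script start');

$raw = file_get_contents("php://input");
log_mem('after file_get_contents');

$in = json_decode($raw);
log_mem('after json_decode');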

Then use unset and gc_collect_cycles.
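A rough sketch of that, assuming you only need the decoded structure afterwards:

$raw = file_get_contents("php://input");
$in  = json_decode($raw);

// The raw JSON string is not needed once it has been decoded,
// so free it before doing any further work on $in.
unset($raw);
gc_collect_cycles();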

UPDATE: I'm not sure why json_decode needs so much memory (maybe some bug; try updating PHP?). Anyway, in php.ini:

register_globals = off
register_long_arrays = Off
register_argc_argv = Off
auto_globals_jit = On
always_populate_raw_post_data = Off
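If you want to double-check what is actually in effect at runtime, a quick sketch:

// Print the effective values of the relevant directives.
foreach (array('register_globals', 'register_long_arrays',
               'register_argc_argv', 'auto_globals_jit',
               'always_populate_raw_post_data') as $directive) {
    echo $directive, " = ", var_export(ini_get($directive), true), "\n";
}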

Your 2nd question: base64.

Thus, the actual length of MIME-compliant Base64-encoded binary data is usually about 137% of the original data length

JSON itself should not add a big overhead, but additionally encoding the mail body into JSON could involve base64 again.
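As a rough standalone illustration of that base64 overhead (not Postmark-specific):

$data = str_repeat("\0", 10 * 1048576);   // 10 MB of binary data
$b64  = base64_encode($data);             // ~13.3 MB after base64
$mime = chunk_split($b64, 76, "\r\n");    // ~13.7 MB with MIME line breaks

printf("raw:  %.1f MB\n", strlen($data) / 1048576);
printf("b64:  %.1f MB\n", strlen($b64) / 1048576);
printf("mime: %.1f MB\n", strlen($mime) / 1048576);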

Iłya Bursov
  • Nice. I have not used `memory_get_usage` so far, but I will use it from now on when I have the chance. Upvote from me. – Cristian Bitoi Oct 18 '13 at 20:43
  • Thanks for the debugging hint with memory_get_usage. +1 I just updated my question with the debugging results. – JochenJung Oct 19 '13 at 05:38
  • PHP Version is 5.4.9. I managed to decrease the initial memory size a little (from 47MB to 31MB) by unsetting all global variables after script start. I guess that's all I can do, besides increasing the memory limit. unset($GLOBALS); unset($_SERVER); unset($_GET); unset($_POST); unset($_FILES); unset($_REQUEST); unset($_ENV); unset($_COOKIE); unset($HTTP_RAW_POST_DATA); unset($http_response_header); unset($argc); unset($argv); gc_collect_cycles(); – JochenJung Oct 19 '13 at 06:05
  • @JochenJung look at http://stackoverflow.com/questions/15077870/serial-json-decode-due-to-memory-limit maybe you can go with your own JSON parser, as you know that format is fixed? – Iłya Bursov Oct 19 '13 at 06:07
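Along the lines of that last comment, here is a very rough sketch that skips json_decode entirely and pulls the base64 attachment bodies straight out of the raw payload. It assumes the attachment data sits in "Content" fields; adjust the key and the output path to the real Postmark format:

$raw = file_get_contents("php://input");

// Base64 data contains no quote characters, so the "Content" values
// can be extracted without building the full decoded structure.
// Forward slashes may arrive JSON-escaped as \/, hence stripslashes().
if (preg_match_all('/"Content"\s*:\s*"([A-Za-z0-9+\/=\\\\]*)"/', $raw, $matches)) {
    foreach ($matches[1] as $i => $b64) {
        file_put_contents("/tmp/attachment_$i.bin", base64_decode(stripslashes($b64)));
    }
}
unset($raw, $matches);
gc_collect_cycles();

This still holds the raw body once, but it avoids the extra copy that json_decode builds, which is the step that ran out of memory above.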