
I seem to be hitting a snag in my program. I have an image gallery with a function to download the raw images, and I currently use CodeIgniter's Zip library. When people download 5-15 images there is no issue whatsoever, but as soon as they try 50+ (or something close) I get an error.
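For reference, this is roughly how the download is wired up now. A minimal sketch, assuming CodeIgniter 3's Zip class; the controller, folder path and file names are placeholders:

    <?php
    // Hypothetical CodeIgniter 3 controller method. The Zip library
    // builds the whole archive in memory before anything is sent,
    // which is where the memory limit bites.
    class Gallery extends CI_Controller
    {
        public function download($folder)
        {
            $this->load->library('zip');

            // read_dir() reads every file in the folder into the in-memory archive.
            $this->zip->read_dir('/var/www/uploads/' . $folder . '/', FALSE);

            // download() only starts sending once the zip is fully built.
            $this->zip->download($folder . '.zip');
        }
    }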

Fatal error: Allowed memory size of 134217728 bytes exhausted.

The error points back to line 410 of the Zip library.

I looked online, but the only so-called solution I found was to edit the php.ini file and raise the limits. The memory limit is currently set to 128 MB, and since we are using a hosting service that does not allow changes to those files, I am at a loss, I suppose.

I tried ini_set('memory_limit','1G');, but when I do that I receive Fatal error: Maximum execution time of 30 seconds exceeded. After that, I raised the time limit as well and got the memory limit error again.
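For clarity, these are the runtime overrides I tried at the top of the controller method:

    // Attempted runtime overrides, since the host won't let us touch php.ini:
    ini_set('memory_limit', '1G'); // clears the memory error at first...
    set_time_limit(0);             // ...but with the 30 s cap removed, the memory error comes back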

Just to note: the biggest folder of images at this time is around 680 MB, but in the future it might grow to around 2-3 GB.

Is there any solution or different thing I can try to get this to work? Maybe a function to split the download up into bite-size chunks? Or something that takes its time and does not eat up memory?

Any ideas would be greatly appreciated.

Shawn
  • You might try reading this one, it may help: https://stackoverflow.com/questions/18546045/php-download-file-limit-max-speed-and-calculate-downloading-speed. Or try using ini_set('memory_limit','10G'); and set_time_limit(0);. – Jenuel Ganawed Jan 20 '20 at 01:14
  • The post you linked is someone asking to limit download speed, and the answer was that there is no option for that in PHP, so I guess that isn't a solution either. Raising the limits might work, but it seems like a clumsy solution that might need raising again in the future. I would love a solution that is easily scalable. – Pieter-Jan Casteels Jan 20 '20 at 07:25

1 Answer


OK, the issue was the buffer on the web server: while the zip file is being compressed, most of the files are kept in the temp cache. I found a solution after a long time, having even given up at some point.

Streaming zip files without temp files

A very good read with a very good explanation of the issues that revolve around this problem, and a solution.

In essence, the solution is to zip the files without compression and stream the archive directly to the user, either by folder or file by file. That way there is no need for the temp cache files, which eliminates the need for high-performance zipping. A sketch of that approach is below.
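For anyone who wants something concrete, here is a minimal sketch of that idea using the third-party ZipStream-PHP library (maennchen/zipstream-php, the 2.x API is assumed here; it is not part of CodeIgniter, and the gallery path is a placeholder). Each file is added with the STORE method, i.e. no compression, and the bytes are flushed to the client as they are read, so memory use stays flat regardless of the archive size:

    <?php
    // composer require maennchen/zipstream-php   (2.x API assumed)
    require 'vendor/autoload.php';

    use ZipStream\Option\Archive;
    use ZipStream\Option\File;
    use ZipStream\Option\Method;
    use ZipStream\ZipStream;

    // Send HTTP headers and flush bytes to the browser as the archive
    // is built; nothing is written to a temp file or held in memory.
    $archiveOptions = new Archive();
    $archiveOptions->setSendHttpHeaders(true);
    $archiveOptions->setFlushOutput(true);

    $zip = new ZipStream('images.zip', $archiveOptions);

    // STORE = no compression: each file is streamed through untouched.
    $storeOnly = new File();
    $storeOnly->setMethod(Method::STORE());

    // Placeholder path; loop over whatever the gallery folder contains.
    foreach (glob('/var/www/uploads/gallery1/*') as $path) {
        if (is_file($path)) {
            $zip->addFileFromPath(basename($path), $path, $storeOnly);
        }
    }

    $zip->finish();

Since images are already compressed formats, deflating them again would barely shrink the archive anyway, so skipping compression costs almost nothing and lets the download start immediately.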