9

My PHP script, running on CentOS 5.6 with PHP 5.2.12, uses ZipArchive() and successfully creates .zip files over 1.6 GB, but not archives of 2 GB or larger: PHP aborts with no apparent error. Nothing appears in the PHP error log or on stderr. The script is executed from the command line, not interactively.

The script runs for about 8 minutes while the temporary archive grows. I was watching the file size; the last listing showed the tmp file at 2120011776 bytes, then the tmp file disappeared and the script fell through the logic and executed the code that comes after the archive creation.

For some reason, top shows the CPU still at 95% and a new tmp archive file being created. This goes on for another 5+ minutes, then it silently stops and leaves the uncompleted tmp archive file behind. In this test there were fewer than 4000 expected files.

As noted, the script works just fine when creating smaller archive files.

I tested several different sets of large source data; the result is the same for large archives.

This issue sounds similar to this question: Size limit on PHP's zipArchive class?

I wondered whether the ls -l command was reporting a count of 2K blocks, which would put 2120011776 close to 4 GB, but that size is in bytes: the size of the xxxx.zip.tmpxx file.
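
For illustration, here is the general shape of the archiving code (the paths and file list below are placeholders, not the actual script), with the open() and close() return values checked so a failed close() is at least visible:

    <?php
    // Sketch of the failing pattern (hypothetical paths). open() returns an
    // error code rather than true on failure, and close(), which actually
    // writes the archive, returns false on failure, so both are checked.
    $zip = new ZipArchive();
    $result = $zip->open('/tmp/xxxx.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);
    if ($result !== true) {
        die("ZipArchive::open() failed with code $result\n");
    }

    foreach (glob('/data/source/*') as $file) {
        if (!$zip->addFile($file, basename($file))) {
            fwrite(STDERR, "addFile() failed for $file\n");
        }
    }

    if (!$zip->close()) {
        // getStatusString() needs PHP >= 5.2.7, so it should exist on 5.2.12.
        die('ZipArchive::close() failed: ' . $zip->getStatusString() . "\n");
    }
    echo "archive written\n";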

Thanks!

blainelang
  • If your PHP runs on a 32-bit server, the ZIP extension might be internally limited to 2 GB files (signed integer). I'm not sure the .zip format itself works beyond that anyway. – mario Apr 21 '11 at 14:13
  • If the script isn't totally dying then it's not a time or memory limit issue. There must be a limit in the Zip library. Do the docs suggest any kind of upper limit? – Neil Aitken Apr 21 '11 at 14:35
  • No limit that I have been able to see. I would expect to see a PHP error if memory were exhausted, and as noted the script continues to execute code after the ZipArchive create has completed/failed in my case. – blainelang Apr 21 '11 at 19:46
  • It's not a memory issue, it's a file size issue: if the server is effectively 32-bit, it cannot handle a bigger file. The error might not be obvious because, if the library doesn't check for it, it just wraps around over the file a second time or more; the file doesn't grow, but it is surely corrupted. – Jonatan Cloutier Apr 22 '11 at 13:49
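
A quick way to test the 32-bit theory raised in these comments, as a minimal sketch: on a 32-bit build PHP_INT_SIZE is 4 and PHP_INT_MAX is 2147483647, right around the size at which the tmp file stalls:

    <?php
    // On a 32-bit PHP build PHP_INT_SIZE is 4 and PHP_INT_MAX is 2147483647
    // (about 2 GB), close to the 2120011776-byte tmp file reported above.
    echo 'PHP_INT_SIZE: ' . PHP_INT_SIZE . "\n";
    echo 'PHP_INT_MAX:  ' . PHP_INT_MAX . "\n";
    echo 'machine:      ' . php_uname('m') . "\n"; // e.g. i686 vs x86_64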

4 Answers

3

It could be many things. I'm assuming that you have enough free disk space to handle the process. As others have mentioned, some problems can be fixed either by editing your php.ini file or by calling ini_set() in the code itself.

How much memory does your machine have? If the process exhausts your actual memory, then it makes sense that it would abort consistently after a certain size. So check the free memory before the script starts and monitor it as the script executes.

A third option could be the file system itself. I don't have much experience with CentOS, but some file systems do not allow files over 2 GB. From the product page, though, it seems most file systems available on CentOS can handle it.

A fourth option, which seems the most promising: if you look at the product page linked above, another possible culprit is the "Maximum x86 per-process virtual address space," which is approximately 3 GB. On x86_64 it is about 2 TB, so check the processor type.

Again, it seems like the fourth option is the culprit.
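
On the memory point, a small sketch (an assumed way to instrument the script, not code from this answer) that can be called periodically inside the archiving loop; system-wide free memory can be watched separately with free -m in another terminal:

    <?php
    // Log PHP's own allocations; memory_get_usage(true) and
    // memory_get_peak_usage(true) are available since PHP 5.2.0.
    function log_mem($label) {
        printf("%s: current %d bytes, peak %d bytes\n",
               $label, memory_get_usage(true), memory_get_peak_usage(true));
    }

    log_mem('before zip'); // call again after each batch of addFile() calls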

Shawn Patrick Rice
-1

Have you set the limit variables in PHP?

You can set them in .htaccess or within the PHP script. Inside the script: set_time_limit(0); Inside .htaccess: php_value memory_limit 214572800
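
Written out, the two settings this answer refers to (the memory value is the answer's own; note the asker runs the script from the command line, where .htaccess has no effect, so only the in-script form can matter here):

    <?php
    // In the script: lift the execution time limit and raise the memory
    // ceiling (the value below, roughly 205 MB, is the one from the answer).
    set_time_limit(0);
    ini_set('memory_limit', '214572800');

    # Apache-only alternative in .htaccess (ignored by the CLI):
    #   php_value memory_limit 214572800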

-1

When the file is big it takes time to build the ZIP archive, but php.ini sets a maximum execution time, so you should try increasing that value.

-2

There is a maximum execution time setting in php.ini; perhaps this limit is being hit. Try increasing the value!

There are also different file size limits per OS, so try checking that too!

Sourav
  • The script runs for 8 minutes and doesn't die; the zip method returns before it's complete. This suggests it's not a time limit issue. – Neil Aitken Apr 21 '11 at 14:36