2

I read the contents of a CSV file as:

 //$mypath . '/' . $filename <=> ../abc.csv
 $val = file_get_contents($mypath . '/' . $filename);                                       

 $escaped = pg_escape_bytea($val);

 $model->addFileImport($tmp, $data['email'], $escaped);

My file is about 100MB. In php.ini the setting is: memory_limit = 128M

But it still shows an error: Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to allocate 133120 bytes) in... at the line: $val = file_get_contents($mypath . '/' . $filename);

I tried to fix it by adding ini_set('memory_limit', '-1');:

 //$mypath . '/' . $filename <=> ../abc.csv
 ini_set('memory_limit', '-1');
 $val = file_get_contents($mypath . '/' . $filename);                                       

 $escaped = pg_escape_bytea($val);

 $model->addFileImport($tmp, $data['email'], $escaped);

But it shows this error:

Fatal error: Out of memory (allocated 230686720) (tried to allocate 657099991 bytes) in C:\wamp\www\joomlandk\components\com_servicemanager\views\i0701\view.html.php on line 112

at the line $escaped = pg_escape_bytea($val);

Why? How can I fix that error?

zajonc
mum
  • You ran out of memory, physical memory as far as I can tell. 657099991 bytes is 626.659 Megabytes. Your best bet is to [use a bigger swap file](https://www.google.com/search?q=increase+size+of+swap+file). – Mark Tomlin Jan 18 '13 at 07:49

2 Answers

1

According to the docs:

pg_escape_bytea() escapes string for bytea datatype. It returns escaped string.

When you SELECT a bytea type, PostgreSQL returns octal byte values prefixed with '\' (e.g. \032). Users are supposed to convert back to binary format manually.

Meaning a single input byte can become 4 bytes (e.g. \032), i.e. up to 4 times the initial size.
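
A quick way to see that growth (a tiny illustrative sketch, not from the question; the exact escaped form depends on the PHP and PostgreSQL versions and server settings):

    // illustrative only: escape a few raw bytes and compare sizes
    // (no explicit connection resource is passed, as in the question's code)
    $raw = "\x00\x01\xFF";                          // 3 bytes of binary data
    $escaped = pg_escape_bytea($raw);               // e.g. \\000\\001\\377
    echo strlen($raw) . ' -> ' . strlen($escaped);  // the escaped string is several times larger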

You need a lot of RAM to handle your file (maybe your system cannot allocate that much memory, even without the PHP limit). A solution is to process it in smaller (20-40MB) chunks, read and written, for instance, with the fread() and fwrite() functions.

  $val = file_get_contents($mypath . '/' . $filename);

will take 100MB, thus the next line takes 400MB, for a total of 500MB. You need to read less than *file_get_contents* does, e.g. only 20 (or 40) MB at a time:

  • Read 20MB of the file with fread (instead of file_get_contents)
  • Process that 20MB with *pg_escape_bytea* (total 100MB)
  • Repeat until the file is fully processed (a sketch of this loop follows below)
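
A minimal sketch of that loop (the chunk size, the temporary .escaped output file, and writing the escaped data to disk instead of keeping it in a PHP string are my assumptions for illustration):

    // read the CSV in 20MB chunks so only one chunk is in memory at a time
    $chunkSize = 20 * 1024 * 1024;
    $in  = fopen($mypath . '/' . $filename, 'rb');
    $out = fopen($mypath . '/' . $filename . '.escaped', 'wb'); // hypothetical temp file

    while (!feof($in)) {
        $chunk = fread($in, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        // pg_escape_bytea() escapes byte by byte, so escaping chunk by chunk
        // gives the same result as escaping the whole file at once
        fwrite($out, pg_escape_bytea($chunk));
        unset($chunk); // free the chunk before reading the next one
    }

    fclose($in);
    fclose($out);

The escaped data can then be streamed from the temporary file into the INSERT (or passed to addFileImport in pieces, if it supports that) instead of being held in a single ~500MB string.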
Déjà vu
  • But it shows the error at the line: $escaped = pg_escape_bytea($val); I must convert the string to bytea to insert it into the database. – mum Jan 18 '13 at 07:34
  • Yes, because the 1st line *file_get_contents* takes 100MB, then the *pg_escape_bytea* line takes 4 times that, i.e. 400MB. Total 500MB... This is why the error comes there - if you could *file_get_contents* only (say) 25MB, the total would be only 125MB. See the edit – Déjà vu Jan 18 '13 at 07:42
  • Can I set the memory for the webserver? – mum Jan 18 '13 at 08:21
  • @mum please refer to [How to save memory when reading a file](http://stackoverflow.com/questions/2603807/how-to-save-memory-when-reading-a-file-in-php) – Gordon Jan 18 '13 at 08:22
-1

Try to add:

ini_set('memory_limit', '700M');
Mantas Vaitkūnas
  • It shows the error: Fatal error: Allowed memory size of 734003200 bytes exhausted (tried to allocate 657099991 bytes) in C:\wamp\www\joomlandk\components\com_servicemanager\views\i0701\view.html.php on line 114: $val = file_get_contents($mypath . '/' . $filename); – mum Jan 18 '13 at 07:22