
I am receiving an out-of-memory error when passing huge file data as an argument. I have checked the memory and there is 2045 MB free. Is that sufficient swap space to sort the records in the hash ref value? Please let me know what checks can be done.


Thanks for your explanation.

I am using the hash as below:

my ($href, @list);
$href = $self->{set}{$ret}  if ($scope eq 'set');
$href = $self->{done}{$ret} if ($scope eq 'done');

# collect the hashref's values in key-sorted order
foreach (sort keys %$href) {
    push @list, $href->{$_};
}

return @list;

It is reading from an Oracle table.
Arun A
  • Your code doesn't really have any link to the file you're reading. But your hashref might be accumulating lots of data. – Sobrique Feb 24 '15 at 19:27

1 Answer


As it stands, your question is pretty broad, so it's hard to give a solid answer. But the biggest question is: what are you doing with your huge data file? If you're loading it into memory, then that's where it's gone.

The biggest gotcha is reading the filehandle in list context, such as in a foreach or when assigning it to an array.

e.g.:

foreach ( <$filehandle> ) {

or

my @stuff = <$filehandle>;

Both of these operations cause the whole file to be read into memory for processing, whereas:

while ( <$filehandle> ) {

and

my $stuff_line = <$filehandle>; 

won't.
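
For example, this pattern (with a hypothetical filename) processes a huge file while holding only one line in memory at a time:

open my $filehandle, '<', 'huge_file.txt'
    or die "Cannot open huge_file.txt: $!";

while ( my $line = <$filehandle> ) {
    chomp $line;
    # work on $line here - only the current line is in memory
}
close $filehandle;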

If you're using a hash, you can 'print' it in a scalar context, which will give you an indication of how big it is:

print scalar %hash;
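
For example (noting that what this prints changed in perl 5.26 - older perls report hash bucket usage, newer ones report the key count):

my %hash = ( a => 1, b => 2, c => 3 );
print scalar(%hash), "\n";   # e.g. "3/8" (buckets used/allocated) before 5.26, "3" from 5.26 on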

Variable scoping can also be a factor - globally scoped variables 'hang around' if you're not careful, so perl can't tidy them up. Perl itself will generally manage its own memory usage and recycle 'free' space internally, so a big number doesn't necessarily mean there's a problem.
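
A minimal sketch of the scoping point (the sizes are arbitrary, just to simulate a big list):

{
    # lexical to this block - perl can reclaim the space once the block ends
    my @lines = ('x' x 100) x 100_000;
    print scalar @lines, " lines held in memory\n";
}
# @lines no longer exists here, so its memory is free for perl to reuse
# (a global, or file-scoped 'my', would hold it for the program's lifetime)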

If you want some more detail, you probably want to look at Devel::Size or similar.
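
For instance, a quick sketch with Devel::Size (a CPAN module, so you may need to install it first):

use strict;
use warnings;
use Devel::Size qw(size total_size);

my %hash = map { $_ => 'x' x 1024 } 1 .. 1000;

# size() measures the hash structure itself; total_size() also follows
# what it references, so it includes the stored strings
print "size:       ", size(\%hash), " bytes\n";
print "total_size: ", total_size(\%hash), " bytes\n";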

Sobrique