
My Perl script has weird behaviour which I don't understand. I'm processing a large structure, stored as an array of hashes, which grows as it is processed. The problem is that the structure takes at most about 8 MB when I store it on disk, but while it is being processed it takes about 130 MB of RAM. Why is there such a big difference?

The main flow of processing looks like:

use Storable;  # for Storable::dclone

while (...)
{
    my %new_el = %{ Storable::dclone \%some_el };

    # ...
    # change a few things in new_el
    # ...

    push @$elements_ref, \%new_el;
}

1 Answer


You are making more copies of the data than you need to: dclone already returns a reference to a fresh copy, and dereferencing it into %new_el copies every key and value a second time. Try working with hashrefs rather than dereferencing, as much as possible:

use Storable;

while (...)
{
    # dclone already returns a reference to a fresh copy;
    # keep the hashref instead of flattening it into a new hash
    my $new_el = Storable::dclone \%some_el;

    # ...
    # change a few things in $new_el
    # ...

    push @$elements_ref, $new_el;
}

Even better would be to not clone the entire hash -- perhaps you can get away with altering it in-place?
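For instance, if only a few top-level fields differ between elements, a shallow copy may already be enough. A minimal sketch, assuming hypothetical keys status and id, and assuming any nested references shared with %some_el are never modified afterwards:

while (...)
{
    # Shallow copy: top-level keys and values are copied once;
    # nested references are shared with %some_el, not deep-cloned.
    my $new_el = {
        %some_el,
        status => 'updated',   # hypothetical fields that actually change
        id     => $next_id++,
    };

    push @$elements_ref, $new_el;
}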

  • But what does that change? What do you mean that I'm making more copies than I need? I don't get it. I've changed the code the way you advised, but it changes nothing in memory usage. – jesper Apr 09 '10 at 19:00
  • The difference between dealing with hashes and hashrefs is that you are not making copies of all the keys and values -- but if you are not seeing any improvement, you will have to examine how and why you are copying data, and deal with your data in smaller chunks. – Ether Apr 10 '10 at 16:22
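As for the original 8 MB vs. 130 MB gap: every Perl scalar, array, and hash carries internal bookkeeping (SV/HV headers, hash buckets, pre-allocated slack) on top of the raw data, so an in-memory structure is routinely many times larger than its Storable image even before any extra copying. A quick way to compare the two, sketched here with the CPAN module Devel::Size (an assumption; it is not used anywhere above):

use Storable qw(freeze);
use Devel::Size qw(total_size);

# total_size walks the whole structure and counts Perl's per-value
# overhead; freeze produces the compact byte string that Storable
# would write to disk.
printf "in memory:  %d bytes\n", total_size($elements_ref);
printf "serialized: %d bytes\n", length freeze($elements_ref);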