
I'm reading a 44 GB JSON file output by blastp with the code below:

#!/usr/bin/env perl

use strict;
use warnings FATAL => 'all';
use autodie ':all';
use Devel::Confess 'color';
use JSON qw(decode_json);
use utf8;
use open ':std', ':encoding(UTF-8)';    # For say to STDOUT.  Also default for open()

sub json_file_to_ref {
    my $json_filename = shift;
    open my $fh, '<:raw', $json_filename; # Read it unmangled
    local $/;                     # Read whole file
    my $json = <$fh>;             # This is UTF-8
    $json =~ s/\bNaN\b/"NaN"/g;   # Quote bare NaN tokens so the parser accepts them
    return decode_json($json);    # Returns a Perl data structure
}

my $r = json_file_to_ref('blastp.results/homo.sapiens.json');

I get the error Out of memory!

Is there some clever way of reading in a huge JSON file like this piece by piece, to bypass the RAM limit?
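For reference, one piece-by-piece approach is the incremental parser in JSON::PP (in core since Perl 5.14; JSON::XS offers the same `incr_parse` API, much faster). The sketch below assumes the input can be treated as a stream of concatenated JSON records, so only one decoded record is in memory at a time; if the file is one monolithic object, an event-driven parser such as JSON::SL would be needed instead. The sample data and field names here are made up for illustration, with an in-memory filehandle standing in for the real file:

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use JSON::PP;   # core module; JSON::XS has the same incr_parse interface

# Demo input: two concatenated JSON records (hypothetical fields).
# For the real file, open the path with '<:raw' instead.
my $data = '{"hit":1,"score":NaN}{"hit":2,"score":3.5}';
open my $fh, '<', \$data or die "open: $!";

my $parser = JSON::PP->new;
my @scores;
while (read $fh, my $chunk, 32) {        # use e.g. 1 MiB chunks on a real file
    $chunk =~ s/\bNaN\b/"NaN"/g;         # same NaN workaround; note a token
                                         # split across a chunk boundary would
                                         # be missed -- buffer if that matters
    # In list context, incr_parse returns every top-level value completed
    # by this chunk; partial input is buffered until the next call.
    for my $record ($parser->incr_parse($chunk)) {
        push @scores, $record->{score};  # process one record, then let it go
    }
}
print "@scores\n";
```

Each complete record is handed back as soon as its closing brace arrives, so peak memory is roughly one record plus the parser's buffer, not the whole file.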
