
Hi, I'm trying to add multiple keys/values into a hash: basically file names and their data. Each JSON file's content contains hash and array references. The hash holding the file names and data will be handled elsewhere.

This is my code:

    sub getDecode {
        my $self = shift;
        my @arrUrls = (
            'http://domain.com/test.json',
            'http://domain.com/test_two.json',
            'http://domain.com/test3.json',
        );

        my $resQueue   = Thread::Queue->new();
        my $intThreads = 10;
        my @arrThreads = ();
        my %arrInfo    = ();

        foreach my $strUrl (@arrUrls) {
            for ( 1 .. $intThreads ) {
                push @arrThreads, threads->create(sub {
                    while ( my $resTask = $resQueue->dequeue ) {
                        my $resData = get($strUrl);
                        my $strName = basename( $strUrl, '.json' );
                        my $arrData = decode_json($resData);
                        $arrInfo{$strName} = $arrData;
                    }
                });
            }
        }

        $resQueue->enqueue(@arrUrls);
        $resQueue->enqueue(undef) for 1 .. $intThreads;
        $_->join for @arrThreads;

        return %arrInfo;
    }

When I dump %arrInfo with Data::Dumper, no output is given. Please help!

user2524169

1 Answer

You're multithreading without sharing the variable. When a thread spawns, the existing variable space is cloned, so each thread gets its own local copy of %arrInfo, which is discarded when the thread exits.

You need to:

use threads::shared;
my %arrInfo : shared;

You're also doing something odd with your thread-spawn loop: you spawn 10 threads per URL x 3 URLs, i.e. 30 threads, but only queue 3 URLs to process. And you never actually use $resTask; each thread fetches whatever $strUrl it captured at creation time rather than the item it dequeued, which doesn't make a lot of sense.

So I'm prepared to bet your code hangs at the end, because you're trying to join threads that will never finish dequeuing.

You might find $resQueue->end() is more suitable than queuing up undef.
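Putting those pieces together, here is a minimal sketch of how getDecode could be reworked: the worker pool is spawned once, each worker consumes $resTask from the queue, and the results hash is shared. It assumes the original's LWP::Simple (get), File::Basename (basename), and JSON (decode_json) imports. One extra wrinkle worth noting: a reference returned by decode_json must itself be shared (e.g. via threads::shared's shared_clone) before it can be stored in a shared hash.

    use threads;
    use threads::shared;
    use Thread::Queue;

    sub getDecode {
        my $self = shift;
        my @arrUrls = (
            'http://domain.com/test.json',
            'http://domain.com/test_two.json',
            'http://domain.com/test3.json',
        );

        my $resQueue   = Thread::Queue->new();
        my $intThreads = 10;
        my %arrInfo : shared;    # shared, so workers write into the same hash

        # Spawn the worker pool once - not once per URL.
        my @arrThreads = map {
            threads->create(sub {
                # Each worker pulls URLs from the queue until end() is called,
                # after which dequeue returns undef and the loop exits.
                while ( defined( my $resTask = $resQueue->dequeue ) ) {
                    my $resData = get($resTask);
                    my $strName = basename( $resTask, '.json' );
                    # decode_json returns a reference, which must be shared
                    # before it can live inside a shared hash.
                    $arrInfo{$strName} = shared_clone( decode_json($resData) );
                }
            });
        } 1 .. $intThreads;

        $resQueue->enqueue(@arrUrls);
        $resQueue->end();    # no more work; idle workers wake up and exit
        $_->join for @arrThreads;

        return %arrInfo;
    }

This is a sketch under those assumptions, not a drop-in replacement; in particular you may want error handling around get() and decode_json() for unreachable URLs or malformed JSON.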

Example using a shared hash:

use strict;
use warnings;
use threads;
use threads::shared;

use Data::Dumper;

my %test_hash : shared;    # visible to every thread
my %second_hash;           # each thread gets its own private copy

$test_hash{'from_parent'}       = 1;
$second_hash{'from_parent_too'} = 1;

threads->create(
    sub {
        $test_hash{'from_first'}       = 2;
        $second_hash{'from_first_too'} = 2;
    }
);
threads->create(
    sub {
        $test_hash{'from_second'}       = 3;
        $second_hash{'from_second_too'} = 3;
    }
);

foreach my $thr ( threads->list() ) { $thr->join }

# %test_hash contains all three keys; %second_hash shows only
# 'from_parent_too', because each thread wrote to its own clone.
print Dumper \%test_hash;
print Dumper \%second_hash;

For a 'worker threads' style approach, I'd offer: Perl daemonize with child daemons

Sobrique