
I've found the solution for the previous problem, but there is another one.

For this one, I haven't found any fix yet.

The code:

[...]
use HTTP::Daemon;
use Parallel::ForkManager;

# ForkManager with, for example, a maximum of 3 processes
[...]

while (1)
{
    $inputcon = $daemon->accept();

    $pm->start and next;    # fork; the parent skips to the next accept()

    do_client_stuff($inputcon);

    $pm->finish();
}

When I run wgets against this script, everything works OK: I see the children in the process list, but there is a problem with the last one (always the last one).

The last child process always stays a zombie. When I do one more wget, that zombie is reaped normally and another one (the child from the current wget request) becomes a zombie:

5989 pts/5    S+     0:00      \_ grep test.pl
5975 pts/4    S+     0:00      \_ /usr/bin/perl ./test.pl
5987 pts/4    Z+     0:00          \_ [test.pl] <defunct>

So the last child process is always a zombie. I don't know why all the other processes work OK but the last one doesn't.
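What the ps listing shows can be reproduced with core Perl alone, without Parallel::ForkManager: a child that has exited but has not been waited on stays `<defunct>` until the parent calls waitpid. A minimal sketch (not the asker's code):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX ":sys_wait_h";

my $pid = fork() // die "fork: $!";
exit 0 if $pid == 0;    # child exits immediately

sleep 1;                # the child is now a zombie: ps would show Z+ / <defunct>

# the parent reaps it; WNOHANG makes the call non-blocking
my $reaped = waitpid($pid, WNOHANG);
print "reaped pid $reaped\n" if $reaped == $pid;
```

Until that waitpid call runs, the kernel keeps the exited child's entry around precisely so the parent can still collect its exit status.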

Any hint, solution?

Thank you.

// sorry for my english


Here is the sample code. One wget on 127.0.0.1:8080 turns the child process into a zombie. The script keeps working, though: each new request produces a new zombie PID.

#!/usr/bin/perl

use strict;
use warnings;

use HTTP::Daemon;
use Parallel::ForkManager;

my $daemon = HTTP::Daemon->new(
    LocalPort => 8080,
    LocalAddr => "127.0.0.1",
    Listen    => 64,
    ReuseAddr => 1,
) or die "$!";

my $pm = Parallel::ForkManager->new(3);

while (1)
{
    my $inputcon = $daemon->accept();

    $pm->start and next;    # fork; the parent skips to the next accept()

    do_client_stuff($inputcon);

    $pm->finish();
}

sub do_client_stuff
{
    my ($inputcon) = @_;

    my $request = $inputcon->get_request;

    # get_request returns an HTTP::Request object (or undef on error)
    print $request->as_string, "\n" if $request;

    $inputcon->send_error(403);
}
3 Answers


You're missing

$pm->wait_all_children;
  • I have this, but it is after the while loop, so it only runs when I kill the process 'gently'. Should I add it somewhere inside? – makowiecki May 28 '13 at 20:01
  • No, `finish` will reap inside the loop as needed. – ikegami May 28 '13 at 20:01
  • As you can see, this is not working. The last one is always a zombie. – makowiecki May 28 '13 at 20:03
  • I meant `start`, not `finish`. – ikegami May 28 '13 at 20:19
  • And I already explained how to reap that process. You're missing `$pm->wait_all_children;`. – ikegami May 28 '13 at 20:20
  • Your suggestion does not work. I added this to my code but nothing changed at all: last child = zombie. – makowiecki May 29 '13 at 09:22
  • As you can see, this loop never ends, so calling wait_all_children outside it is useless, I think... – makowiecki May 29 '13 at 09:48
  • When you call start() and the number of processes has reached the max, it waits for a process to finish, reaps it, and starts the new process. If your loop never ends, there will always be at least one unreaped process. That's how it works; there is nothing wrong here, so nothing to fix. – runrig May 29 '13 at 15:44
  • So the only zombie is the one you've already admitted gets reaped? Then what's the problem? – ikegami May 29 '13 at 16:56
  • When I try this with Proc::Queue and fork, there's no problem; it works like a charm without any zombies. But when I try to reproduce this simple code with Parallel::ForkManager, I have 'walkers', damn... So there is no solution for this? In the while loop the last child process will always be a zombie? `$SIG{CHLD} = 'IGNORE'` fixes the problem for P::FM, but after MAX children P::FM gets stuck. – makowiecki May 29 '13 at 18:44
  • Proc::Queue forks without reaping. You explicitly call waitpid (which reaps) after forking when using that library. Again, what's the problem? – runrig May 30 '13 at 21:02
  • "When I try this with Proc::Queue and fork, there's no problem": you have yet to describe any problem with P::FM either. – ikegami May 30 '13 at 21:05
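The mechanism runrig describes can be simulated with core fork/waitpid (a sketch of the behavior, not Parallel::ForkManager's actual code): reaping happens only when a new child is started, so once the loop stops starting children, the last ones to exit stay unreaped until the wait_all_children equivalent runs.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $max = 3;
my @pool;    # PIDs of running (or finished-but-unreaped) children

sub start_child {
    # like start(): when at the limit, block until a child
    # finishes, reap it, then fork the new one
    if (@pool >= $max) {
        my $done = waitpid(-1, 0);
        @pool = grep { $_ != $done } @pool;
    }
    my $pid = fork() // die "fork: $!";
    exit 0 if $pid == 0;    # the child's work would go here
    push @pool, $pid;
}

start_child() for 1 .. 5;
# no more start_child() calls: the remaining children become
# zombies when they exit, exactly like the last child in the question

waitpid($_, 0) for @pool;   # the wait_all_children equivalent
print "all children reaped\n";
```

In an infinite accept loop the line after the loop is never reached, which is why the last zombie lingers until the next request triggers another start().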

It can be fixed by using the following code:

my $pm = Parallel::ForkManager->new(3);
$SIG{CHLD} = sub{  Parallel::ForkManager::wait_children($pm) };
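A caveat (my reading, not something the answer states): a custom $SIG{CHLD} handler that reaps can race with the module's own waitpid calls, which is consistent with the asker's report that `$SIG{CHLD} = 'IGNORE'` made P::FM hang after MAX children. For reference, the conventional non-blocking reaper in core Perl, independent of any module, looks like this:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX ":sys_wait_h";

my $reaped = 0;

# reap every finished child without blocking; the while loop matters
# because several children may have exited behind one pending signal
$SIG{CHLD} = sub {
    while ((my $pid = waitpid(-1, WNOHANG)) > 0) {
        $reaped++;
    }
};

my $pid = fork() // die "fork: $!";
exit 0 if $pid == 0;    # child exits immediately

sleep 2;                # interrupted early by SIGCHLD
print "reaped $reaped child(ren)\n";
```

If such a handler is installed, any waitpid the module itself calls afterwards may find no children left to wait for, so mixing the two approaches needs care.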

Maybe you should make your child processes sleep a while? I had a similar problem with zombies, but in my case it was file reading: the children did their work faster than the next line of the file was read, and making them sleep solved the zombie problem.
