I am trying to write a metamorphic quine. Without the "spawn" context, the subprocesses seem to inherit the stack, so I ultimately exceed the maximum recursion depth. With the "spawn" context, the subprocess doesn't seem to recurse at all. How would I go about executing the modified AST?

import ast
import inspect
import multiprocessing
import sys

def main():
    module  = sys.modules[__name__]
    source  = inspect.getsource(module)
    tree    = ast.parse(source)

    visitor = Visitor() # TODO mutate
    tree    = visitor.visit(tree)
    tree    = ast.fix_missing_locations(tree)

    ctx     = multiprocessing.get_context("spawn")
    process = ctx.Process(target=Y, args=(tree,))
    # Y() encapsulates these lines, since code objects can't be pickled
    #code    = compile(tree, filename="<ast>", mode='exec', optimize=2)
    #process = ctx.Process(target=exec, args=(code, globals())) # locals()

    process.daemon = True
    process.start()
    # TODO why do daemonized processes need to be joined in order to run?
    process.join()

    return 0

if __name__ == '__main__': exit(main())
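The workaround mentioned in the commented-out lines exists because code objects can't be pickled, while AST nodes can; that is why the tree, not the compiled code, has to cross the process boundary. A quick illustrative check:

```python
import ast
import pickle

tree = ast.parse("x = 1 + 2")
code = compile(tree, filename="<ast>", mode="exec")

# The AST round-trips through pickle...
restored = pickle.loads(pickle.dumps(tree))
print(ast.dump(restored) == ast.dump(tree))   # -> True

# ...but the compiled code object does not.
try:
    pickle.dumps(code)
except TypeError as e:
    print("code objects are not picklable:", e)
```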
  • Why did you turn on `process.daemon = True` if you don't want what it does? – user2357112 Feb 04 '22 at 01:08
  • You don't need to join a daemon process to make it run. Rather, when a process ends, it tries to *kill* all its daemon child processes. That's what's stopping the child from running. – user2357112 Feb 04 '22 at 01:10
  • Also, this isn't metamorphic code, or a quine. – user2357112 Feb 04 '22 at 01:13
  • Thanks for this tip. I falsely assumed that setting that flag would create a daemon process. How to spawn a subprocess without waiting or killing it when the parent terminates? I may have misused the terms. It modifies its own AST, so I thought "metamorphic" would apply. I removed the print statements that dumped the AST before and after the transformations, so I thought it was a "quine," or at least a program that has quine-like behavior. – Innovations Anonymous Feb 04 '22 at 02:17
  • 1
    If you want to create a daemon process in the sense of a Unix daemon, the [python-daemon library](https://pypi.org/project/python-daemon/) handles that. Trying to turn a multiprocessing.Process into a Unix daemon is probably a bad idea, though. multiprocessing is not intended as a general subprocess handling library, even for Python subprocesses. It's designed around creating workers for parallelizing tasks. – user2357112 Feb 04 '22 at 02:23
  • 1
    `multiprocessing` tries very hard (and fails miserably) to pretend it's just like `threading`. The `Process.daemon` flag is intended to be analogous to `Thread.daemon` from `threading`. – user2357112 Feb 04 '22 at 02:25

1 Answer

It really is that easy: `with daemon.DaemonContext(): foo()`. Based on the comments by @user2357112 supports Monica.

import random
import sys
from functools import partial
from typing import Callable, TypeVar

import daemon  # from the python-daemon package

@trace  # trace, status, morph, and Y are assumed to be defined elsewhere
def spawn_child(f:Callable):
    with daemon.DaemonContext(stdin=sys.stdin, stdout=sys.stdout): return f()

I = TypeVar('I')
def ai(f:Callable[[int,], I])->Callable[[int,], I]:
    def g(*args, **kwargs)->int:
        # assuming we have a higher-order function morph()
        # that has a concept of eta-equivalence
        # (e.g., a probabilistic notion),
        # then the recursive call should be "metamorphic"
        O = [morph(f), status, partial(spawn_child, f),]
        i = random.randrange(0, len(O)) # TODO something magickal
        return O[i]()
    return g

def main()->int: return Y(ai)()

if __name__ == '__main__': exit(main())

The next problem is compiling the source for a nested function definition, since f() is not a reference to ai() but to a function defined within Y().
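As for the original question of executing the modified AST: `compile()` accepts a `Module` node directly, and the resulting code object can be `exec`'d against a fresh globals dict. A minimal sketch of that step, with a trivial rename standing in for the real mutation (the names `Visitor`, `greet`, and `salute` are illustrative):

```python
import ast

source = """
def greet():
    return "hello"
"""

tree = ast.parse(source)

class Visitor(ast.NodeTransformer):
    # Stand-in for the real mutation: rename greet() to salute().
    def visit_FunctionDef(self, node):
        node.name = "salute"
        return node

tree = ast.fix_missing_locations(Visitor().visit(tree))

code = compile(tree, filename="<ast>", mode="exec")
namespace = {}
exec(code, namespace)
print(namespace["salute"]())   # -> hello
```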