I have a Python-based SimpleXMLRPCServer
similar to this:
from multiprocessing import Process
from SimpleXMLRPCServer import SimpleXMLRPCServer
from SimpleXMLRPCServer import SimpleXMLRPCRequestHandler
import SocketServer

class RPCThreading(SocketServer.ThreadingMixIn, SimpleXMLRPCServer):
    pass

# Restrict to a particular path.
class RequestHandler(SimpleXMLRPCRequestHandler):
    rpc_paths = ('/RPC2',)

def main():
    server = RPCThreading(('127.0.0.1', 8000), requestHandler=RequestHandler)
    server.register_function(tester1)
    server.register_function(tester2)
    print("Server running...")
    server.serve_forever()

def tester1(id):
    p = Process(target=my_func1, args=(id,))
    p.start()
    return True

def tester2(id):
    p = Process(target=my_func2, args=(id,))
    p.start()
    return True
I want to keep track of how many of the processes spawned by tester1 and tester2 are currently running, and if more than a user-defined maximum are still executing, queue each new request and run it only once the count drops below that threshold.
Maybe a shared Pool for each function?
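Roughly what I had in mind is the sketch below (only a rough sketch, reusing the RPCThreading and RequestHandler classes from the snippet above; MAX_CONCURRENT is a placeholder for the user-defined limit, and my_func1/my_func2 are the same worker functions as before). A fixed-size Pool seems like it would do the queuing for me, since apply_async just hands the call to whichever of the MAX_CONCURRENT workers becomes free:

from multiprocessing import Pool

MAX_CONCURRENT = 4  # placeholder for the user-defined limit

def tester1(id):
    # apply_async queues the call; at most MAX_CONCURRENT my_func1
    # calls run at once, the rest wait in the pool's task queue
    pool1.apply_async(my_func1, (id,))
    return True

def tester2(id):
    pool2.apply_async(my_func2, (id,))
    return True

def main():
    global pool1, pool2
    # one fixed-size pool per function, shared by all request-handler threads
    pool1 = Pool(processes=MAX_CONCURRENT)
    pool2 = Pool(processes=MAX_CONCURRENT)
    server = RPCThreading(('127.0.0.1', 8000), requestHandler=RequestHandler)
    server.register_function(tester1)
    server.register_function(tester2)
    print("Server running...")
    server.serve_forever()

Would that work safely with the ThreadingMixIn server, or is there a cleaner way to count the running processes and queue the overflow?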