I have a number of containers in a Docker swarm (currently on a single host, but eventually spread across several hosts, forming a multi-node swarm). The system consists of a data server and a set of autonomous workers that process the data received from the server and ingest the results back into it. Input data retrieval and output data ingestion are both done over HTTP GET/POST. The server side (data provision and results reception) works: when tested against a worker running on any host outside Docker, it runs without problems. But I need to place these workers in a Docker swarm.
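To make the worker's traffic pattern concrete, here is a minimal sketch of the GET/POST loop described above. The `/get_task` path appears later in the question; the result-ingestion endpoint name (`/put_result`) and the JSON field names are assumptions, since the real worker script is not shown.

```python
# Sketch of the worker's HTTP loop: GET a task from the data server,
# POST the result back. /put_result and the JSON fields are assumed
# names for illustration only.
import json
import urllib.request

def fetch_task(base_url):
    """GET <server>/get_task and decode the JSON task description."""
    with urllib.request.urlopen(f"{base_url}/get_task") as resp:
        return json.loads(resp.read().decode())

def post_result(base_url, result):
    """POST the processed result back to the server as JSON."""
    req = urllib.request.Request(
        f"{base_url}/put_result",  # endpoint name is an assumption
        data=json.dumps(result).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# In my setup the base URL would be http://10.69.180.30:8080
```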
What I cannot achieve is communication between the swarm workers and the external server. My server runs on a host with IP 10.69.180.30, on port 8080, started as:
python test-dataserver.py -H 10.69.180.30 -p 8080
After digging through several forums, I tried creating an overlay network with --subnet 10.69.180.0/24 and then launching the worker service connected to that network on another host, but it did not work:
docker network create --driver overlay --subnet 10.69.180.0/24 mynet
docker service create --network mynet --name mysrv --replicas 4 \
centos curl http://10.69.180.30:8080/get_task
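One detail I notice while writing this up (included for context, in case it matters): the subnet I gave the overlay network actually contains the server's address, which can be verified with Python's ipaddress module. If that overlap makes the containers route 10.69.180.30 onto the overlay instead of out of the host, it could be related to my problem, but I'm not sure.

```python
# Check that the overlay subnet passed to `docker network create`
# contains the external server's IP address.
import ipaddress

overlay = ipaddress.ip_network("10.69.180.0/24")
server = ipaddress.ip_address("10.69.180.30")
print(server in overlay)  # True: the server address falls inside the overlay subnet
```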
In this simple example I don't run the actual worker; I just curl the server's get_task endpoint, which should return a JSON string with information about the task to be executed. But on the server side I never see any request arrive.
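For reference, a minimal stand-in for what my get_task endpoint does would look roughly like this (the real test-dataserver.py is more involved; the JSON field names here are made up for illustration):

```python
# Minimal stand-in for the data server's /get_task endpoint, assuming
# it returns a JSON task description. Field names are illustrative.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class TaskHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/get_task":
            body = json.dumps({"task_id": 1, "payload": "example"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep the sketch quiet
        pass

def serve(host="0.0.0.0", port=8080):
    """Serve forever on the given address (8080 matches my setup)."""
    HTTPServer((host, port), TaskHandler).serve_forever()
```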
I'm not a network expert, so I might be missing something trivial... (BTW, I first tried the same thing with the server running on the same host as the swarm, also without success.)
Do you know if this setup is possible? I would very much appreciate any hint to make it work...
Thanks in advance!