
I'm interested in running two servers within a single Python script running in a Docker container. My initial dev setup uses pynetdicom (see mpps-scp for an example). The main thing of interest there is the line that starts the server running; in my file:

ae.start_server(('python_mpps', 11112), block=True, evt_handlers=handlers)

There is more detailed documentation here: start_server

pynetdicom implements a DICOM server. I don't think it has the ability, nor was it designed, to respond to API requests over HTTP.

I do have the requests module in my container for sending requests to other servers, but I'd also like to add an HTTP listener, something like Flask, so the same Python script that runs when I build and start my Docker container can also receive and process API requests.

I guess the basic stub for Flask is something like the one below, although I've never used Flask before. I was able to verify that this basic setup works when launched without pynetdicom by using curl against http://ip:5000/companies, where port 5000 is mapped to 5000 on my host.

from flask import Flask, jsonify

companies = [{"id": 1, "name": "Company One"}, {"id": 2, "name": "Company Two"}]

api = Flask(__name__)

@api.route('/companies', methods=['GET'])
def get_companies():
    return jsonify(companies)

api.run(host='0.0.0.0', port=5000, debug=True)

I'd really like to run both servers within the same script because they share a number of methods that are generic to processing both DICOM and API requests. I could probably create separate scripts for each server, but then I'd have to maintain the methods in each, or somehow load a common library of methods into each server script.

Just wondering what the best setup and approach could be. I don't know anything about threading in Python for this kind of application.
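One common approach, sketched below under the assumption that Flask's development server is acceptable for now: run Flask in a daemon thread and let pynetdicom's blocking start_server() own the main thread. The pynetdicom lines are commented placeholders to be adapted to the existing AE/handler setup; this is a sketch, not a production configuration.

```python
import threading

from flask import Flask, jsonify

api = Flask(__name__)
companies = [{"id": 1, "name": "Company One"}, {"id": 2, "name": "Company Two"}]

@api.route('/companies', methods=['GET'])
def get_companies():
    return jsonify(companies)

def run_flask():
    # The debugger/reloader only work in the main thread, so both are off here.
    api.run(host='0.0.0.0', port=5000, debug=False, use_reloader=False)

if __name__ == '__main__':
    # Start Flask as a daemon thread so it exits with the main process.
    threading.Thread(target=run_flask, daemon=True).start()

    # pynetdicom then blocks the main thread (placeholder, mirrors the
    # start_server line quoted above):
    # from pynetdicom import AE
    # ae = AE()
    # ae.start_server(('python_mpps', 11112), block=True, evt_handlers=handlers)
```

Because both servers live in one process, shared helper methods can be plain module-level functions called from both the Flask routes and the pynetdicom event handlers.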

Thank you.

    The recommended setup would really be two separate containers. It's not impossible to force it, but this doesn't sound like a scenario where you have any reasons to prefer that. – tripleee Jul 06 '21 at 14:32
  • That might work. I guess this is a basic framework for Flask, https://dev.to/alissonzampietro/the-amazing-journey-of-docker-compose-17lj One issue I noticed recently is that if I bind 2 different containers to a folder on the host, in some cases there is a lock on the folder and one container cannot write to it. Not sure if that is my docker setup or if one of the containers is putting a lock on the folder. If I run pynetdicom and flask in separate containers I would want to at least share a folder on the host. – SScotti Jul 06 '21 at 15:43
  • Is there any limit on the number of Docker containers? I'm up to about 8 now with various degrees of networking. – SScotti Jul 06 '21 at 15:45
  • So, I built a Flask container and see this: "* Environment: production / WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead." Can I use https://hub.docker.com/r/tecktron/python-waitress to migrate to production later without too much trouble? – SScotti Jul 06 '21 at 16:14
  • python_mwl_api_1 | * Serving Flask app 'api' (lazy loading) / * Environment: development / * Debug mode: on. Easy enough for now. – SScotti Jul 06 '21 at 16:38
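On the waitress question in the comments: a minimal sketch, assuming the same Flask app object as above and that waitress is installed in the image (waitress is one choice here; any production WSGI server works the same way). The routes and app code stay unchanged; only the startup call differs.

```python
from flask import Flask, jsonify

api = Flask(__name__)

@api.route('/companies', methods=['GET'])
def get_companies():
    return jsonify([{"id": 1, "name": "Company One"}, {"id": 2, "name": "Company Two"}])

if __name__ == '__main__':
    # waitress replaces the development server: serve() instead of api.run().
    # Requires `pip install waitress` in the container image.
    from waitress import serve
    serve(api, host='0.0.0.0', port=5000)
```

This removes the "Do not use it in a production deployment" warning, since Flask's built-in server is no longer the one listening.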

0 Answers