
I have a simple Python program that I want to run in IBM Cloud Functions. Alas, it needs two libraries (O365 and PySnow), so I have to Dockerize it, and it needs to be able to accept a JSON feed from STDIN. I succeeded in doing this:

FROM python:3
ADD requirements.txt ./
RUN pip install -r requirements.txt
ADD ./main ./main
WORKDIR /main
CMD ["python", "main.py"]

This runs with: cat env_var.json | docker run -i f9bf70b8fc89
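For reference, a minimal sketch of what such a `main.py` could look like — the `handle` function and its return value are placeholders, not the real O365/PySnow logic:

```python
import json
import sys

def handle(raw):
    # Parse the JSON document that was piped in on STDIN
    params = json.loads(raw)
    # ... call O365 / PySnow with the parsed parameters here ...
    return {"received": sorted(params.keys())}

if __name__ == "__main__":
    # cat env_var.json | docker run -i <image> reaches us via STDIN
    raw = sys.stdin.read()
    if raw.strip():
        print(json.dumps(handle(raw)))
```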

I've added the Docker container to IBM Cloud Functions like this:

ibmcloud fn action create e2t-bridge --docker [username]/e2t-bridge

However, when I run it, it times out.

Now I did see a possible solution route, where I Dockerize it as an OpenWhisk application. But for that I need to create a binary from my Python application and then load it into a rather complicated OpenWhisk skeleton, I think?

But having a file you can simply run is the whole point of my Docker setup, so creating a binary of an interpreted language and then adding it to an OpenWhisk Docker image just feels awfully clunky.

What would be the best way to approach this?

Herman

1 Answer


It turns out you don't need to create a binary; you just need to edit the OpenWhisk skeleton like so:

# Dockerfile for example whisk docker action
FROM openwhisk/dockerskeleton

ENV FLASK_PROXY_PORT 8080

# Add source file(s) and install the dependencies
ADD requirements.txt /action/requirements.txt
RUN cd /action; pip install -r requirements.txt

# Move the source files into /action
ADD ./main /action
# Rename our Python action to the executable the proxy expects
ADD /main/main.py /action/exec

CMD ["/bin/bash", "-c", "cd actionProxy && python -u actionproxy.py"]

And make sure that your Python code accepts its JSON input as the first command-line argument (the dockerskeleton proxy passes it via argv, not STDIN):

import json
import sys

json_input = json.loads(sys.argv[1])
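Put together, a minimal sketch of the `/action/exec` file could look like this — the `run` function is a placeholder for the real O365/PySnow logic:

```python
#!/usr/bin/env python
import json
import sys

def run(params):
    # Placeholder: replace with the real O365 / PySnow work
    return {"echo": params}

if __name__ == "__main__":
    # The dockerskeleton proxy invokes this file with the action
    # parameters serialized as a single JSON string in argv[1]
    params = json.loads(sys.argv[1]) if len(sys.argv) > 1 else {}
    # The last line printed to stdout must be the JSON result
    print(json.dumps(run(params)))
```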

The whole explanation is here: https://github.com/iainhouston/dockerPython

Herman