
I'm using this Dockerfile as part of this docker compose file.

Right now, every time I want to add a new pip requirement, I stop my containers, add the new pip requirement, run docker-compose -f local.yml build, and then restart the containers with docker-compose -f local.yml up. This takes a long time, and it even looks like it's recompiling the container for Postgres if I just add a pip dependency.

What's the fastest way to add a single pip dependency to a container?

Myer

1 Answer


This is related to the fact that the Docker build cache is being invalidated. When you edit requirements.txt, the step RUN pip install --no-cache-dir -r /requirements/production.txt and all subsequent instructions in the Dockerfile are invalidated, so they get re-executed.
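To illustrate the caching behavior, here is a hypothetical Dockerfile layout (the base image and paths are illustrative, not necessarily the asker's actual ones). Because the requirements file is copied and installed before the application code, a code-only change leaves the pip install layer cached, while a requirements change invalidates it and everything after it:

```dockerfile
# Illustrative sketch: layer order determines what the cache can reuse.
FROM python:3

# Copy only the requirements file first; this layer and the install
# below stay cached until the requirements file itself changes.
COPY ./requirements /requirements
RUN pip install --no-cache-dir -r /requirements/production.txt

# Application code changes frequently; copying it last means a
# code-only rebuild reuses the cached pip install layer above.
COPY . /app
WORKDIR /app
```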

As a best practice, you should avoid invalidating the build cache as much as possible. This is achieved by moving the steps that change often toward the bottom of the Dockerfile. While developing, you can edit the Dockerfile and add separate pip installation steps at the end.

...

USER django

WORKDIR /app

RUN pip install --no-cache-dir <new package>
RUN pip install --no-cache-dir <new package2>

...

And once you are sure of all the dependencies needed, add them to the requirements file. That way you avoid invalidating the build cache early on, and a rebuild only re-executes the steps from the installation of the new packages onward.
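You can also limit the rebuild to the one service that changed, so the Postgres image is not touched at all. Assuming the compose service is named django (a guess — substitute your actual service name from local.yml), something like:

```shell
# Rebuild only the Python service's image; other services' images
# (e.g. Postgres) are not rebuilt.
docker-compose -f local.yml build django

# Recreate just that container without restarting its dependencies.
docker-compose -f local.yml up -d --no-deps django
```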

yamenk
  • One question - am I wrong in thinking that given the docker compose file it recompiles the unrelated containers as well upon docker-compose build? – Myer Jan 27 '18 at 10:25
  • That shouldn't be the case, unless other images are being invalidated due to changes in files being copied into them. – yamenk Jan 27 '18 at 11:08