Given the following Dockerfile:

```dockerfile
FROM python:3.9
WORKDIR /code
COPY ./requirements.txt /code/requirements.txt
RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt
COPY ./app /code/app
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "80"]
```
When we build it multiple times on the same machine, my understanding is that the cached Docker layer will be reused for the `RUN pip install` step unless requirements.txt changes. However, on a fresh machine without the layer cache, and after new package versions have been released, the same Dockerfile will lead to different packages being installed, correct?
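To make the concern concrete, here is a hypothetical requirements.txt with unpinned entries (the package names are just examples), which `pip install -r` would resolve to whatever versions are newest at build time:

```text
# requirements.txt -- unpinned, so the resolved versions
# depend on when and where the image is built
fastapi
uvicorn[standard]
```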
If yes, what is the best practice to ensure
- reproducible builds
- fast builds using Docker layer caching
- use of the newest available packages?
I could envision that using e.g. `pip-compile --upgrade` from pip-tools could be helpful, but I understand too little about how Docker caches text files.
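For concreteness, here is a sketch of the pip-tools workflow I have in mind (assuming pip-tools is installed; the filenames and package names are just examples):

```shell
# requirements.in lists only the top-level, possibly unpinned dependencies:
#   fastapi
#   uvicorn[standard]

# Compile it into a fully pinned requirements.txt, run on the host,
# not inside the Dockerfile, so the output is a committed text file:
pip-compile --generate-hashes --output-file=requirements.txt requirements.in

# Later, to pull in newly released versions, re-resolve everything:
pip-compile --upgrade --generate-hashes --output-file=requirements.txt requirements.in
```

The Dockerfile would keep `COPY`ing the pinned requirements.txt as above, so (if I understand layer caching correctly) the `pip install` layer is only invalidated when the compiled file's contents actually change.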