System
- Docker version 20.10.8, build 3967b7d
- Windows 10 Pro with Docker Desktop
As a requirement, I have to port my Python 3.x application to run on arm/v7
architecture hardware. I have GitHub Workflows that build for the linux/arm64
and linux/amd64
platforms/architectures. One of the dependencies is numpy,
which causes build times to exceed 30 minutes during the build phase.
Its wheel-creation step does not seem to make any progress. To keep my builds simple I avoid Alpine-based
images and stick to slim
images, installing the necessary packages in a multi-stage Docker build.
The Dockerfile looks like the following:
FROM python:3.7-slim AS compile-image
# This prevents Python from writing out pyc files
ENV PYTHONDONTWRITEBYTECODE 1
# This keeps Python from buffering stdin/stdout
ENV PYTHONUNBUFFERED 1
RUN apt-get update
RUN apt-get install -y --no-install-recommends build-essential gcc
RUN python -m venv /opt/venv
# Make sure we use the virtualenv:
ENV PATH="/opt/venv/bin:$PATH"
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY setup.py .
COPY . .
RUN pip install .
FROM python:3.7-slim AS build-image
COPY --from=compile-image /opt/venv /opt/venv
COPY scripts/docker-entrypoint.sh /entrypoint.sh
# Make sure we use the virtualenv:
ENV PATH="/opt/venv/bin:$PATH"
RUN chmod +x /entrypoint.sh
ENTRYPOINT [ "/entrypoint.sh" ]
CMD ["app", "-c", "config.yaml"]
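For context, the arm/v7 build runs under QEMU emulation on my amd64 host. The builder was set up with the usual buildx bootstrap steps, roughly as follows (standard commands, shown here only so the environment is reproducible):

```shell
# Register QEMU binfmt handlers so the amd64 host can emulate arm/v7
docker run --privileged --rm tonistiigi/binfmt --install all

# Create and activate a builder that supports multi-platform builds
docker buildx create --name multiarch --use
docker buildx inspect --bootstrap
```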
Outputs
docker buildx build --platform linux/arm/v7 -t myDockerAcc/pyapp .
[+] Building 162.2s (8/17)
[+] Building 1554.2s (10/17)
=> [internal] load build definition from Dockerfile 0.1s
=> => transferring dockerfile: 1.67kB 0.0s
=> [internal] load .dockerignore 0.1s
=> => transferring context: 2B 0.0s
=> [internal] load metadata for docker.io/library/python:3.7-slim 2.2s
=> [auth] library/python:pull token for registry-1.docker.io 0.0s
=> CACHED [build-image 1/4] FROM docker.io/library/python:3.7-slim@sha256:c2cc09c3de140f59b3065b9518fa7beb5fbedb4414762963bfe01079ce219f2e 0.0s
=> => resolve docker.io/library/python:3.7-slim@sha256:c2cc09c3de140f59b3065b9518fa7beb5fbedb4414762963bfe01079ce219f2e 0.0s
=> [internal] load build context 0.7s
=> => transferring context: 4.77kB 0.7s
=> [compile-image 2/9] RUN apt-get update 31.8s
=> [compile-image 3/9] RUN apt-get install -y --no-install-recommends build-essential gcc 102.7s
=> [compile-image 4/9] RUN python -m venv /opt/venv 55.8s
=> [compile-image 5/9] COPY requirements.txt . 0.3s
=> [compile-image 6/9] RUN pip install --no-cache-dir -r requirements.txt 1361.0s
=> => # Building wheel for numpy (PEP 517): started
=> => # Building wheel for numpy (PEP 517): still running...
=> => # Building wheel for numpy (PEP 517): still running...
Are there certain optimizations one needs to set up or configure for such cross-platform builds so that the wheel-build times for numpy,
scipy,
or pandas
are reduced?
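One idea I have seen suggested, but not yet verified myself, is to pull prebuilt armv7 wheels from the piwheels index instead of compiling from source (piwheels targets Debian-based images like python:3.7-slim). The requirements install step would then change roughly like this:

```dockerfile
# Sketch, not verified: prefer prebuilt wheels and add piwheels as an
# extra index, which hosts armv7 builds of numpy/scipy/pandas
RUN pip install --no-cache-dir \
        --prefer-binary \
        --extra-index-url https://www.piwheels.org/simple \
        -r requirements.txt
```

I would be interested to know whether this, or something like BuildKit cache mounts for pip's cache, is the recommended approach.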