I have a project with the following structure:
vineetkalghatgi@vinux:~/personal-projects/bertQA_server$ ls
bert-server-env cloudbuild.yaml Dockerfile main.py mymodel Procfile __pycache__ README.md requirements.txt target
My .dockerignore:
Dockerfile
README.md
*.pyc
*.pyo
*.pyd
__pycache__
.pytest_cache
bert-server-env
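Nothing in the .dockerignore above should match the mymodel folder. As a quick local sanity check, here is a sketch of the matching (an approximation: Docker's real .dockerignore rules use Go filepath patterns, emulated here with Python's fnmatch):

```python
# Approximate .dockerignore matching with fnmatch to confirm that
# "mymodel" is not excluded by any of the patterns above.
from fnmatch import fnmatch

ignore_patterns = [
    "Dockerfile", "README.md", "*.pyc", "*.pyo", "*.pyd",
    "__pycache__", ".pytest_cache", "bert-server-env",
]

def is_ignored(path: str) -> bool:
    # A path is excluded if any pattern matches it.
    return any(fnmatch(path, pat) for pat in ignore_patterns)

print(is_ignored("mymodel"))   # False: no pattern matches the model folder
print(is_ignored("main.pyc"))  # True: matches *.pyc
```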
This is my Dockerfile:
FROM tensorflow/tensorflow
# Allow statements and log messages to immediately appear in the Knative logs
ENV PYTHONUNBUFFERED True
# Copy local code to the container image.
ENV APP_HOME /app
WORKDIR $APP_HOME
ADD mymodel .
COPY . .
# Install dependencies
RUN apt-get update \
&& apt-get upgrade -y \
&& apt-get autoremove -y \
&& apt-get clean
# Install production dependencies.
RUN pip3 install -r requirements.txt
# Run the web service on container startup. Here we use the gunicorn
# webserver, with one worker process and 8 threads.
# For environments with multiple CPU cores, increase the number of workers
# to be equal to the cores available.
# Timeout is set to 0 to disable the timeouts of the workers to allow Cloud Run to handle instance scaling.
CMD exec gunicorn --bind :$PORT --workers 1 --threads 8 --timeout 0 main:app
I previously had just COPY . . without the ADD line, and noticed that the 'mymodel' folder was not getting copied.
If I run gcloud builds submit . --tag <my-gcr.io-container>
I get the following error:
Step 5/9 : ADD mymodel .
ADD failed: file not found in build context or excluded by .dockerignore: stat mymodel: file does not exist
However, I am able to build the image locally with docker build with no issues.
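From what I understand, gcloud builds submit filters the uploaded source with a .gcloudignore file rather than .dockerignore, and when no .gcloudignore exists it generates one from .gitignore, so a folder like mymodel that is git-ignored could be silently dropped from the upload even though a local docker build sees it. A minimal explicit .gcloudignore sketch (my assumption about the cause, not confirmed) that would keep mymodel in the upload:

```
# .gcloudignore (sketch): gcloud builds submit consults this file,
# not .dockerignore, when deciding what to upload to Cloud Build
.git
.gitignore
__pycache__
bert-server-env
```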