
I use Docker in both development and production for a Laravel project, with slightly different Dockerfiles for each. For example, in the development environment I mount a local directory into the container so that I don't need to run docker build for every code change.

As the mounted directory is only available when the container is running, I can't put commands like "composer install" or "npm install" in the development Dockerfile.

Currently I am managing two Dockerfiles. Is there any way to do this with a single Dockerfile, deciding which commands to run during docker build by passing parameters?

What I am trying to achieve is

In docker file

...
IF PROD THEN RUN composer install
...

During docker build

docker build [PROD] -t mytag .
– Dev Khadka

6 Answers


As a best practice you should aim to use one Dockerfile, to avoid unexpected differences between environments. However, you may have a use case where you cannot do that.

The Dockerfile syntax is not rich enough to support such a scenario on its own, but you can use shell scripts to achieve it.

Create a shell script called install.sh that does something like:

if [ "${ENV}" = "DEV" ]; then
    composer install
else
    npm install
fi

In your Dockerfile, copy the script in and execute it during the build:

...
COPY install.sh install.sh
RUN chmod u+x install.sh && ./install.sh
...

When building, pass a build arg to specify the environment, for example:

docker build --build-arg "ENV=PROD" ...
– yamenk
  • Make sure to add ARG ENV under the first FROM in the Dockerfile in order to pick up the build args. At least that's what I had to do. Thanks, this was really useful. – MBillau Mar 03 '22 at 16:51
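Putting the answer and the comment above together, a minimal Dockerfile might look like the sketch below. The base image and script paths are illustrative, not taken from the question:

```dockerfile
# Illustrative base image; use whatever your project needs.
FROM php:8-cli

# Declare the build arg after FROM so --build-arg is picked up,
# then expose it to the script as an environment variable.
ARG ENV=PROD
ENV ENV=${ENV}

COPY install.sh install.sh
RUN chmod u+x install.sh && ./install.sh
```

With the default value on ARG, a plain `docker build .` behaves like a production build unless you explicitly pass `--build-arg "ENV=DEV"`.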

UPDATE (2020): Since this was written 3 years ago, many things have changed (including my opinion about this topic). My suggested way of doing this, is using one dockerfile and using scripts. Please see @yamenk's answer.

ORIGINAL:

You can use two different Dockerfiles.

# ./Dockerfile (non production)
FROM foo/bar
MAINTAINER ...

# ....

And a second one:

# ./Dockerfile.production
FROM foo/bar
MAINTAINER ...

RUN composer install

While calling the build command, you can tell which file it should use:

$> docker build -t mytag .
$> docker build -t mytag-production -f Dockerfile.production .
– Michael Hirschler

You can use build args directly without an additional shell script. It might look a little messy, but it works.

The Dockerfile would look like this:

FROM alpine
ARG mode
RUN if [ "x$mode" = "xdev" ] ; then echo "Development" ; else echo "Production" ; fi

And commands to check are:

docker build -t app --build-arg mode=dev .
docker build -t app --build-arg mode=prod .
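The conditional in that RUN line can be tried locally before baking it into the Dockerfile. A quick sketch, where the function argument plays the role of the build arg (empty when --build-arg is omitted):

```shell
# Simulates the RUN line from the Dockerfile above. The "x" prefix
# keeps the test well-formed even when the variable is empty.
check_mode() {
    mode="$1"
    if [ "x$mode" = "xdev" ]; then
        echo "Development"
    else
        echo "Production"
    fi
}

check_mode dev    # prints "Development"
check_mode prod   # prints "Production"
check_mode ""     # an omitted build arg falls through to "Production"
```

Note that omitting --build-arg entirely leaves $mode empty, which this conditional treats as production.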
– Nick Roz

I have tried several approaches to this, including using docker-compose, a multi-stage build, passing an argument through a file and the approaches used in other answers. My company needed a good way to do this and after trying these, here is my opinion.

The best method is to pass the argument on the command line. You can also pass it in VS Code by right-clicking the Dockerfile and choosing Build Image. Use this in the Dockerfile:

ARG BuildMode
RUN echo $BuildMode
RUN if [ "$BuildMode" = "debug" ] ; then apt-get update \
    && apt-get install -y --no-install-recommends \
       unzip \
    && rm -rf /var/lib/apt/lists/* \
    && curl -sSL https://aka.ms/getvsdbgsh | bash /dev/stdin -v latest -l /vsdbg ; fi

and in the build stage of the Dockerfile:

ARG BuildMode
ENV Environment=${BuildMode:-debug}
RUN dotnet build "debugging.csproj" -c $Environment -o /app

FROM build AS publish
RUN dotnet publish "debugging.csproj" -c $Environment -o /app
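For completeness, the matching build invocation would be something like the following (the tag name is illustrative):

```shell
docker build --build-arg BuildMode=release -t debugging:release .
```

Omitting --build-arg leaves BuildMode empty, so the `${BuildMode:-debug}` default kicks in and the image is built in debug mode.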
– jorence jovin

The best way to do it is with an .env file in your project. Define two variables, CONTEXTDIRECTORY and DOCKERFILENAME, and create Dockerfile-dev and Dockerfile-prod.

This is an example of using it:

docker compose file:

services:
  serviceA:
    build: 
      context: ${CONTEXTDIRECTORY:-./prod_context}
      dockerfile: ${DOCKERFILENAME:-./nginx/Dockerfile-prod}

.env file in the root of project:

CONTEXTDIRECTORY=./
DOCKERFILENAME=Dockerfile-dev

Be careful with the context: its path starts from the directory of the Dockerfile you specified, not from the docker-compose directory.

For the default values I use prod, because if you forget to specify the env variables, you won't be able to accidentally build a dev version in production.

A solution with different Dockerfiles is more convenient than scripts; it's easier to change and maintain.

  • wow, I did not know about that construction working like if empty then this `${CONTEXTDIRECTORY:-./prod_context}` thanks! – bora89 Mar 29 '23 at 11:49
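The `${VAR:-default}` construction the comment refers to isn't specific to compose: it's standard POSIX parameter expansion, which compose borrows. A minimal sketch in plain shell:

```shell
# ${VAR:-default} substitutes the default when VAR is unset or empty.
unset CONTEXTDIRECTORY
fallback="${CONTEXTDIRECTORY:-./prod_context}"
echo "$fallback"    # prints "./prod_context"

CONTEXTDIRECTORY=./
override="${CONTEXTDIRECTORY:-./prod_context}"
echo "$override"    # prints "./"
```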

If you run your containers with docker-compose, you can also use multi-stage Dockerfiles. First declare the common part that both environments use and alias it, e.g. as 'base'. Then, in the same Dockerfile, declare a new FROM instruction that inherits from base and runs the commands for that environment.

For example, I have a Ruby image and I want to run:

  • bundle install in development
  • bundle install --without development test in production.

So my Dockerfile would be something like this:

FROM ruby:2.7.1-slim as base
CMD ["/bin/bash"]

COPY Gemfile .
COPY Gemfile.lock .

FROM base as dev
RUN bundle install

FROM base as prod
RUN bundle install --without development test

In my dev docker-compose.yml file I use this:

services:
  web:
    build:
      target: dev

And in docker-compose.prod.yml:

services:
  web:
    build:
      target: prod
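The same stages can also be selected without compose, directly at build time (the image tags are illustrative):

```shell
docker build --target dev  -t myapp:dev  .
docker build --target prod -t myapp:prod .
```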

You can read more about multi-stage builds in the Docker documentation.

– airled