
I've been trying to integrate code coverage into my Django application. The build is successful and all the tests pass, but when I check coveralls.io or codecov.io there is no data. I have searched everywhere and added a .coveragerc, but still nothing helps.

Dockerfile

FROM python:3.7-alpine
MAINTAINER abhie-lp

ENV PYTHONUNBUFFERED 1
COPY ./requirements.txt /requirements.txt

RUN apk add --update --no-cache jpeg-dev
RUN apk add --update --no-cache --virtual .tmp-build-deps \
        gcc libc-dev musl-dev zlib zlib-dev
RUN pip install -r /requirements.txt
RUN apk del .tmp-build-deps

RUN mkdir /app
WORKDIR /app
COPY ./app /app

RUN mkdir -p /vol/web/media
RUN mkdir -p /vol/web/static
RUN adduser -D ABHIE
RUN chown -R ABHIE:ABHIE /vol/
RUN chmod -R 755 /vol/web
USER ABHIE

docker-compose.yml

version: "3"

services:
  app:
    build:
      context: .
    ports:
      - "8000:8000"
    volumes:
      - ./app:/app
    command: >
      sh -c "python manage.py wait_for_db && 
             python manage.py migrate && 
             python manage.py runserver 0.0.0.0:8000"

.travis.yml

language: python
python:
  - "3.6"

services:
  - docker

before_script:
  - pip install docker-compose
  - pip install coveralls
  - pip install codecov
  - docker-compose run --user='root' app chmod -R 777 .

script:
  - docker-compose run app sh -c "coverage run --source=. manage.py test"
  - docker-compose run app sh -c "flake8"

after_success:
  - coveralls
  - codecov

.coveragerc

[run]
source = /home/travis/build/abhie-lp/recipe-app-api/app
parallel = True
data_file = /home/travis/build/abhie-lp/recipe-app-api/app/.coverage

[paths]
source = 
  /home/travis/build/abhie-lp/recipe-app-api
  /app/

1 Answer


There are three major problems in the test setup you show:

  1. The volumes: declaration in the docker-compose.yml file hides the contents of the /app tree in your image, which means that your test setup is not testing the image that it built.

  2. Your pip install commands install additional packages in the host's Python environment, but these will not be visible inside the Docker container.

  3. Each docker-compose run command launches a new container with a new, ephemeral filesystem, so once the run that produced the coverage report exits, the container filesystem holding that report is gone (see the sketch after this list).
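
To make point 3 concrete, here is a hypothetical demonstration (the marker file and its path are invented for illustration): a file written during one docker-compose run is gone by the time the next one starts, unless it lands on a bind-mounted path.

docker-compose run app sh -c 'echo done > /tmp/marker'   # written inside this container only
docker-compose run app sh -c 'cat /tmp/marker'           # fails: brand-new container, empty /tmp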

For basic test coverage metrics, hopefully your unit-test setup is not especially sensitive to being run in Docker, or deployed on a different path, or from a different developer's workstation. The setup I've used successfully is to run unit tests and things like code coverage outside of Docker, and only build and publish a Docker image as a final step. While it's worthwhile to run some integration tests against your built image, you should be able to drive these from outside of Docker without needing any changes in the image itself.
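
As a sketch of that approach (assuming the layout from the question, with requirements.txt at the repository root and the Django project under ./app), the Travis job could run roughly:

# Run tests and coverage directly on the Travis host, outside Docker
pip install -r requirements.txt coverage coveralls codecov
cd app
coverage run --source=. manage.py test
coveralls    # .coverage is in the current directory, so the upload can find it
codecov
cd ..
docker-compose build    # build (and, if everything passed, publish) the image as a separate step

With this split, coveralls and codecov run in the same Python environment and the same directory as the test run, so they can actually find the data file they are supposed to upload.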

If it's important to you to run these tests from inside Docker, you need to either add these development-only tools to your production image, or do everything (install the extra tools, run the tests, and extract the results) from a single docker-compose run command. A one-liner could look like

docker-compose run \
  -v $PWD:/coverage \
  app \
  sh -c 'pip install coverage && COVERAGE_FILE=/coverage/.coverage coverage run --source=. manage.py test'

You could also break this out into a script that is either COPYed or bind-mounted into your container

docker-compose run -v $PWD:/coverage app /coverage/cov-pytest

which might be more maintainable and a little easier to manually test.
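
A minimal sketch of what such a /coverage/cov-pytest helper might contain (the name and paths are simply carried over from the command above; despite the name, it drives manage.py test rather than pytest):

#!/bin/sh
# Hypothetical cov-pytest helper: install the coverage tool, run the Django
# test suite under it, and write the data file to the bind-mounted /coverage
# directory so it outlives the container.
set -e
pip install coverage
COVERAGE_FILE=/coverage/.coverage coverage run --source=. manage.py test

Remember to make the script executable (chmod +x cov-pytest) before COPYing or bind-mounting it in.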

David Maze