I would recommend using one of the container-based executors, e.g. the Docker or Kubernetes executor: https://docs.gitlab.com/runner/executors/README.html
Then, in each job of your pipeline, you choose the right Docker image for that job.
This gives you the flexibility to use any Python version, or indeed any of the millions of images on Docker Hub.
The price you pay is installing your dependencies and tools on top of the official image on every job run, which adds some build time.
run_script:
  image: python:3
  script:
    - pip3 install pipenv
    - pipenv install
    - pipenv run ./my_beautiful_script.py
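For example, this also makes it trivial to exercise several Python versions in the same pipeline just by picking different image tags. A minimal sketch (the job names and version tags here are only illustrative):

run_script_py38:
  image: python:3.8
  script:
    - pip3 install pipenv
    - pipenv install
    - pipenv run ./my_beautiful_script.py

run_script_py39:
  image: python:3.9
  script:
    - pip3 install pipenv
    - pipenv install
    - pipenv run ./my_beautiful_script.py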
You can create a custom image with your specific tools/configs/prerequisites/dependencies/environment baked in and use that instead, but I prefer not to. Another benefit of this approach is that you also test the installation of those dependencies on every run, which is a best practice in Continuous Delivery.
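If you do go the custom-image route, the job simply points at your pre-built image instead of the official one. A sketch, assuming a hypothetical image in your project's registry that already contains pipenv and the project dependencies:

run_script:
  # hypothetical pre-built image with pipenv and dependencies installed
  image: registry.gitlab.com/my-group/my-project/python-tools:latest
  script:
    - pipenv run ./my_beautiful_script.py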
With regards to Python linters, I chose to use several of them at the same time, as they cover different aspects of code quality. I mostly run them through pre-commit hooks, which I also execute in GitLab CI in case a developer forgets to install the hooks locally.
Example of a stripped-down .pre-commit-config.yaml:
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v2.5.0
    hooks:
      - id: check-yaml
  - repo: https://gitlab.com/pycqa/flake8
    rev: 3.7.9
    hooks:
      - id: flake8
  - repo: https://github.com/pre-commit/mirrors-autopep8
    rev: v1.5
    hooks:
      - id: autopep8
        args: [--diff]
  - repo: https://github.com/asottile/pyupgrade
    rev: v2.1.0
    hooks:
      - id: pyupgrade
and a .gitlab-ci.yml that runs pre-commit as well as other Python linters:
python-lint:
  image: python:3
  script:
    - pip3 -q install pre-commit pycodestyle pylint
    - pre-commit run -a
    - pycodestyle --version
    - pycodestyle --verbose --show-source *.py
    - pylint --version
    - find . -name "*.py" -print0 | xargs -0 pylint
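If the dependency-installation overhead mentioned above becomes noticeable, the lint job can cache pip downloads and pre-commit's hook environments between runs. A sketch using GitLab's cache keyword and the standard PIP_CACHE_DIR and PRE_COMMIT_HOME environment variables (the cache paths are only a suggestion):

python-lint:
  image: python:3
  variables:
    PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"
    PRE_COMMIT_HOME: "$CI_PROJECT_DIR/.cache/pre-commit"
  cache:
    paths:
      - .cache/pip
      - .cache/pre-commit
  script:
    - pip3 -q install pre-commit pycodestyle pylint
    - pre-commit run -a
    - pycodestyle --verbose --show-source *.py
    - find . -name "*.py" -print0 | xargs -0 pylint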
See the full files in the template project I made for all my Python-related code: https://gitlab.com/softmill/template-projects/python