
I'm trying to understand how Docker containers used in a GitLab CI pipeline can access the repository of the project they are run in.

Specifically, in the `.gitlab-ci.yml` file I can, for example, copy files from the root of my project into the Docker container the job runs in with a simple `cp` command (say, to run some tests on code copied from my repo into the container). How is it possible that I can freely copy files into the Docker container?
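For context, a minimal job of the kind described might look like this (the image name and file paths are illustrative, not from the question):

```yaml
test_job:
  image: alpine:latest
  script:
    # The repository is already checked out into the job's working
    # directory inside the container, so a plain cp works with no
    # volume setup on the user's part:
    - cp config.yml /tmp/config.yml
    - ls /tmp
```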

Is this set up by default in GitLab?

From what I know, you need to map volumes in order to copy files from the local host into a Docker container, but in GitLab they seem to be shared already. Does the container in the pipeline actually have access to my entire project, no matter how big it is?

I couldn't find any documentation on this specific subject.

  • Each job runs inside a one-time Docker container, and the project source code is mapped inside using Docker volumes, so you don't have to do anything. – Alexander Pavlov Nov 08 '20 at 22:49
  • 1
    @Alexander Pavlov - How does it happen specifically ? – caffein Nov 09 '20 at 22:07
  • 1
    they parse `.gitlab-ci.yml` and for each job create `Dockerfile` where `FROM` section is initialized using your `image:` and all `script:` commands converted to `RUN`. Then they start this docker container and map git repo inside using `-v`. I'm pretty sure it is oversimplified description but basically it is what they do. – Alexander Pavlov Nov 09 '20 at 22:31
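The mapping step the comment describes can be sketched as a plain shell command. This is a rough, simplified illustration of what the runner's Docker executor does; the paths, image name, and script line are hypothetical, and the command is only echoed so the sketch runs without a Docker daemon:

```shell
PROJECT_DIR="$PWD"                  # host-side checkout of the repo
BUILD_DIR="/builds/group/project"   # illustrative in-container path
IMAGE="alpine:latest"               # would come from the job's `image:` key

# The runner starts a throwaway container with the checkout mounted at
# the build directory, then runs the job's script lines inside it:
echo docker run --rm \
  -v "$PROJECT_DIR:$BUILD_DIR" \
  -w "$BUILD_DIR" \
  "$IMAGE" sh -c 'cp somefile /tmp/'
```

Because the checkout is mounted (or copied) into the container's working directory, a `cp` inside the job script sees the repo files directly.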
  • 1
    `"Does the container in the pipeline actually have access to my entire project no matter how big it is ?"` see "CI / CD Settings" of your project - there is "Git shallow clone" param to limit how much it downloads – Alexander Pavlov Nov 09 '20 at 22:37
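The same limit can also be set in `.gitlab-ci.yml` itself via the predefined `GIT_DEPTH` variable (the depth value here is just an example):

```yaml
variables:
  # Fetch only the last 10 commits instead of the full history:
  GIT_DEPTH: "10"
```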
  • Awesome :) Thank you. Regarding the link - it requires a login; I couldn't access it. – caffein Nov 10 '20 at 17:54
  • https://docs.gitlab.com/ee/ci/runners/README.html#git-strategy – Alexander Pavlov Nov 10 '20 at 19:27

0 Answers