
I have a `.gitlab-ci.yml` that works for normal stages.

.install-template: &install_env
  before_script:
    ...
  after_script:
    ... 

unit-tests:
  <<: *install_env
  stage: test
  script:
    ...
...

I am now trying to do some dynamic child pipelines. I can get them to kick off just fine; however, I can't figure out how to merge `install_env` into them.

This fails:

...
trigger:
  <<: *install_env
  stage: test_run_trigger
  trigger:
    include:
      - artifact: triggers.yml
        job: create_yaml

with:

jobs:trigger_algorithms config contains unknown keys: before_script, after_script
jobs:trigger_algorithms config should contain either a trigger or a needs:pipeline

Anyone know how to get ".install-template: &install_env" into the triggered pipeline?
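For reference, the shape I expected to work would be an anchor shared with the child config, but GitLab does not resolve YAML anchors across files. What does work across files is `include:` plus `extends:`. A sketch, assuming the template is moved into a committed file (the file name `install-env.yml` and job name `run_algorithms` are illustrative):

    # install-env.yml (illustrative name) -- template committed to the repo
    .install-template:
      before_script:
        ...
      after_script:
        ...

    # generated triggers.yml -- `include` is allowed in a child pipeline
    # config, and `extends` works across included files (anchors do not)
    include:
      - local: install-env.yml

    run_algorithms:
      extends: .install-template
      script:
        ...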

UPDATE: So far the only way I have gotten this to work is to copy all the contents of `.install-template: &install_env` into the dynamically created triggers.yml and then have `<<: *install_env` in each trigger call.
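A sketch of that workaround, assuming the template block lives in its own committed file (the file name `install-env.yml` and the generator script are illustrative): the `create_yaml` job prepends the anchor definition to the generated config, so the anchor and every `<<: *install_env` that references it end up in the same file.

    create_yaml:
      stage: build
      script:
        # Start the child config with the anchor definition, then append
        # the generated trigger jobs that merge in *install_env.
        - cat install-env.yml > triggers.yml
        - ./generate_jobs.sh >> triggers.yml   # hypothetical generator
      artifacts:
        paths:
          - triggers.yml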

lr100
  • I do not think it is possible; you can try to submit `install_env` as a .sh file as an artifact and execute it in the child pipeline – federicober May 24 '22 at 14:25
  • would I have to write all the code from .install-template into triggers.yml? – lr100 May 24 '22 at 14:34
  • 1
    Moving the code to a shell script is good advice for eliminating duplication and sharing code between jobs. – Richard May 24 '22 at 16:17
  • 1
    tried to do this, however the before_script code creates a conda environment and once the script ends, it is not accessible. I think that is why they are using the <<: *install_env since that merges everything into the same scope. – lr100 May 24 '22 at 20:10
  • You can use the syntax `source script.sh`. This causes the scope to be preserved after running the .sh script – federicober May 25 '22 at 15:25
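To illustrate the difference between executing and sourcing (a minimal sketch; `setup_env.sh` and `MY_ENV_NAME` are stand-ins for the conda-activation code):

```shell
#!/bin/sh
# Create a stand-in for the environment-setup script (hypothetical).
cat > setup_env.sh <<'EOF'
export MY_ENV_NAME="algo-env"
EOF

# Executing it spawns a child shell: the export dies with that shell.
sh ./setup_env.sh
echo "after executing: '${MY_ENV_NAME:-}'"

# Sourcing runs it in the current shell, so the export survives,
# which is why the conda environment stays usable in later commands.
. ./setup_env.sh
echo "after sourcing: '${MY_ENV_NAME:-}'"
```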

0 Answers