I have previously implemented jobs that run in parallel when a file changes:
```yaml
Validating the Data:
  stage: validation
  parallel:
    matrix:
      - TEAM: ["data", "monitoring", "sales", "web"]
  rules:
    - changes:
        - mainfolder/$TEAM/config.yaml
```
For this specific example it works perfectly. Now I have a different project where the number of $TEAM subfolders is very high and growing quickly, which means I have to keep updating the matrix array manually.
Is there a way of triggering the job in parallel for each subfolder without having to modify the array all the time?
Just to reiterate: the number of subfolders could reach hundreds or thousands.
Thank you very much.
I searched for solutions online but could not find anything specific that addresses this.
A similar question was posted before, but it is 5 years old, so there might be a solution by now... or not: GitLab Dynamically run jobs in parallel
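In case it clarifies what I am after: the rough direction I was imagining (completely untested on my side; the job names, the `.pre` stage, the `child-pipeline.yml` file name, and the `echo` placeholder are just assumptions) is a generator job that loops over the subfolders, writes one job per team into a child pipeline file, and then triggers that file as a dynamic child pipeline:

```yaml
# Rough idea (untested): generate one job per team subfolder, then run them
# as a dynamic child pipeline.
generate-validation-pipeline:
  stage: .pre                      # built-in stage that runs before all others
  script:
    - |
      echo "" > child-pipeline.yml
      for dir in mainfolder/*/; do
        team=$(basename "$dir")
        cat >> child-pipeline.yml << EOF
      validate-$team:
        script:
          - echo "validating mainfolder/$team/config.yaml"  # placeholder for the real validation
        rules:
          - changes:
              - mainfolder/$team/config.yaml
      EOF
      done
  artifacts:
    paths:
      - child-pipeline.yml

run-team-validations:
  stage: validation
  trigger:
    include:
      - artifact: child-pipeline.yml
        job: generate-validation-pipeline
    strategy: depend               # parent pipeline waits for the child result
```

I do not know whether rules:changes behaves the same way inside a triggered child pipeline, what happens when every generated job is filtered out, or how well this scales to thousands of jobs, which is why I am asking.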