Recently our SonarQube server was upgraded and gave us the opportunity to integrate it with GitLab, based on the configuration mentioned here:
https://docs.sonarqube.org/latest/analysis/gitlab-cicd/
Now the problem we are experiencing is that the external sonar job fails the pipeline when the quality gate is not reached. While this is the correct and expected behavior, I was wondering whether there is any kind of configuration that will keep the entire pipeline from failing, even if the quality gate is not reached.
My concern is that we develop against many projects, some of which are quite old (legacy code) and might not always reach the quality gate defined for them. I know that what I am asking is not optimal -- i.e. the quality gate should always be reached -- but given the current circumstances at my workplace there is no other alternative.
For reference, my sonar CI job is the following:

```yaml
sonar:
  stage: analysis
  variables:
    SONAR_USER_HOME: "${CI_PROJECT_DIR}/.sonar"
    GIT_DEPTH: "0"
  script: ./gradlew sonarqube -Dsonar.qualitygate.wait=true -Dsonar.projectKey=${CI_PROJECT_ID} ${ADDITIONAL_SONAR_OPTIONS}
  allow_failure: true
  rules:
    - if: $CI_COMMIT_BRANCH == "master"
    - if: $CI_COMMIT_BRANCH =~ /^support\/\d+[.]\d+$/ || $CI_COMMIT_BRANCH =~ /^support\/\d+$/
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
    - if: $CI_PIPELINE_SOURCE == "schedule" || $CI_PIPELINE_SOURCE == "api"
      when: never
```
The job is set to allow_failure: true, but in turn the external job always fails. I read in the documentation that setting sonar.qualitygate.wait to false should have kept the pipeline from failing, but it seems that is not the case. Is there any way to make this happen? My SonarQube version is 8.1.0.