I am working through a Jenkins tutorial. The sample code I am reading is:

pipeline {
    agent none
    stages {
        stage('Build') {
            agent {
                docker {
                    image 'python:2-alpine'
                }
            }
            steps {
                sh 'python -m py_compile sources/add2vals.py sources/calc.py'
            }
        }
        stage('Test') {
            agent {
                docker {
                    image 'qnib/pytest'
                }
            }
            steps {
                sh 'py.test --verbose --junit-xml test-reports/results.xml sources/test_calc.py'
            }
            post {
                always {
                    junit 'test-reports/results.xml'
                }
            }
        }
        stage('Deliver') { 
            agent {
                docker {
                    image 'cdrx/pyinstaller-linux:python2' 
                }
            }
            steps {
                sh 'pyinstaller --onefile sources/add2vals.py' 
            }
            post {
                success {
                    archiveArtifacts 'dist/add2vals' 
                }
            }
        }
    }
}

So basically there are three stages: Build, Test, and Deliver. They each use a different image to spin up a different container. But this Jenkins job is configured to use Git as the SCM.

So when this Jenkins build runs, say the project is built in the first container. Then the second stage tests the project in another container, followed by the Deliver stage in a third container. How does this Jenkins job make sure that these three stages operate on the code sequentially?

Based on my understanding, each stage would need to perform a git clone/git pull, and before the stage finishes, a git push would be required.
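
In other words, I imagine each stage would have to do something like this explicitly (purely illustrative; the repository URL is a placeholder):

stage('Build') {
    agent {
        docker {
            image 'python:2-alpine'
        }
    }
    steps {
        // what I assume would be needed: fetch the code first...
        sh 'git clone https://example.com/my/repo.git .'
        sh 'python -m py_compile sources/add2vals.py sources/calc.py'
        // ...and push any results back before the stage ends
        sh 'git push origin master'
    }
}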

If the SCM is configured through Jenkins to use Git, do we need to include the git clone/git pull, as well as the git push, in the corresponding shell scripts (under steps), or is that already taken care of by the SCM configuration of Jenkins?
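
Or, if it is already covered, is the explicit equivalent just the built-in checkout step inside steps (assuming the Pipeline is loaded from SCM so that scm is defined), something like:

steps {
    // explicit checkout using the SCM configured on the Jenkins job
    checkout scm
    sh 'python -m py_compile sources/add2vals.py sources/calc.py'
}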

Thanks

1 Answer


In this case, you must ensure that the binary in the QA environment is the same one that ends up in the UAT environment and then in Production. For this, you should use an artifact repository or registry (Artifactory, Nexus, a Docker Registry, etc.) to promote the artifacts through to the Production environment. See this link for how it was done in the Pipeline.
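
As a rough sketch (not the exact code from the link; the repository URL and credentials ID are placeholders), the Deliver stage could publish the binary to such a repository after a successful build:

stage('Deliver') {
    agent {
        docker {
            image 'cdrx/pyinstaller-linux:python2'
        }
    }
    steps {
        sh 'pyinstaller --onefile sources/add2vals.py'
    }
    post {
        success {
            archiveArtifacts 'dist/add2vals'
            // placeholder upload to an artifact repository (e.g. a Nexus/Artifactory raw repo)
            withCredentials([usernamePassword(credentialsId: 'repo-creds',
                                              usernameVariable: 'REPO_USER',
                                              passwordVariable: 'REPO_PASS')]) {
                sh 'curl -u "$REPO_USER:$REPO_PASS" -T dist/add2vals https://repo.example.com/releases/add2vals'
            }
        }
    }
}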