
I have two separate problems.

I am using a Jenkins multibranch declarative pipeline. Within it, I have defined multiple stages.

Issue 1

I have a download from Artifactory in one stage where I am initializing my environment. The pipeline's outputs are uploaded to a different area in Artifactory in a separate stage. I see in here that buildInfo can be appended to. Here's the salient point in the aforementioned reference:

Publishing Build-Info to Artifactory

Both the download and upload methods return a build-info object which can be published to Artifactory as shown in the following examples:
def buildInfo1 = server.download downloadSpec
def buildInfo2 = server.upload uploadSpec
buildInfo1.append buildInfo2
server.publishBuildInfo buildInfo1

How can I preserve the buildInfo from the first stage to use as part of the call to publishBuildInfo in the second stage?
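
For context, here is a stripped-down sketch of the layout I'm working with (the stage names, server ID and specs are placeholders, not my real ones):

pipeline {
    agent any
    stages {
        stage('Initialize environment') {
            steps {
                script {
                    def server = Artifactory.server 'MY_SERVER_ID'   // placeholder ID
                    def downloadSpec = '''{
                        "files": [{ "pattern": "libs-release-local/deps/*.zip", "target": "deps/" }]
                    }'''
                    // this buildInfo goes out of scope when the stage's script block ends
                    def buildInfo1 = server.download downloadSpec
                }
            }
        }
        stage('Publish outputs') {
            steps {
                script {
                    def server = Artifactory.server 'MY_SERVER_ID'
                    def uploadSpec = '''{
                        "files": [{ "pattern": "build/output/*.zip", "target": "generic-local/outputs/" }]
                    }'''
                    def buildInfo2 = server.upload uploadSpec
                    // this is where I'd like to do:
                    // buildInfo1.append buildInfo2
                    // server.publishBuildInfo buildInfo1
                }
            }
        }
    }
}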

Issue 2

Once I have published the build info and associated artifacts, when I look into the Build Browser and specifically the Environment tab, I see that no environment or system variables are populated. I should also mention that I have followed the necessary steps to collect the environment variables as provided in the reference:

buildInfo.env.capture = true
server.publishBuildInfo buildInfo

Which brings me to a related question: does it make sense to do a collect on the first buildInfo in the first stage:

buildInfo.env.collect()
Gregory Kuhn

1 Answer


Issue 1

There are two ways to aggregate multiple builds into one buildInfo instance in a pipeline script.

The first one is exactly what you did - save the buildInfo instance returned from the server.upload or server.download methods and then use the buildInfo.append method to append (aggregate) two buildInfo instances.

The second way, which is probably what you need, is to create a buildInfo instance and send it as an argument to the server.upload or server.download methods. This way, you can send the same buildInfo instance to multiple upload or download methods and have it aggregate everything.

Here's how you do it:

def buildInfo = Artifactory.newBuildInfo()
server.download spec: downloadSpec, buildInfo: buildInfo
server.upload spec: uploadSpec, buildInfo: buildInfo
server.publishBuildInfo buildInfo

How does it help you?

Well, since you create the buildInfo instance manually, you can define it in a scope above the two stages, so that all stages can use the same instance.

Here's an example:

node {
    // Obtain an Artifactory server instance, defined in Jenkins --> Manage:
    def server = Artifactory.server "SERVER_ID"
    // Create a buildInfo instance, to be used by the stages of this pipeline:
    def buildInfo = Artifactory.newBuildInfo()

    stage ('Upload files to Artifactory') {
        def uploadSpec = """{
            "files": [
                    { "pattern": "/Users/eyalb/.m2/repository/a*a*.jar",
                      "target": "generic-local", 
                      "excludePatterns": ["*SNAPSHOT*"],
                      "flat": "false" 
                    }
                ]
            }"""

        server.upload spec: uploadSpec, buildInfo: buildInfo
    }

    stage ('Collect env vars') {
        buildInfo.env.filter.addExclude("DONT_COLLECT*")

        // By default the filter is configured to exclude "*password*,*secret*,*key*", but since we're
        // overriding this configuration by adding our own exclusion, let's add these excludes:
        buildInfo.env.filter
            .addExclude("*password*")        
            .addExclude("*secret*")        
            .addExclude("*key*")        

        withEnv(['DO_COLLECT_FOO=BAR', 'DONT_COLLECT_FOO=BAR']) {
            buildInfo.env.collect()
        }
    }

    stage ('Access build info env vars') {
        // BAR will be printed
        echo buildInfo.env.vars['DO_COLLECT_FOO']

        // null will be printed, because we excluded it.
        echo buildInfo.env.vars['DONT_COLLECT_FOO'] 
    }

    stage ('Set build retention') {
        buildInfo.retention maxBuilds: 1, maxDays: 2, doNotDiscardBuilds: ["3"], deleteBuildArtifacts: true
    }

    stage ('Publish build info') {
        server.publishBuildInfo buildInfo
    }
}

If you'd like to use a declarative pipeline, here's how you share the same buildInfo instance between multiple stages. Notice the initBuildInfo() method - it needs to be invoked only once. Since arbitrary Groovy code (such as the def statements and the upload call) can't sit directly inside a declarative steps block, it is wrapped in script blocks. The below example includes only two of the stages from the above scripted pipeline example:

pipeline {
    agent {
        label "my-agents"
    }
    stages {
        stage('Upload files to Artifactory') {
            steps {
                script {
                    initBuildInfo()
                    def uploadSpec = """{
                        "files": [
                                { "pattern": "/Users/eyalb/.m2/repository/a*a*.jar",
                                  "target": "generic-local",
                                  "excludePatterns": ["*SNAPSHOT*"],
                                  "flat": "false"
                                }
                            ]
                        }"""

                    rtServer.upload spec: uploadSpec, buildInfo: buildInfo
                }
            }
        }
        stage('Collect env vars') {
            steps {
                script {
                    buildInfo.env.filter.addExclude("DONT_COLLECT*")

                    // By default the filter is configured to exclude "*password*,*secret*,*key*", but since we're
                    // overriding this configuration by adding our own exclusion, let's add these excludes:
                    buildInfo.env.filter
                        .addExclude("*password*")
                        .addExclude("*secret*")
                        .addExclude("*key*")

                    withEnv(['DO_COLLECT_FOO=BAR', 'DONT_COLLECT_FOO=BAR']) {
                        buildInfo.env.collect()
                    }
                }
            }
        }
    }
}

def rtServer, buildInfo
void initBuildInfo() {
    script {
        rtServer = Artifactory.server "JX_ARTIFACTORY_SERVER"
        buildInfo = Artifactory.newBuildInfo()
    }
}
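
As in the scripted example, a later stage can then use the same shared rtServer and buildInfo fields to publish the build-info. A minimal sketch of such a stage, assuming it is added to the stages block above:

stage('Publish build info') {
    steps {
        script {
            // rtServer and buildInfo are the script-level fields set by initBuildInfo()
            rtServer.publishBuildInfo buildInfo
        }
    }
}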

Issue 2

When you're running:

buildInfo.env.collect()

You're asking Jenkins to collect the environment variables now (at the time of the collect() method execution) and store them on this buildInfo instance.

When setting:

buildInfo.env.capture = true

You're asking Jenkins to collect environment variables with every upload and download method which uses this buildInfo. You can use this as follows:

def buildInfo = Artifactory.newBuildInfo()
buildInfo.env.capture = true
server.download spec: downloadSpec, buildInfo: buildInfo
server.upload spec: uploadSpec, buildInfo: buildInfo
server.publishBuildInfo buildInfo

Notice that you should set

buildInfo.env.capture = true

before executing the uploads or downloads.

So the advantage of using:

buildInfo.env.capture = true

is that you can set it once on your buildInfo instance and then have the env vars collected for you from that point on. On the other hand, there are scenarios when you'd like to collect the env vars at a specific point during your pipeline. That is when

buildInfo.env.collect()

comes in handy.

Eyal Ben Moshe
  • Can you please provide an example of how you would provide this "global" buildInfo within a declarative pipeline? I had tried it with the `environment` declaration defined at `pipeline` scope but it doesn't seem to work as I think it's only for string key pairs. Thanks, I'll try your suggestion as regards the `env.capture` – Gregory Kuhn Sep 28 '18 at 08:38
  • Btw, your advice regarding the env.capture worked. Thanks loads for that! If you can provide an example about linking the buildInfo between two stages in the pipeline, I'll mark this as answered – Gregory Kuhn Sep 28 '18 at 13:43
  • @GregoryKuhn, I added an example to my answer. You can find another example here: https://github.com/jfrog/project-examples/blob/master/jenkins-examples/pipeline-examples/maven-example/Jenkinsfile I hope this helps. – Eyal Ben Moshe Sep 29 '18 at 10:54
  • Hi @Eyal Ben Moshe, thanks for taking the time to provide that example. Indeed I guessed that is how it should be done but I am using a declarative pipeline, not the scripted one and I guess there is no mechanism provided with it that allows me to accomplish this. I will mark this as the accepted answer anyway. Cheers – Gregory Kuhn Sep 29 '18 at 15:02
  • You can take the same approach with declarative pipeline as well. Let me try and find an example for this. – Eyal Ben Moshe Sep 30 '18 at 00:04
  • @GregoryKuhn, I added a declarative pipeline example to the answer. I hope this helps. – Eyal Ben Moshe Oct 02 '18 at 15:06
  • thanks for the example. I have incorporated it into my pipeline and it seems to work fine. I believe, after reading some similar postings, that the use of global variables is discouraged in Pipeline but, for the moment at least, it works. Thanks again. – Gregory Kuhn Oct 02 '18 at 15:50
  • here is how you can fetch the build info object in declarative pipeline https://stackoverflow.com/a/72718191/852360 – LazyProphet Jun 22 '22 at 15:53