
I'm trying to move existing Jenkins build jobs to a single Jenkins 2 pipeline, and I'm wondering if it's possible to copy files from one node to another within the build. My idea would be (sketched below):

Node A (Windows)
  Checkout scm
  Execute ant build
  Archive artifact (or whatever action is required)
Node B (Unix)
  Checkout scm
  Copy build artifact from node A --> is this possible?
  Execute ant build
  Then run the tests...
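
Roughly, as a scripted pipeline (the agent labels `windows` and `unix` are placeholders):

node('windows') {
    checkout scm
    bat 'ant build'
    // archive dist/*.zip (or whatever action is required)
    step([$class: 'ArtifactArchiver', artifacts: 'dist/*.zip'])
}
node('unix') {
    checkout scm
    // copy the build artifact from node A here -- is this possible?
    sh 'ant build'
    // then run the tests...
}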

I've tried to use the Copy Artifact step, but it didn't seem to work correctly, so I'm wondering if there's a way to copy files in the middle of a pipeline, or if I have to stick with the current build architecture (using the Copy Artifact plugin, but with completely separate build jobs).

Gilles QUERRET
  • Welcome to Stack Overflow. You could just include the code that "didn't seem to work correctly" in your post... ;-) – StephenKing Jun 07 '16 at 20:27
  • I was using `step([$class: 'ArtifactArchiver', artifacts: 'dist/*.zip'])` to archive the artifact on the first node, and `step([$class: 'CopyArtifact', filter: 'dist/*.zip', fingerprintArtifacts: true, projectName: 'PCT'])`, but artifacts seem to be available only after the end of the build – Gilles QUERRET Jun 08 '16 at 05:51

1 Answer


Yes, this is possible using the stash/unstash steps.
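
For the setup from the question, a minimal sketch (the agent labels `windows` and `unix` are assumptions) would stash the build output on node A and unstash it on node B:

node('windows') {
    checkout scm
    bat 'ant build'
    // Put dist/*.zip into a stash named 'binary'; stashes are kept for the duration of the build
    stash name: 'binary', includes: 'dist/*.zip'
}
node('unix') {
    checkout scm
    // Restore the stashed files into this workspace on the other node
    unstash 'binary'
    sh 'ant build'
    // tests follow...
}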

A tutorial about this can also be found in the Jenkins Blog (focused on parallel execution):

parallel(
    "stream 1": {
        node {
            // restore the files stashed earlier under the name 'binary'
            unstash "binary"
            sh "sleep 20s"
            sh "echo hstream1"
        }
    },
    "stream 2": {
        node {
            unstash "binary"
            sh "echo hello2"
            // not a real command, so this branch will fail
            sh "hashtag fail"
        }
    }
)
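
Note that this snippet assumes a stash named "binary" was created earlier in the same build (e.g. with `stash name: 'binary', includes: 'dist/*.zip'` on the producing node); stashes are discarded when the build finishes, so they cannot be shared across jobs.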
StephenKing
  • Feel free to provide an additional answer. – StephenKing Jun 12 '18 at 08:54
  • Still searching for it :) Probably this: https://stackoverflow.com/a/46567313/3679490 ? – vks Jun 12 '18 at 08:55
  • I actually wanted to know if it can be used for big files. The Jenkins docs say it should be used for files less than 5 MB. Have you used it for larger files? Will it work? – vks Jun 12 '18 at 09:00
  • It will definitely work for larger files, too. We used the usual (un)stash for a while to copy artifacts of about 400 MB to up to 8 agents. As this made multiple parallel builds hard, because it makes the Jenkins master pretty unresponsive, we refactored our pipeline to copy files over via S3 and keep the Jenkins master out of the loop. But at the beginning (or if you don't have that many parallel builds), I really recommend starting simple. Go ahead and implement your pipeline using (un)stash. If it becomes a pain point, upload to some external data store. – StephenKing Jun 12 '18 at 22:56
  • Nothing special, just the `s3Upload` step provided by the _Pipeline AWS Steps_ plugin – StephenKing Jun 13 '18 at 05:44
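
For the S3 route mentioned in the comments, a rough sketch using the `s3Upload`/`s3Download` steps from the _Pipeline AWS Steps_ plugin might look like this (the bucket name and paths are hypothetical, and AWS credentials must be configured separately):

node('windows') {
    bat 'ant build'
    // Push the artifact to S3 so the transfer bypasses the Jenkins master
    s3Upload(file: 'dist/app.zip', bucket: 'my-artifact-bucket', path: "builds/${env.BUILD_NUMBER}/app.zip")
}
node('unix') {
    // Pull the same artifact down on the other agent
    s3Download(file: 'dist/app.zip', bucket: 'my-artifact-bucket', path: "builds/${env.BUILD_NUMBER}/app.zip", force: true)
}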