I have a Concourse pipeline for a Node.js application with multiple jobs (unit tests, etc.). Currently I am running `yarn install` in every job. I would prefer to do the install in just one job and then pass the resulting node modules to the other jobs as needed. Is there a way to do this without having to push the modules to an S3 bucket?
1 Answer
I'll ask your question in a slightly different way: is there a reason you need multiple jobs? Would they logically make sense as different tasks in the same job? If you did that, you could share outputs between tasks, as in the sketch below.
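For illustration, the shared-output approach might look something like this. This is a minimal sketch, not from the thread: the resource name `source-code` and the inlined task configs are assumptions, and the `cp` shuffling is just one way to hand the modules across.

```yaml
jobs:
  - name: build-and-test
    plan:
      - get: source-code
      - task: install
        config:
          platform: linux
          image_resource:
            type: registry-image
            source: {repository: node, tag: lts}
          inputs:
            - name: source-code
          outputs:
            - name: node-modules          # artifact visible to later tasks in this job
          run:
            path: sh
            args:
              - -ec
              - |
                cd source-code
                yarn install
                cp -R node_modules/. ../node-modules/
      - task: unit-test
        config:
          platform: linux
          image_resource:
            type: registry-image
            source: {repository: node, tag: lts}
          inputs:
            - name: source-code
            - name: node-modules          # output of the install task above
          run:
            path: sh
            args:
              - -ec
              - |
                cp -R node-modules source-code/node_modules
                cd source-code
                yarn test
```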

Separation of concerns: I don't want to wait for all of my tasks to pass only to find out that the last one fails; I would rather know that before all the other tasks finish. – Sheen Apr 14 '20 at 08:01
That makes sense. The quickest path to avoiding that, IMO, would be to have a custom Docker image that you keep up to date with your node_modules folder, and use that as your task image. – Josh Ghiloni Apr 15 '20 at 17:01
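In task-config terms that would mean pointing `image_resource` at the custom image and skipping the install step entirely. A rough sketch, assuming a hypothetical image `my-registry/node-app-deps` with the modules baked in at `/opt/app/node_modules`:

```yaml
platform: linux

image_resource:
  type: registry-image
  source:
    repository: my-registry/node-app-deps   # hypothetical pre-baked dependency image
    tag: latest

inputs:
  - name: source-code

run:
  path: sh
  args:
    - -ec
    - |
      cd source-code
      # reuse the modules baked into the image instead of running yarn install
      ln -s /opt/app/node_modules node_modules
      yarn test
```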
@Sheen If you don't want to wait for all of your tasks to pass to find out that the last one fails, you can run the tasks in parallel and set the fail_fast parameter, so that if any parallel task fails you don't have to wait for the others to finish, and you can still share node_modules between tasks this way. But I personally wouldn't recommend building a Docker image that carries an up-to-date node_modules folder, since you would have to update the image every time you add a new dependency. – Anshita Singh Apr 19 '20 at 10:58
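In pipeline terms that suggestion looks roughly like this (a sketch: the `ci/*.yml` task file paths are hypothetical, and each parallel task's config would declare `node_modules` among its inputs):

```yaml
jobs:
  - name: test
    plan:
      - get: source-code
        trigger: true
      - task: install
        file: source-code/ci/install.yml    # runs yarn install, declares node_modules as an output
      - in_parallel:
          fail_fast: true                   # stop waiting as soon as any parallel task fails
          steps:
            - task: unit-test
              file: source-code/ci/unit-test.yml
            - task: lint
              file: source-code/ci/lint.yml
```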
That's what makes Concourse so powerful, though: write a pipeline to keep the Docker image up to date! You can have a `git` resource that only watches for changes to `package.json` (or whatever you use to define your node_modules) and, if that changes, build the image. – Josh Ghiloni Apr 21 '20 at 13:43
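Sketched out, such a pipeline might pair a `paths`-filtered `git` resource with an image build job. This is an assumption-heavy outline: the repository URI, registry name, and credential variables are placeholders, and it uses `concourse/oci-build-task` as one common way to build images:

```yaml
resources:
  - name: deps
    type: git
    source:
      uri: https://github.com/example/my-node-app.git
      branch: main
      paths:
        - package.json      # new versions only when the dependency manifest changes
        - yarn.lock

  - name: deps-image
    type: registry-image
    source:
      repository: my-registry/node-app-deps
      username: ((registry-user))
      password: ((registry-password))

jobs:
  - name: build-deps-image
    plan:
      - get: deps
        trigger: true
      - task: build
        privileged: true    # oci-build-task requires a privileged container
        config:
          platform: linux
          image_resource:
            type: registry-image
            source: {repository: concourse/oci-build-task}
          inputs:
            - name: deps
          outputs:
            - name: image
          params:
            CONTEXT: deps   # directory containing the Dockerfile
          run: {path: build}
      - put: deps-image
        params: {image: image/image.tar}
```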