
We're trying to improve our Jenkins setup. So far we have two directories: /plugins and /tests.

Our project is a multi-module project of Eclipse plugins. The test plugins in the /tests folder are fragment projects dependent on their corresponding production-code plugins in /plugins.

Until now, we had just one Jenkins job which checked out both /plugins and /tests, built all of them and produced the Surefire results etc.

We're now thinking about splitting the project into smaller jobs corresponding to features we provide. It seems that the way we tried to do it is suboptimal.

We tried the following:

  1. We created a job for the core feature. This job checks out the whole /plugins and /tests directories and builds only the plugins the feature is comprised of. This job has a separate pom.xml which defines the core artifact and lists the modules contained in the feature.
  2. We created a separate job for the tests that should be run on the feature plugins. This job uses the cloned workspace from the core job. This job is to be run after the core feature is built.
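For concreteness, the core job's build step boils down to something like this (the pom path and names are simplified, not our exact layout):

```shell
#!/bin/sh
# Sketch of the core job's single build step (paths and names simplified).
# The separate pom.xml is an aggregator that lists only the modules
# belonging to the core feature.
CORE_POM="plugins/core-feature/pom.xml"
MVN_ARGS="-B -f $CORE_POM clean install"
echo "core build step: mvn $MVN_ARGS"
```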

I somehow think this is less than optimal.

  • For instance, only the core job can update the checked-out files. If only the tests are updated, the core feature does not need to be rebuilt, but it will be.
  • As soon as I have a feature which is dependent on the core feature, this feature would either need to use a clone of the core feature workspace or check out its own copy of /plugins and /tests, which would lead to bloat.
  • Using a cloned workspace, I can't update my sources. So when I have a feature depending on another feature, I can only run its job after the core feature has been updated and built.

I think I'm missing some basic stuff here. Can someone help? There definitely is an easier way for this.

EDIT: I'll try to formulate what I think would ideally happen if everything works:

  • check if the feature components have changed (i.e. an update on them is possible)
  • if changed, build the feature
    • Build the dependent features, if necessary (i.e. check the corresponding job)
    • Build the feature itself
    • if build successful, start feature test job
    • let me see the results of the test job in the feature job

Finally, the project job should

  • do a nightly build
  • check out all sources from /plugins and /tests
  • build all, test all, send results to Sonar

Additionally, it would be neat if the nightly build was unnecessary because the builds and test results of the projects' features would be combined in the project job results.

Is something like this possible?

danowar

This is a good question for me to ponder right now as I am working on breaking our monolithic build into stages that can be run concurrently. – jwernerny May 04 '12 at 14:32

1 Answer


Starting from the end of the question. I would keep a separate nightly job that does a clean check-out (gets rid of any generated stuff before check-out), builds everything from scratch, and runs all tests. If you aren't doing a clean build, you can't guarantee that what is checked into your repository really builds.
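Such a nightly step might boil down to something like this (Maven goals assumed, adjust to your build; the clean check-out itself is configured in the job's SCM settings):

```shell
#!/bin/sh
set -e
# Nightly job: everything from scratch.  'clean' removes generated output,
# 'verify' compiles and runs all tests, 'sonar:sonar' pushes results to Sonar.
NIGHTLY_CMD="mvn -B clean verify sonar:sonar"
echo "nightly build step: $NIGHTLY_CMD"   # the real job executes this command
```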

  • check if the feature components have changed (i.e. an update on them is possible)
  • if changed, build the feature
    1. Build the dependent features, if necessary (i.e. check the corresponding job)
    2. Build the feature itself
    3. if build successful, start feature test job
    4. let me see the results of the test job in the feature job

[I am assuming that by "dependent features" in 1 you mean the things needed by the "feature" in 2.]

To do this, I would say that you need multiple jobs.

  • a job for every individual feature and every dependent feature that simply builds that feature. The jobs should be started by SCM changes for the (dependent) feature.
  • I wouldn't keep test jobs separate from compile jobs. That allows the possibility that successfully compiled code is never tested. Instead, I would rely on the fact that when a build step fails in Jenkins, it normally aborts further build steps.
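A minimal sketch of such a combined step, assuming a shell build step with Maven:

```shell
#!/bin/sh
set -e   # any failing command aborts the step, and Jenkins fails the build
# Compile and test in one job: if compilation fails, mvn exits non-zero and
# the step stops here, so there is no way to end up with untested binaries.
STEP="mvn -B clean verify"
echo "combined compile+test step: $STEP"   # the real job executes this command
```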

The trick is going to be in how you thread all of these together.

Let's say we have a feature and its build job, called F1, which depends on two features DF1.1 and DF1.2, each with its own build job.

  • Both DF1.1 and DF1.2 should be configured to trigger the build of F1.
  • F1 should be configured to get the artifacts it needs from the latest successful DF1.1 and DF1.2 builds. Unfortunately, the very nice "Clone SCM" plugin is not going to be of much help here as it only pulls from one previous job. Perhaps one of the artifact publisher plugins might be useful, or you may need to add some custom build steps to put/get artifacts.
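For the custom-build-step route, a fetch from each upstream job's last successful build could look like this. The `job/<name>/lastSuccessfulBuild/artifact/<path>` permalink is standard Jenkins; the job names and artifact paths below are invented:

```shell
#!/bin/sh
set -e
JENKINS_URL="${JENKINS_URL:-http://localhost:8080}"   # set by Jenkins in real jobs
mkdir -p deps
# Pull each dependent feature's artifact from its last *successful* build,
# so F1 never builds against a broken DF1.x.
for job in DF1.1 DF1.2; do
  url="$JENKINS_URL/job/$job/lastSuccessfulBuild/artifact/target/$job.jar"
  echo "would fetch: $url"   # real step: wget -q -O "deps/$job.jar" "$url"
done
```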
jwernerny