
Maven has the lame dependency conflict resolution rule of 'the closer the better': if two versions of a library exist, only the one closest to the root of the transitive dependency tree is kept; the others are dropped from the classpath. This has caused pervasive instability in every project built by Maven.

I would like to address this problem from 2 sides:

  1. use a conflict detection analyzer like JHades to scan everything on the classpath and report version conflicts from within the classloader

  2. use a fancy classloader to situationally customise the classpath, so that the correct version is used at the right moment. This is also the approach taken by OSGi
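To illustrate option 1, here is a minimal sketch of what such a scan could look like without JHades: walk every jar on `java.class.path` and report class files that ship in more than one jar. The class and method names are my own illustration, not JHades's actual API.

```java
import java.io.File;
import java.util.*;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

// Sketch of approach 1: find classes that appear in more than one jar
// on the classpath. Names here are illustrative, not from any real tool.
public class DuplicateClassScanner {

    /** Maps each duplicated class entry to the jars that contain it. */
    public static Map<String, List<String>> findDuplicates(List<File> jars) throws Exception {
        Map<String, List<String>> owners = new HashMap<>();
        for (File jar : jars) {
            try (JarFile jf = new JarFile(jar)) {
                Enumeration<JarEntry> entries = jf.entries();
                while (entries.hasMoreElements()) {
                    JarEntry e = entries.nextElement();
                    if (e.getName().endsWith(".class")) {
                        owners.computeIfAbsent(e.getName(), k -> new ArrayList<>())
                              .add(jar.getName());
                    }
                }
            }
        }
        // Keep only classes shipped by two or more jars.
        owners.values().removeIf(list -> list.size() < 2);
        return owners;
    }

    public static void main(String[] args) throws Exception {
        List<File> jars = new ArrayList<>();
        for (String path : System.getProperty("java.class.path").split(File.pathSeparator)) {
            if (path.endsWith(".jar")) jars.add(new File(path));
        }
        findDuplicates(jars).forEach((cls, where) ->
            System.out.println(cls + " -> " + where));
    }
}
```

Note that a duplicate class entry is only a symptom: it tells you two jars (typically two versions of the same library) collide, but deciding which one is "correct" still needs human or build-tool input.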

To achieve either of these I need to override Maven so that it appends the jars of the farther versions to the classpath in various lifecycle phases ('test' in particular). How do I achieve this?
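For option 2, the core of a "fancy classloader" is child-first delegation: try the pinned jars before falling back to the parent, so a chosen version wins over whatever the build tool put on the application classpath. The sketch below is a simplified illustration under that assumption, not the full OSGi model (which also isolates packages per bundle):

```java
import java.net.URL;
import java.net.URLClassLoader;

// Sketch of approach 2: a child-first classloader. Classes found in the
// pinned jars shadow the same classes on the regular classpath.
public class ChildFirstClassLoader extends URLClassLoader {

    public ChildFirstClassLoader(URL[] pinnedJars, ClassLoader parent) {
        super(pinnedJars, parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
        synchronized (getClassLoadingLock(name)) {
            Class<?> c = findLoadedClass(name);
            if (c == null) {
                try {
                    // Look in the pinned jars first...
                    c = findClass(name);
                } catch (ClassNotFoundException e) {
                    // ...then fall back to normal parent-first delegation
                    // (JDK classes always come from here).
                    c = super.loadClass(name, false);
                }
            }
            if (resolve) resolveClass(c);
            return c;
        }
    }

    public static void main(String[] args) throws Exception {
        // With no pinned jars, everything falls back to the parent:
        ChildFirstClassLoader cl = new ChildFirstClassLoader(
                new URL[0], ClassLoader.getSystemClassLoader());
        System.out.println(cl.loadClass("java.lang.String") == String.class);
    }
}
```

The hard part is not the loader itself but deciding, per call site, which jar URLs to pin, which is exactly the metadata a build tool would have to emit.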

UPDATE: I realise Maven is an old product and wasn't designed with megaprojects that have thousands of dependencies in mind. So if you have a solution in Gradle, Pants, Buck, sbt, or any esoteric build tool, I will still accept it as a valid answer.

tribbloid
  • I don't think that this can be achieved with Maven (with reasonable effort). Actually, this seems like a HUGE project that is going to be very difficult. You know that you don't have to use "the closer the better" — you can set the transitive version by using ``? – J Fabian Meier Sep 12 '19 at 06:31
  • I know, but those hundreds of lines of manual reading, trial and error are not scalable – tribbloid Sep 12 '19 at 15:01
  • What do you mean by "read, trial and error"? – J Fabian Meier Sep 12 '19 at 15:47
  • read the mvn dependency:tree output, find the version conflicts, override the versions, and pray the new version works with all libraries – tribbloid Sep 12 '19 at 16:28
  • Just note that most Java people do _not_ use OSGi or fancy classloader tricks. The most important thing is to keep the number of (transitive) dependencies low. Don't misunderstand me: I would like to avoid that cycle of playing with versions as well, but I have not seen anyone succeed in this. – J Fabian Meier Sep 12 '19 at 18:07
  • Totally agree with you: I have little intention to do 2. But 1 is of the utmost importance, as it allows me to write a plugin or xml generator that can auto-shade – tribbloid Sep 12 '19 at 18:15
  • To your update: Why does your project have more than 1000 dependencies? – J Fabian Meier Sep 12 '19 at 18:20
  • because it does a lot of things! – tribbloid Sep 12 '19 at 18:27
  • I would attack the problem from this side. If your project does a lot of things, you can probably split it up into a lot of smaller projects that only have < 100 dependencies each. If these smaller projects need to communicate, they can do so e.g. by REST. This is not just my idea — it is the general trend to build small and maintainable projects as far as possible. – J Fabian Meier Sep 12 '19 at 18:35
  • at the expense of speed (due to lack of shared memory), memory & orchestration overhead, and compatibility with JVM-based distributed computing frameworks? Why would anyone want to do that if it can be solved automatically? – tribbloid Sep 12 '19 at 18:48
  • 1. Nobody has figured out yet how to do it automatically (at least there is no widely used framework for that yet). 2. Memory and CPU don't matter anymore; just buy more hardware. 3. It is much easier to maintain and update smaller programs with clear responsibilities than one huge program, especially if different people maintain different ones. – J Fabian Meier Sep 12 '19 at 19:09
  • Thanks a lot, I agree with all of them, but that's the absurdity: we have adaptive & evolutionary AI algorithms, planetary-scale social network analysis tools, bots that figure out strategy games by trial-and-error. All of them put together can't tell the difference between two versions of a library. – tribbloid Sep 14 '19 at 19:14

0 Answers