
I have a typical transitive dependency problem for which I couldn't find a resolution.

My project uses the spark and hadoop-tools dependencies. spark depends on hadoop-mapreduce-client-core, and hadoop-tools depends on hadoop-core.

hadoop-core and hadoop-mapreduce-client-core conflict with each other. In other words, hadoop-mapreduce-client-core is a newer version (MapReduce 2) of hadoop-core (MapReduce 1).
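To make the conflict concrete, here is a hedged sketch of the kind of single-module dependencies block that produces it; the exact coordinates and versions are assumptions for illustration, not taken from the question:

```groovy
// Hypothetical single-module build.gradle reproducing the conflict.
// Artifact versions are illustrative assumptions.
dependencies {
    // transitively pulls in hadoop-mapreduce-client-core (MapReduce 2)
    implementation 'org.apache.spark:spark-core_2.11:2.3.2'
    // transitively pulls in hadoop-core (MapReduce 1)
    implementation 'org.apache.hadoop:hadoop-tools:1.2.1'
}
```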

In this project, I will have some executables that run Spark jobs and some that run DistCp (which depends on hadoop-tools). How do I specify this relationship/dependency/force in build.gradle so that both the Spark flows and the hadoop-tools flows find their own dependencies at runtime?

s.r
  • Spark uses Hadoop core itself, so why do you need to specify those? Are you using the same versions between Spark and Hadoop? Can you show your gradle file? – OneCricketeer Oct 04 '18 at 23:05

1 Answer


If you have classes with the same FQCN in two different jars and you want to keep using both in different scenarios (since they differ by their artifact ID), then the best and cleanest way to achieve this is to break the project down into separate modules.

Please refer to Gradle Multi-Project builds

https://docs.gradle.org/current/userguide/multi_project_builds.html
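A minimal sketch of that split, assuming hypothetical module names spark-jobs and distcp-jobs and illustrative artifact versions (none of these names or versions come from the original post):

```groovy
// settings.gradle (module names are assumptions for illustration)
include 'spark-jobs', 'distcp-jobs'

// spark-jobs/build.gradle
// Spark transitively brings in hadoop-mapreduce-client-core (MapReduce 2).
dependencies {
    implementation 'org.apache.spark:spark-core_2.11:2.3.2'  // version assumed
}

// distcp-jobs/build.gradle
// hadoop-tools transitively brings in hadoop-core (MapReduce 1).
dependencies {
    implementation 'org.apache.hadoop:hadoop-tools:1.2.1'  // version assumed
}
```

Since each subproject resolves its own runtime classpath, the MapReduce 1 and MapReduce 2 jars never end up on the same classpath, and the Spark executables and the DistCp executables each see only the Hadoop classes they expect.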

msuper