Here is my scenario. I have a client jar that uses GlassFish Jersey version 2.17. I am now changing an application, which will be deployed on a Spark cluster, to use this client jar. Spark uses the com.sun.jersey version 1.9 jars.
In the client jar, when a Jersey request is made, the ClientRequest.accept method refers to javax.ws.rs.core.MultivaluedMap from javax.ws.rs-api-2.0.1.jar. Unfortunately, a class with the same fully qualified name is also present in jersey-core-1.9.jar, which comes with com.sun.jersey and is used by Apache Spark. The javax.ws.rs-api-2.0.1.jar version of the class declares a method addAll(...) that does not exist in the jersey-core-1.9.jar version. This causes runtime exceptions when I try to run my application on the Spark cluster.
I tried to declare javax.ws.rs-api-2.0.1.jar explicitly as a dependency of my client jar so that the linking would resolve against it, but with no luck: the call still links to the javax.ws.rs.core.MultivaluedMap from jersey-core-1.9.jar, which results in a NoSuchMethodError at runtime.
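For reference, the explicit declaration I added was roughly the following (using the standard javax.ws.rs:javax.ws.rs-api coordinates for that jar):

<dependency>
    <groupId>javax.ws.rs</groupId>
    <artifactId>javax.ws.rs-api</artifactId>
    <version>2.0.1</version>
</dependency>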
Any suggestions to solve this? Do I have to switch my client to com.sun.jersey? The problem is that a few of my other projects also make use of GlassFish Jersey, so I suspect that change would cause errors there instead.
Edit:
My application pom has this dependency:
<dependency>
    <groupId>org.apache.phoenix</groupId>
    <artifactId>phoenix-spark</artifactId>
    <version>4.6.0-HBase</version>
    <scope>provided</scope>
</dependency>
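I know I could try excluding the 1.9 classes from phoenix-spark in my own pom, for example like this (com.sun.jersey:jersey-core is my guess at the offending transitive artifact):

<!-- exclusion added inside the phoenix-spark dependency above -->
<exclusions>
    <exclusion>
        <groupId>com.sun.jersey</groupId>
        <artifactId>jersey-core</artifactId>
    </exclusion>
</exclusions>

but since the Spark cluster itself ships the 1.9 jars, I don't expect that alone to change what gets loaded at runtime.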
The client jar that I am trying to use in my Spark application has this dependency:
<dependency>
    <groupId>org.glassfish.jersey.core</groupId>
    <artifactId>jersey-client</artifactId>
    <version>2.17</version>
</dependency>
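One workaround I have been wondering about is shading my application jar and relocating the javax.ws.rs packages with the maven-shade-plugin, so that my client code no longer resolves against the 1.9 classes. A rough sketch of what I mean (the shaded package name and plugin version are just placeholders):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.4.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <relocations>
                    <relocation>
                        <!-- relocate the JAX-RS API that my client jar uses -->
                        <pattern>javax.ws.rs</pattern>
                        <shadedPattern>shaded.javax.ws.rs</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>

I am not sure whether relocating javax.ws.rs this way is safe with Jersey 2.x, so I would appreciate input on that as well.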
Hence I need a proper way of fixing this. Kindly provide your suggestions.