
I'm writing an application that uses Apache Spark. For communicating with a client, I would like to use gRPC.

In my Gradle build file, I use

dependencies {
  compile 'org.apache.spark:spark-core_2.11:1.5.2'
  compile 'org.apache.spark:spark-sql_2.11:1.5.2'
  compile 'io.grpc:grpc-all:0.13.1'
  ...
}

When leaving out gRPC, everything works fine. With gRPC, however, the build succeeds but fails at runtime, because the two libraries pull in conflicting versions of Netty. Spark seems to use netty-all, which contains the same classes (but with potentially different method signatures) as the Netty artifacts gRPC depends on.

I tried shading (using the com.github.johnrengelman.shadow plugin), but somehow it still does not work.
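
My Shadow setup is roughly the following (the plugin version shown is just an example):

buildscript {
  repositories { jcenter() }
  dependencies {
    classpath 'com.github.jengelman.gradle.plugins:shadow:1.2.3'
  }
}

apply plugin: 'java'
apply plugin: 'com.github.johnrengelman.shadow'

As far as I understand, shadowJar merges all dependencies into a single fat jar, but without a relocate rule both Netty versions still map onto the same io.netty package, so the conflict survives the merge. How can I approach this problem?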

navige
  • Have you tried forcing the lower Netty version (the one required by Spark)? Perhaps you'll be lucky and gRPC won't use any method that's new in the version it was compiled with... – Tzach Zohar Feb 26 '16 at 19:51
  • I tried this. Unfortunately, it seems some methods changed their signatures, so it didn't work – navige Feb 26 '16 at 20:06
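
For reference, forcing a dependency version in Gradle looks like this (4.0.29.Final is an assumption; check which Netty version Spark 1.5.2 actually resolves):

configurations.all {
  resolutionStrategy {
    // pin netty-all to the 4.0.x line Spark expects; gRPC's split Netty
    // artifacts (e.g. netty-codec-http2) would need forcing as well
    force 'io.netty:netty-all:4.0.29.Final'
  }
}

As the second comment notes, this fails here because gRPC calls methods whose signatures changed between Netty 4.0.x and 4.1.x.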

2 Answers


The general solution to this sort of thing is shading with relocation. See the answer to a similar problem with protobuf dependencies: https://groups.google.com/forum/#!topic/grpc-io/ABwMhW9bU34
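
A minimal sketch with the Gradle Shadow plugin, assuming gRPC is moved into its own module so that only its copy of Netty is relocated (the module name and the shaded. prefix are placeholders):

// grpc-shaded/build.gradle
apply plugin: 'java'
apply plugin: 'com.github.johnrengelman.shadow'

dependencies {
  compile 'io.grpc:grpc-all:0.13.1'
}

shadowJar {
  // rewrite gRPC's references to Netty 4.1.x into a private package so
  // they no longer collide with Spark's netty-all 4.0.x at runtime
  relocate 'io.netty', 'shaded.io.netty'
}

The main project then depends on the jar produced by shadowJar instead of on io.grpc:grpc-all directly, alongside the Spark dependencies.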

nmittler

I think the problem is that Spark uses Netty 4.0.x while gRPC uses Netty 4.1.0.
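
To see which versions each library requests and what Gradle actually resolves, the built-in dependencyInsight report helps:

gradle dependencyInsight --dependency netty --configuration compile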

Norman Maurer
  • Yes, you are right. But can I use both versions at the same time? I found an example at https://github.com/googlegenomics/spark-examples where they somehow use Spark and gRPC together; I tried to do the same with Gradle, but I didn't manage to get it to work... – navige Feb 26 '16 at 20:05