I have a situation where I need certain functionality that is available in Spark 1.1.0, but I have two different platforms I need to run this application on. One uses Spark 1.1.0 and the other uses Spark 0.9.1, and the functionality available in Spark 1.1.0 is not available in Spark 0.9.1.
That said, is it possible to have compiler flags in the Scala code so that, when compiling against Spark 1.1.0, one piece of code gets compiled, and when compiling against Spark 0.9.1, another piece of code gets compiled instead?
Like so:
#ifSpark1.1.0
val docIdtoSeq: RDD[(String, Long)] = listOfDocIds.zipWithIndex()
#endifSpark1.1.0
#ifSpark0.9.1
val docIdtoSeq: RDD[(String, Long)] =
  listOfDocIds.mapPartitionsWithIndex { case (partId, it) =>
    it.zipWithIndex.map { case (el, ind) => (el, ind + partId * constantLong) }
  }
#endifSpark0.9.1
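For reference, here is a self-contained sketch of the Spark 0.9.1 fallback branch above. The input RDD, the partition count, and the value of constantLong are placeholders I have filled in for illustration; in particular, constantLong has to be at least the size of the largest partition, or the generated indices will collide.

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

object ZipWithIndexFallback {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local", "zipWithIndex-fallback")

    // Placeholder input; in the real application this is an RDD[String] of document ids.
    val listOfDocIds: RDD[String] = sc.parallelize(Seq("a", "b", "c", "d"), 2)

    // Assumed upper bound on partition size; indices are only unique if no
    // partition holds more than constantLong elements.
    val constantLong: Long = 1000000L

    // Emulates Spark 1.1.0's zipWithIndex() on 0.9.1: each partition writes
    // into its own block of indices, so ids are unique but not contiguous.
    val docIdtoSeq: RDD[(String, Long)] =
      listOfDocIds.mapPartitionsWithIndex { case (partId, it) =>
        it.zipWithIndex.map { case (el, ind) => (el, ind + partId * constantLong) }
      }

    docIdtoSeq.collect().foreach(println)
    sc.stop()
  }
}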
Many thanks