
When you read a file in Spark using sc.textFile, it gives you an RDD whose elements are individual lines. However, I want each element to consist of N lines. I can't use delimiters either, because the file has none. So, how can I make Spark give me multi-line elements?

I'm interested in doing this with the NLineInputFormat class. Is that possible in Spark? I can see examples of it for MapReduce, but I have no clue how that would translate to Spark.

eliasah
pythonic

1 Answer


Yes, if you are reading the files from Hadoop. You should be able to do it like this:

import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.NLineInputFormat

// hadoopConf must carry the input path and the lines-per-split setting, e.g.:
// hadoopConf.set("mapreduce.input.fileinputformat.inputdir", "/path/to/input")
// hadoopConf.setInt("mapreduce.input.lineinputformat.linespermap", n)
val records = sc.newAPIHadoopRDD(hadoopConf, classOf[NLineInputFormat], classOf[LongWritable], classOf[Text])

Here's the API doc.
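One caveat worth knowing: NLineInputFormat controls how the input is split (each split receives N lines), but every record in the resulting RDD is still a single line. To actually get elements that are N-line blocks, you still need to group consecutive lines within each partition. A minimal sketch, assuming records within a partition preserve file order (which holds for a single NLineInputFormat split); `groupLines` and `n` are hypothetical names, not part of the Spark API:

```scala
// Combine every n consecutive lines into one newline-joined element.
// Pure Scala, so it can be used inside mapPartitions on the RDD.
def groupLines(lines: Iterator[String], n: Int): Iterator[String] =
  lines.grouped(n).map(_.mkString("\n"))

// Applied to the RDD from the snippet above (values are Text, one line each):
// val blocks = records.map(_._2.toString).mapPartitions(it => groupLines(it, n))
```

The grouping happens per partition, so as long as linespermap matches n, no block straddles a partition boundary.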

Mateusz Dymczyk