
Is it possible in Hadoop to ignore a file and continue reading the others? I am not talking about an exception thrown when a single line is corrupted; I mean an exception thrown by the InputFormat's RecordReader itself. I want my job to continue and deal with that file later.

Thanks.

danilo
  • I think you can write a wrapper around `RecordReader` and handle the exception as you want. [Hadoop: Custom RecordReader – Processing](https://hadoopi.wordpress.com/2013/05/31/custom-recordreader-processing-string-pattern-delimited-records/). – YoungHobbit Sep 14 '15 at 16:30
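
Following up on YoungHobbit's suggestion, below is a minimal sketch of such a wrapper, assuming the job reads text files via `TextInputFormat`/`LineRecordReader`; the class names `SkipBadFileInputFormat` and `SkipBadFileRecordReader` are made up for illustration. Any exception raised while initializing or reading the split is caught and the split is treated as exhausted, so the task finishes normally and the job keeps running:

```java
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.LineRecordReader;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

// Hypothetical input format that swallows RecordReader failures so the job
// moves on to the next split instead of failing the task.
public class SkipBadFileInputFormat extends TextInputFormat {

    @Override
    public RecordReader<LongWritable, Text> createRecordReader(InputSplit split,
                                                               TaskAttemptContext context) {
        return new SkipBadFileRecordReader();
    }

    // Wrapper around LineRecordReader: any exception marks the split as broken,
    // after which nextKeyValue() simply reports "no more records".
    public static class SkipBadFileRecordReader extends RecordReader<LongWritable, Text> {

        private final LineRecordReader delegate = new LineRecordReader();
        private boolean broken = false;

        @Override
        public void initialize(InputSplit split, TaskAttemptContext context) {
            try {
                delegate.initialize(split, context);
            } catch (Exception e) {
                // Could record the bad file's path here (e.g. in a counter or side file)
                // so it can be dealt with later.
                broken = true;
            }
        }

        @Override
        public boolean nextKeyValue() {
            if (broken) {
                return false;
            }
            try {
                return delegate.nextKeyValue();
            } catch (Exception e) {
                broken = true;   // stop reading this split; the job keeps running
                return false;
            }
        }

        @Override
        public LongWritable getCurrentKey() throws IOException, InterruptedException {
            return delegate.getCurrentKey();
        }

        @Override
        public Text getCurrentValue() throws IOException, InterruptedException {
            return delegate.getCurrentValue();
        }

        @Override
        public float getProgress() throws IOException, InterruptedException {
            return broken ? 1.0f : delegate.getProgress();
        }

        @Override
        public void close() throws IOException {
            delegate.close();
        }
    }
}
```

The job would then use it via `job.setInputFormatClass(SkipBadFileInputFormat.class)`. To "deal with that file later" you could log the failing path inside the catch blocks (it is available from the split) and reprocess those files in a follow-up job; that bookkeeping is left out of the sketch.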
