
I would like to log the error on exception and continue with the next record/split, but it does not work.

I tried the onException() and doTry() DSLs, but neither works; the exchange still goes to the ErrorHandler.

onException(IOException.class)
.handled(true).process(exchange -> log.error("error!!"));

from("file:" + rootDir + "/" + account + "/inbox/?move=.done")
.unmarshal(csvDataFormat)
.split(body()).shareUnitOfWork().parallelProcessing().streaming()
.process(fileService)
.end()

Logs:

2018-07-18 14:01:59.883 DEBUG 45137 --- [/test1/request/] o.a.camel.processor.MulticastProcessor   : Parallel processing failed due IOException reading next record: java.io.IOException: (line 4) invalid char between encapsulated token and delimiter
2018-07-18 14:01:59.885 ERROR 45137 --- [/test1/request/] o.a.camel.processor.DeadLetterChannel    : Failed delivery for (MessageId: ID-**********-local-1531936914834-0-3 on ExchangeId: ID-*********-local-1531936914834-0-4). On delivery attempt: 0 caught: java.lang.IllegalStateException: IOException reading next record: java.io.IOException: (line 4) invalid char between encapsulated token and delimiter
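The stack trace shows the `IOException` arriving wrapped in an `IllegalStateException` ("IOException reading next record"). Whether a matcher keyed on `IOException` fires therefore depends on the cause chain being walked. Below is a minimal, hypothetical helper (not part of the route or of Camel's API) that illustrates cause-chain matching against an exception wrapped the way the log shows:

```java
import java.io.IOException;

public class CauseCheck {
    // Hypothetical illustration: walks the getCause() chain to see whether
    // a nested exception of the given type is buried inside a wrapper.
    public static boolean causedBy(Throwable t, Class<? extends Throwable> type) {
        for (Throwable c = t; c != null; c = c.getCause()) {
            if (type.isInstance(c)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // Mirrors the wrapping seen in the log above.
        Throwable wrapped = new IllegalStateException(
                "IOException reading next record",
                new IOException("(line 4) invalid char between encapsulated token and delimiter"));
        System.out.println(causedBy(wrapped, IOException.class)); // true
    }
}
```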
  • Is `IOException` thrown in `fileService`, or in `file2` component? If you want to catch exception thrown from `file2` component, you need to set URI attribute `consumer.bridgeErrorHandler` to true – Bedla Jul 18 '18 at 17:37
  • `IOException` is thrown in `.unmarshal(csvDataFormat)` - when a record is invalid – Ronak Patel Jul 18 '18 at 17:54
  • added logs - it seems exception is thrown in parallel processing – Ronak Patel Jul 18 '18 at 18:07
  • I have tried it with some invalid CSV and `IOException` is not thrown directly in Camel 2.21.0, it is wrapped with `RuntimeException` in `org.apache.commons.csv.CSVParser#getNextRecord`. Does `onException(RuntimeException.class)` work? – Bedla Jul 18 '18 at 18:21
  • I tried, it does not continue to process next rec(split): `onException(RuntimeException.class).log("error!!!").handled(true);` – Ronak Patel Jul 18 '18 at 18:44
  • But the exception is probably thrown in the `unmarshal` row, before the splitting happens (you can test it by adding `.log("something")` right after `.unmarshal()`). You can unmarshal it on a row-by-row basis: `from(...).split(body().tokenize("\n")).shareUnitOfWork().parallelProcessing().streaming().unmarshal(csvDataFormat).process(...)...` – Bedla Jul 18 '18 at 19:07
  • I tried multiple ways; the file ends up in the DLC with `Caused by: No type converter available to convert from type: java.lang.Byte to the required type: java.io.InputStream with value 116.` – Ronak Patel Jul 20 '18 at 21:38
  • OK. Still I think, it should work. Attaching my unit test, maybe this can help you find the real cause. https://pastebin.com/EXYWkx2x – Bedla Jul 20 '18 at 21:59
  • so `CsvDataFormat csvDataFormat = new CsvDataFormat().setLazyLoad(true).setUseMaps(true);`, your example works, but with this format, using maps and header with `tokenize("\n")` I get empty body (may be because of streaming line by line format don't get the header), After defining header in format it works, but I wanted to give client flexibility to send csv fields in any order, (reason using map in format is, need for converting csv record to json) - Thank you – Ronak Patel Jul 21 '18 at 12:50
  • @Bedla - would `setBody` on each split would impact performance of huge file processing - in my answer below? – Ronak Patel Jul 21 '18 at 15:11

1 Answer


@Bedla, thank you for your input. I found the following working for my use case:

  • Using onException() still sent the exchange to the DeadLetterChannel, so I had to use doTry() instead.
  • CsvDataFormat with maps: I couldn't modify the csvDataFormat inside a processor, so I had to read the header from the file and prepend it to the body on each split using setBody.
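The header-prepend step can be illustrated outside Camel. This sketch (hypothetical class and names, plain Java only) shows what the `setBody` in the route does: each split row gets the original CSV header stitched back on so a map-based CSV format can unmarshal the chunk on its own:

```java
import java.util.ArrayList;
import java.util.List;

public class HeaderPrepend {
    // Prepend the CSV header to a single data row so each split chunk
    // is a self-describing two-line CSV document.
    public static String withHeader(String header, String row) {
        return header + "\n" + row;
    }

    public static void main(String[] args) {
        String csv = "id,name\n1,alice\n2,bob";
        String[] lines = csv.split("\n");
        String header = lines[0]; // same role as the CSV_HEADER exchange header

        List<String> chunks = new ArrayList<>();
        for (int i = 1; i < lines.length; i++) {
            chunks.add(withHeader(header, lines[i]));
        }
        // chunks.get(0) is "id,name\n1,alice" - header plus one record,
        // which is what each split exchange body looks like after setBody.
        System.out.println(chunks.get(0));
    }
}
```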

Full Route Definition:

CsvDataFormat csvDataFormat = new CsvDataFormat().setUseMaps(true);

from("file:" + rootDir + "/test/")
                .log(LoggingLevel.INFO,"Start processing ${file:name}")
                .unmarshal().pgp(pgpFileName,pgpUserId,pgpPassword)
                .process(exchange -> { /* just to get csv header */
                    InputStream inputStream = exchange.getIn().getBody(InputStream.class);
                    try(BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(inputStream))){
                        String header = bufferedReader.readLine();
                        exchange.getIn().setHeader("CSV_HEADER",header);
                        csvDataFormat.setHeader(header.split(",")); //<- this does not work, so had to add in body below!
                        System.out.println("csvHeader is : " + header);// + " ? " + Arrays.asList(csvDataFormat.getHeader()));
                    }
                })
                .split(body().tokenize("\n")).shareUnitOfWork()
                .parallelProcessing().streaming()
                .setBody(exchange -> exchange.getIn().getHeader("CSV_HEADER") + "\n" + exchange.getIn().getBody())
                .doTry()
                  .unmarshal(csvDataFormat)
                  .process(requestFileService)
                .doCatch(IOException.class)
                  //TODO: custom processing here...
                  .process(exchange -> log.error("caught in dotry: " + exchange.getIn().getBody())).stop()
                .end()//end try/catch
                .choice()
                    .when(simple("${property." + Exchange.SPLIT_COMPLETE + "} == true"))
                    .log(LoggingLevel.INFO, "Finished processing ${file:name}")
                .end();