
I'm trying to use the Splitter app to split a JSON array, e.g. [{...},{...}], into multiple messages {...} {...}. With input contentType=application/json (per the docs), Spring Cloud surfaces an exception from Jackson:

com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of java.lang.String out of START_ARRAY token at [Source: [B@163b1945; line: 1, column: 1]

Unit tests confirmed that I have the correct expression for the split:

splitter.expression=#jsonPath(payload,'$.[*]')

This worked for me in Spring XD 1.3. How should Spring Cloud (or Splitter) be configured to handle this case? Input and output are both Kafka strings (no headers).
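For reference, the transformation the expression is meant to perform can be sketched in plain Python (a hypothetical stand-in for the SpEL/jsonPath evaluation, not the Splitter's actual implementation):

```python
import json

def split_json_array(payload: str) -> list:
    """Mimic what #jsonPath(payload,'$.[*]') should yield:
    one output message per element of the input JSON array."""
    items = json.loads(payload)            # [{...}, {...}] -> list of dicts
    return [json.dumps(item) for item in items]

# '[{"id": 1}, {"id": 2}]' becomes two separate messages,
# one per array element
messages = split_json_array('[{"id": 1}, {"id": 2}]')
```

The failure in the question happens before this step: the inbound converter tries to deserialize the array payload into a String and hits the START_ARRAY token.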

  • I can also add that using input.contentType=text/plain produces this somewhat arcane exception: Exception thrown when sending a message with key='null' and payload='{-1, 5, 13, 99, 111, 114, 114, 101, 108, 97, 116, 105, 111, 110, 73, 100, 0, 0, 0, 38, 34, 102, 55, ...' – Kevin Niemann Feb 14 '17 at 21:57
  • Are you sure you are using data flow stream in this case? or, do you mean just a Spring Cloud stream app that receives data from kafka topic? – Ilayaperumal Gopinathan Feb 15 '17 at 18:43

1 Answer


If the messages arriving at the splitter come from non-Spring-Cloud-Stream applications, then you need to set --spring.cloud.stream.bindings.<inputChannelName>.consumer.headerMode=raw. While application/json contentType messages still hit the Jackson exception, this at least gets text/plain contentType messages evaluated against the expression correctly.
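A minimal sketch of the configuration described above (the channel name `input` is the default for a processor app; property names assume a Spring Cloud Stream 1.x splitter):

```properties
# Consume raw (header-less) payloads produced by non-Spring-Cloud-Stream apps
spring.cloud.stream.bindings.input.consumer.headerMode=raw
# Evaluate the payload as plain text against the splitter expression
spring.cloud.stream.bindings.input.contentType=text/plain
splitter.expression=#jsonPath(payload,'$.[*]')
```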

Ilayaperumal Gopinathan
  • On the send I am now getting `failed to send Message to channel 'output'; nested exception is java.lang.IllegalArgumentException: payload must not be null`. I turned on the debug logs, there is definitely something in the payload: – Kevin Niemann Feb 16 '17 at 00:50
  • `DEBUG 1311 --- [afka-listener-1] o.s.integration.channel.DirectChannel : preSend on channel 'output', message: GenericMessage [payload={"some": "data"}, headers={sequenceNumber=1, kafka_offset=2663, sequenceSize=1, correlationId=f9c64983-5908-af0c-3b2c-bb9c4bb03e86, id=073d1d21-8474-bdf0-46c8-e4a71e2117b2, kafka_receivedPartitionId=0, kafka_receivedTopic=MyTopic, contentType=te xt/plain, timestamp=1487206118408}]` – Kevin Niemann Feb 16 '17 at 00:51
  • Could I be seeing this issue? http://stackoverflow.com/questions/41781351/spring-cloud-dataflow-type-conversion-not-working-in-processor-component – Kevin Niemann Feb 16 '17 at 01:04
  • May be. But, do you have any debug log on why message conversion fails when sending output? Have you tried setting any explicit output contentType? Also, is that a typo on the contentType=te xt/plain above? – Ilayaperumal Gopinathan Feb 16 '17 at 04:51
  • Yes I'm using "text/plain" on both input and output when I get the null payload. What level would I see the message conversion fail on output? I've set app.*.logging.level.org.springframework=DEBUG – Kevin Niemann Feb 16 '17 at 16:33
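Pulling the suggestions from this thread together, a hedged sketch of deployment properties to try (the `app.splitter.` prefix assumes a Data Flow stream deployment; adjust the app name to match your stream definition):

```properties
# Explicit content types on both bindings, as suggested above
app.splitter.spring.cloud.stream.bindings.input.contentType=text/plain
app.splitter.spring.cloud.stream.bindings.input.consumer.headerMode=raw
app.splitter.spring.cloud.stream.bindings.output.contentType=text/plain
# Surface message-conversion failures in the logs
app.splitter.logging.level.org.springframework.integration=DEBUG
```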