
I'm trying to read JSON data from Kafka using the following code:

@source(type = 'kafka', bootstrap.servers = 'localhost:9092', topic.list = 'TestTopic', 
group.id = 'test', threading.option = 'single.thread', @map(type = 'json'))

define stream myDataStream (json object);

But it failed with the following error:

[2019-03-27_11-39-32_103] ERROR {org.wso2.extension.siddhi.map.json.sourcemapper.JsonSourceMapper} - Stream "myDataStream" does not have an attribute named "ABC", but the received event {"event":{"ABC":"1"}} does. Hence dropping the message. Check whether the json string is in a correct format for default mapping.

I've tried adding the attribute mapping:

@source(type = 'kafka', bootstrap.servers = 'localhost:9092', 
topic.list = 'TestTopic', group.id = 'test', 
threading.option = 'single.thread', 
@map(type = 'json', @attributes(ABC = '$.ABC')))

But this gives a syntax error:

Error at 'json' defined at stream 'myDataStream', attribute 'json' is not mapped

Any help would be greatly appreciated.

– Manu

1 Answer


There is an error in the syntax of the stream definition. It should be:

define stream myDataStream (ABC string);

Here the attribute name must match the key of the incoming JSON message, in this case ABC.
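Putting the source annotation from the question together with the corrected stream definition, the full snippet would look roughly like this (topic, server, and group names are taken from the question):

```
@source(type = 'kafka', bootstrap.servers = 'localhost:9092',
        topic.list = 'TestTopic', group.id = 'test',
        threading.option = 'single.thread',
        @map(type = 'json'))
define stream myDataStream (ABC string);
```

With default JSON mapping, an event like `{"event":{"ABC":"1"}}` is then mapped onto the `ABC` attribute automatically.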

– Niveathika
  • Thanks very much. I will have 100s of key/value pairs coming from Kafka. So do we have to map all those keys to the input stream? Or is there a way to access the JSON as an object and parse the required key out of it? Something like $.ABC or json.ABC? – Manu Mar 27 '19 at 13:49
  • Yes, it is possible, you can use custom mapping to get the required keys only, @map(type = 'json', @attributes(key1 = '$.ABC.key'))) define stream myDataStream(key1 string); – Niveathika Mar 27 '19 at 14:31
  • Thanks very much, that helped up to an extent. My JSON will be `{"event":{"A":"1","B":"2","C":"3"}}` and I added the attribute as `@attributes(json = '$.event')`. But the string that gets assigned to the variable is not in JSON format; it converts to key=value format, for example `{A=1,B=2,C=3}`. Hence, I'm not able to use the JSON further with `json:getString(json,"$.A")`. – Manu Mar 27 '19 at 16:43
  • Or is there a way to convert `{A=1,B=2,C=3}` back to `{"A":"1","B":"2","C":"3"}`, in wso2sp or siddhi, please? – Manu Mar 27 '19 at 17:34
  • Correct me if I am wrong: essentially, rather than mapping attributes at the source, you want to keep the JSON string as it is throughout the analytics flow, for manipulation downstream? – Niveathika Mar 27 '19 at 18:37
  • I will have 100s of key/value pairs coming in through Kafka source, and I'm trying to avoid mapping them individually at `define stream`. Because some of the data will be used for further processing and some will be ignored. If I get the luxury of handling JSON down the flow, I can just pick up the values I need using `json:getString(json,"$.A")` and process further. – Manu Mar 28 '19 at 04:54
  • Yes, in custom mapping you don't have to map all keys, just the keys you need; that is allowed, and you use the stream to pass values. However, if you want to pass the entire JSON, that is also possible in custom mapping: map it as @attributes(json = '$') define stream Test(json string). Here the entire JSON is kept as a string. However, this may have an impact on performance if you pass entire JSONs. Let's move the discussion to GitHub if you have further doubts; it's easier for comments :) https://siddhi-io.github.io/siddhi/ – Niveathika Mar 28 '19 at 06:27
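For completeness, the pass-through approach described in the last comment would look roughly like this. The stream names and the selected key are illustrative, and `json:getString` assumes the siddhi-execution-json extension is available:

```
/* RawStream and ProcessedStream are illustrative names, not from the question */
@source(type = 'kafka', bootstrap.servers = 'localhost:9092',
        topic.list = 'TestTopic', group.id = 'test',
        threading.option = 'single.thread',
        @map(type = 'json', @attributes(json = '$')))
define stream RawStream (json string);

/* Pick only the values you need downstream instead of
   mapping every key at the source */
from RawStream
select json:getString(json, '$.event.A') as a
insert into ProcessedStream;
```

This keeps the whole JSON event as a string in one attribute, at the cost of the per-event parsing overhead Niveathika mentions.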