How can I convert a multi-dimensional SQL JSON string into a valid JSON object so that it can be mapped to the Elasticsearch index template field types?
Example case:
Let's assume this is my Elasticsearch index template:
{
  "settings": {
    "number_of_shards": "5"
  },
  "mappings": {
    "_doc": {
      "properties": {
        "manager": {
          "properties": {
            "age": {
              "type": "integer"
            },
            "name": {
              "type": "text"
            }
          }
        },
        "employees": {
          "type": "nested",
          "properties": {
            "age": {
              "type": "integer"
            },
            "name": {
              "type": "text"
            },
            "addresses": {
              "type": "nested",
              "properties": {
                "street": {
                  "type": "text"
                },
                "city": {
                  "type": "text"
                }
              }
            }
          }
        }
      }
    }
  }
}
The problematic part is the JSON conversion of the "employees" array and its inner "addresses" array.
Example SQL data for "employees" (note that "addresses" arrives as an escaped JSON string inside each employee):
[{"age":30,"name":"User 1","addresses":"[{\"street\":\"Home street 1\",\"city\":\"New York\"},{\"street\":\"Home street 2\",\"city\":\"Washington\"}]"}]
Logstash filter configuration: the conversion works when "employees" does not contain a nested array, but it fails when the array itself contains a nested array such as "addresses".
filter {
  ruby {
    code => "
      require 'json'
      # Parse the employees column from a JSON string into a Ruby array
      employees_json_value = JSON.parse(event.get('employees').to_s)
      event.set('employees', employees_json_value)
    "
  }
}
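I assume a second parsing pass over each employee is needed, something along these lines (a sketch only; it assumes "addresses" always arrives as an escaped JSON string), but I am not sure this is the right approach:

filter {
  ruby {
    code => "
      require 'json'
      # First pass: parse the employees column itself
      employees = JSON.parse(event.get('employees').to_s)
      # Second pass: each employee still carries addresses as an
      # escaped JSON string, so parse it too (assumption: it is
      # always a string when present)
      employees.each do |employee|
        if employee['addresses'].is_a?(String)
          employee['addresses'] = JSON.parse(employee['addresses'])
        end
      end
      event.set('employees', employees)
    "
  }
}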
The question is: how can I convert the nested "addresses" array inside the "employees" column into a JSON object so that it can be mapped to the Elasticsearch index template?