We're using the Serilog HTTP sink to send log messages to Logstash, but the HTTP message body looks like this:
{
  "events": [
    {
      "Timestamp": "2016-11-03T00:09:11.4899425+01:00",
      "Level": "Debug",
      "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
      "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
      "Properties": {
        "Heartbeat": {
          "UserName": "Mike",
          "UserDomainName": "Home"
        },
        "Computer": "Workstation"
      }
    },
    {
      "Timestamp": "2016-11-03T00:09:12.4905685+01:00",
      "Level": "Debug",
      "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
      "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
      "Properties": {
        "Heartbeat": {
          "UserName": "Mike",
          "UserDomainName": "Home"
        },
        "Computer": "Workstation"
      }
    }
  ]
}
i.e. the logging events are batched into an array. It is possible to send the messages one by one, but even then the body is a one-item array.
The event is then displayed in Kibana as having a message field whose value is
{
  "events": [
    {
      // ...
    },
    {
      // ...
    }
  ]
}
i.e. literally what came from the HTTP input.
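
For reference, the Logstash input is essentially the following (a minimal sketch; the port number is made up, and we are not using a json codec, which is presumably why the whole request body ends up as a string in the message field):

    input {
      http {
        # Hypothetical port. With the default plain codec the body
        # is not parsed, so the entire JSON payload lands in "message".
        port => 8080
      }
    }
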
How can I split the items in the events array into individual logging events and "pull up" the properties to the top level, so that I would end up with two logging events in Elasticsearch:
"Timestamp": "2016-11-03T00:09:11.4899425+01:00",
"Level": "Debug",
"MessageTemplate": "Logging {@Heartbeat} from {Computer}",
"RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
"Properties": {
"Heartbeat": {
"UserName": "Mike",
"UserDomainName": "Home"
},
"Computer": "Workstation"
}
"Timestamp": "2016-11-03T00:09:12.4905685+01:00",
"Level": "Debug",
"MessageTemplate": "Logging {@Heartbeat} from {Computer}",
"RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
"Properties": {
"Heartbeat": {
"UserName": "Mike",
"UserDomainName": "Home"
},
"Computer": "Workstation"
}
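
My guess is that the answer involves the split filter, something along these lines (an untested sketch assuming the json, split and ruby filter plugins and the Logstash 5.x event API; I don't know whether this is the idiomatic way):

    filter {
      # Parse the raw request body that the http input put into "message".
      json {
        source => "message"
      }
      # Clone the event once per element of the "events" array;
      # afterwards each event's "events" field holds a single log entry.
      split {
        field => "events"
      }
      # Hoist every key of that entry to the top level.
      ruby {
        code => "event.get('events').each { |k, v| event.set(k, v) }"
      }
      # Drop the now-redundant wrapper fields.
      mutate {
        remove_field => [ "events", "message" ]
      }
    }

Is this the right approach, or is there a cleaner way to do it?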