
I have a problem with the combination of an Oracle SQL statement and Logstash's automatic escaping of characters like \ and ".

In the SQL statement I have defined, for example:

to_char(milliseconds_to_date(m.originationtime),'yyyy-mm-dd\"T\"hh24:MI')

But in Elasticsearch it is saved as:

"received":"2016-01-05\T\18:46"

I have the same problem with a range of values generated by listagg:

'{' || (select listagg('\"' || cd.name || '\"' || ':'|| '\"' || cd.DATAVALUE || '\"', ', ') within group (order by cd.oid) from customdata cd where cd.messageoid = m.oid) || '}' as MsgAtt
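For reference, stripped of the config-file escaping, the JSON-like string this listagg expression is meant to assemble looks like the following. This is a Ruby sketch with sample name/value pairs taken from the output below; the real keys and values come from the customdata rows:

```ruby
# Sample name/value pairs standing in for the customdata rows:
pairs = [
  ["CycloneIntegrationRegion", "AMIS"],
  ["ToSystem", "FW"],
]

# Mirror of: '{' || listagg('"' || name || '"' || ':' || '"' || datavalue || '"', ', ') || '}'
msgatt = "{" + pairs.map { |k, v| "\"#{k}\":\"#{v}\"" }.join(", ") + "}"

puts msgatt
# => {"CycloneIntegrationRegion":"AMIS", "ToSystem":"FW"}
```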

The output I am getting looks like this:

"msgatt":"{\\\"CycloneIntegrationRegion\\\":\\\"AMIS\\\", \\\"ToSystem\\\":\\\"FW\\\", ...

I have tried to take inspiration from the solution in case 22534325:

filter { 
    mutate { 
        gsub => [
            "msgatt","[\\\\\\]", "",
            "received","[\\]", "",
            "delivered","[\\]", ""
        ] 
    }
}
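Each gsub pattern in the filter is a quoted config string that is compiled to a regex, so a literal backslash has to survive two layers of interpretation; the exact number of backslashes needed in the config file is a known pain point and can vary by Logstash version. What matters is that the pattern reaching the regex engine is an escaped backslash, `\\`. A minimal Ruby sketch of the substitution itself (mutate/gsub is implemented in Ruby; the sample value is taken from the output above):

```ruby
# The field value as stored, with a literal backslash before each quote:
msgatt = '{\"CycloneIntegrationRegion\":\"AMIS\", \"ToSystem\":\"FW\"}'

# The regex /\\/ matches a single literal backslash; gsub removes every match.
cleaned = msgatt.gsub(/\\/, "")

puts cleaned
# => {"CycloneIntegrationRegion":"AMIS", "ToSystem":"FW"}
```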

But with no full success: the timestamps are resolved, but msgatt still contains one backslash per quote.

"msgatt":"{\"CycloneIntegrationRegion\":\"AMIS\", \"ToSystem\":\"FW\", ...

Any idea how to handle this matter?
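One thing worth checking: the remaining backslashes may be nothing more than JSON serialization. When a string value itself contains double quotes, any JSON encoder escapes them as `\"` on output, while the stored value is clean. A quick Ruby check (field name taken from the output above):

```ruby
require "json"

# The actual string value, with no backslashes in it:
msgatt = '{"ToSystem":"FW"}'

# Serializing it as a JSON field escapes the inner quotes on output only.
puts({ "msgatt" => msgatt }.to_json)
# => {"msgatt":"{\"ToSystem\":\"FW\"}"}
```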

Many thanks, Regards, Rudo


1 Answer


Looks like a known issue with no good workaround.

– Alain Collins