
I am using Python with ecs-logging (https://www.elastic.co/guide/en/ecs-logging/python/current/installation.html). It outputs to a file.

Then I have Logstash reading the logs. Here is an example of a log line:

{"@timestamp":"2022-03-31T11:55:49.303Z","log.level":"warning","message":"Cannot get float field. target_field: fxRate","ecs":{"version":"1.6.0"},"log":{"logger":"parser.internal.convertor","origin":{"file":{"line":317,"name":"convertor.py"},"function":"__get_double"},"original":"Cannot get float field. target_field: fxRate"},"process":{"name":"MainProcess","pid":15124,"thread":{"id":140000415979328,"name":"MainThread"}},"service":{"name":"Parser"},"trace":{"id":"264c816a6cdd1f92a26dfad80bdc3e91"},"transaction":{"id":"a8a1ed2ab0b38ca0"}}

Here is the config of my logstash:

input {
    file {
        path => ["/usr/share/logstash/logs/*.log"]
        type => "log"
        start_position => "beginning"
    }
}

filter {
    json {
        # Move keys from 'message' json log to root level
        source => "message"
    }
    mutate {
        id => "Transform"
        # Define the environment such as dev, uat, prod...
        add_field => {
            "environment" => "dev"
        }
        # Rename 'msg' key from json log to 'message'
        rename => {
            "msg" => "message"
        }
        # Add service name from `tag`
        copy => {
            "tag" => "service.name"
        }

    }

}

It seems that Logstash isn't indexing the fields and inserting them into Elasticsearch. As a result, the transaction ID isn't extracted and APM cannot correlate with the logs.

What is the missing part in the Logstash config, and how do I activate log correlation?
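
For reference, the config above only shows the input and filter stages. If events are not reaching Elasticsearch at all, the missing piece would be an output section along these lines; the host and index name below are placeholders, not actual values:

output {
    elasticsearch {
        # Placeholder host and index name -- adjust to your cluster
        hosts => ["http://localhost:9200"]
        index => "parser-logs-%{+YYYY.MM.dd}"
    }
}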

Thanks

Hi @Colton,

Thanks for your reply. I have added some screenshots here to try to clarify the issue.

I see that the document is there; the transaction and trace IDs are there as well.

[screenshot]

I can also see that the field types exist:

[screenshot]

I want to show logs on the APM page:

[screenshot]

After searching the APM index, I see, for example:

[screenshot]

This ID exists in both the log and APM. When I search for this transaction ID in APM, I can see it there:

[screenshot]

Index management:

[screenshot]

sflee
  • Can you clarify what happens with the document in Elasticsearch? Maybe show a sample document? Is logstash not creating a document at all, or is the document wrong in some way? – Colton Myers Apr 01 '22 at 14:53
  • I included more pictures in the question. Thanks – sflee Apr 03 '22 at 12:44
  • So the document is there, has all the log correlation fields... what's the behavior you're expecting? From your screenshots it looks like everything is working as designed. – Colton Myers Apr 04 '22 at 16:55
  • Hi, I updated the question. I would like to show the logs on the APM page, not just push logs and view them in Discover. Thanks – sflee Apr 06 '22 at 12:00
  • Thanks for being patient with all my clarifications! The next thing to check is whether you have logs that *match* with a transaction ID that's also in Elasticsearch. So, pick a log, grab the `transaction.id`, and search for a transaction document in the APM index with that `transaction.id`. I just want to make sure all the data is in and matches before we try to debug why it's not showing up in the APM section of Kibana. – Colton Myers Apr 06 '22 at 16:09
  • Hi @ColtonMyers, thank you so much for your help. It is a good approach for troubleshooting; thank you for your time. I added the screen capture; hopefully I didn't misunderstand the index you are referring to. – sflee Apr 08 '22 at 10:54
  • 1
    You did screenshot a transaction ID, but the important thing is to find a *matching* ID in one of your log documents. The APM UI can only bring in logs with a matching transaction ID, so I wanted to make sure there was one to find. Does that make sense? – Colton Myers Apr 11 '22 at 15:18
  • Hi @ColtonMyers, I searched in APM and I can see the transaction there. Is that the correct place for me to confirm that? Thanks – sflee Apr 17 '22 at 11:07
  • 1
    Another thing to check is that you have your logs in the index configured in your logs UI. The APM app doesn't check all indices for logs. – Colton Myers Apr 19 '22 at 17:28
  • I am not sure which index configuration you are referring to. I added a new screen capture of the index management page; is that what you are looking for? Thanks – sflee Apr 24 '22 at 08:28
  • 1
    Go to the logs section of the Kibana UI, and then go to Settings at the top. Here's a screenshot: https://capture.dropbox.com/AERT8XOwRjlCY4KH -- you need to make sure your log index is included in "Log Indices" – Colton Myers Apr 25 '22 at 16:24
  • I'll add that comment as an answer, then you can mark it. Thank you! – Colton Myers May 01 '22 at 04:17

2 Answers


In order for the APM app to pick up the logs in Kibana, you have to make sure the index which stores your logs is configured in the logs UI in Kibana.

Go to the logs section of the Kibana UI, and then go to Settings at the top. Here's a screenshot:

[screenshot: Logs UI in Kibana]

You need to make sure your log index is included in "Log Indices".

Colton Myers

Try replacing trace.id with trace_id when you store your logs in Elasticsearch, i.e. parse the log fields as trace_id.
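
A sketch of what that rename could look like in the Logstash filter (untested; the nested field references assume the JSON has been parsed to the root as in the question's config, and the transaction.id rename is added by analogy):

filter {
    mutate {
        # Flatten the nested ECS trace/transaction IDs into underscore-style fields
        rename => {
            "[trace][id]" => "trace_id"
            "[transaction][id]" => "transaction_id"
        }
    }
}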

focus zheng