
I'm sending Nginx logs via Filebeat -> Elasticsearch -> Kibana, but I already have an issue with some logs.

It looks like this type of log is parsed without any problem:

66.249.76.123 - - [24/Apr/2020:17:24:51 +0200] "GET / HTTP/1.1" 200 5249 "-" "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

but on the other hand, a similar log:

62.197.243.55 - - [24/Apr/2020:17:29:22 +0200] "GET / HTTP/1.1" 200 5252 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36"

throws an error to syslog:

Apr 24 17:29:31 prodserver filebeat[12562]: 2020-04-24T17:29:31.497+0200#011WARN#011elasticsearch/client.go:517#011Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0xbfa0df569acc421c, ext:321343689727, loc:(*time.Location)(0x5003080)}, Meta:{"pipeline":"filebeat-7.6.2-nginx-access-default"}, Fields:{"agent":{"ephemeral_id":"3d9ae7ae-c460-4e7b-b994-f10a681cc10b","hostname":"prodserver","id":"58d1eb1d-9c09-485d-ad7f-28b0066a0054","type":"filebeat","version":"7.6.2"},"ecs":{"version":"1.4.0"},"event":{"dataset":"nginx.access","module":"nginx","timezone":"+02:00"},"fileset":{"name":"access"},"host":{"name":"prodserver"},"input":{"type":"log"},"log":{"file":{"path":"/var/log/nginx/denevy.access.log"},"offset":483636},"message":"62.197.243.55 - - [24/Apr/2020:17:29:22 +0200] \"GET / HTTP/1.1\" 200 5252 \"-\" \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36\"","service":{"type":"nginx"}}, Private:file.State{Id:"", Finished:false, Fileinfo:(*os.fileStat)(0xc0008c4d00), Source:"/var/log/nginx/denevy.access.log", Offset:483837, Timestamp:time.Time{wall:0xbfa0df0e51ae7c7f, ext:32190743674, loc:(*time.Location)(0x5003080)}, TTL:-1, Type:"log", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0x2466a, Device:0xfc00}}, TimeSeries:false}, Flags:0x1, Cache:publisher.EventCache{m:common.MapStr(nil)}} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse field [user_agent.version] of type [date] in document with id 'ZJ_OrHEBZWkJKYxN4WlY'. Preview of field's value: '80.0.3987.163'","caused_by":{"type":"illegal_argument_exception","reason":"failed to parse date field [80.0.3987.163] with format [strict_date_optional_time||epoch_millis]","caused_by":{"type":"date_time_parse_exception","reason":"Failed to parse with all enclosed parsers"}}}

The problem starts here:

{"type":"mapper_parsing_exception","reason":"failed to parse field [user_agent.version] of type [date] in document with id 'ZJ_OrHEBZWkJKYxN4WlY'. Preview of field's value: '80.0.3987.163'","caused_by":{"type":"illegal_argument_exception","reason":"failed to parse date field [80.0.3987.163] with format [strict_date_optional_time||epoch_millis]","caused_by":{"type":"date_time_parse_exception","reason":"Failed to parse with all enclosed parsers"}}}

Any idea why 99% of my logs are indexed fine, but logs with a Chrome user agent ("(KHTML, like Gecko) Chrome/...") fail with failed to parse field [user_agent.version]?
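One way to confirm how that field is currently mapped is the field mapping API in Kibana Dev Tools (assuming the default filebeat-* index pattern):

GET filebeat-*/_mapping/field/user_agent.version

If the result shows "type": "date" instead of "keyword", the index mapping is the culprit rather than the ingest pipeline.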

Using Filebeat/Elasticsearch/Kibana - version 7.6.2

Thanks in advance.

1 Answer


Are you using the standard Filebeat index template, or your own template?

I had an identical problem because I was using my own template that did not have the "user_agent" fields defined. I added them by copying them from the Filebeat template and, so far, it has not given me any problems.
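Without those definitions, Elasticsearch guesses the type of user_agent.version from the first value it sees; a value that happens to look like a date can lock the field to the date type, and later values such as 80.0.3987.163 are then rejected. As a rough sketch (the template name my-nginx-template and the filebeat-* pattern are placeholders; the real Filebeat template defines more user_agent sub-fields than shown here), the relevant mapping looks like this:

PUT _template/my-nginx-template
{
  "index_patterns": ["filebeat-*"],
  "mappings": {
    "properties": {
      "user_agent": {
        "properties": {
          "name":     { "type": "keyword" },
          "original": { "type": "keyword" },
          "version":  { "type": "keyword" },
          "os": {
            "properties": {
              "name":    { "type": "keyword" },
              "version": { "type": "keyword" }
            }
          }
        }
      }
    }
  }
}

Note that an updated template only applies to indices created after the change, so an existing index that already has user_agent.version mapped as date will keep rejecting those events until it rolls over or is reindexed.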

JavierR