
Below is my Filebeat configuration file, located at /etc/filebeat/filebeat.yml. It throws the following error:

Failed to publish events: temporary bulk send failure

filebeat.prospectors:
  - paths:
      - /var/log/nginx/virus123.log
    input_type: log
    fields:
      type: virus123
    json.keys_under_root: true

  - paths:
      - /var/log/nginx/virus1234.log
    input_type: log
    fields:
      type: virus1234
    json.keys_under_root: true

setup.template.name: "filebeat-%{[beat.version]}"
setup.template.pattern: "filebeat-%{[beat.version]}-*"
setup.template.overwrite: true

processors:
 - drop_fields:
     fields: ["beat","source"]


output.elasticsearch:
  index: index: "filebeat-%{[beat.version]}-%{[fields.type]:other}-%{+yyyy.MM.dd}"

  hosts: ["http://127.0.0.1:9200"]
Virendra Singh
  • What version of Filebeat are you using? – Green Jun 16 '18 at 14:25
  • Also, the bulk send failure is normally caused by an error on the Elasticsearch side. Knowing which error Elasticsearch returns could be helpful here. – Green Jun 16 '18 at 14:30
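As the comment above suggests, seeing the Elasticsearch-side error usually pinpoints the cause. One way to surface it is to run Filebeat in the foreground with output debugging enabled and to inspect the cluster directly (a sketch, assuming Elasticsearch is reachable at 127.0.0.1:9200 as in the config above):

```shell
# Run Filebeat in the foreground (-e logs to stderr) with the
# Elasticsearch output debug selector enabled, so the bulk response
# from Elasticsearch is printed to the console.
filebeat -e -d "elasticsearch"

# Check what templates and indices actually exist on the
# Elasticsearch side, to compare against the configured index name.
curl -s 'http://127.0.0.1:9200/_cat/templates/filebeat*?v'
curl -s 'http://127.0.0.1:9200/_cat/indices/filebeat*?v'
```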

1 Answer


I think I found your problem, although I'm not sure it's the only one:

index: index: "filebeat-%{[beat.version]}-%{[fields.type]:other}-%{+yyyy.MM.dd}"

should be:

index: "filebeat-%{[beat.version]}-%{[fields.type]:other}-%{+yyyy.MM.dd}"

I've seen a similar problem where a wrong index setting caused the same error you showed.
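Putting it together, the output section of the config would read as follows (a sketch based on the config in the question, with the duplicated `index:` key removed; everything else is unchanged):

```yaml
output.elasticsearch:
  hosts: ["http://127.0.0.1:9200"]
  index: "filebeat-%{[beat.version]}-%{[fields.type]:other}-%{+yyyy.MM.dd}"
```

Note that a custom `index` setting is only honored when `setup.template.name` and `setup.template.pattern` are also set, which the config above already does.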

Green
  • That was a typo; it's not a working solution. I went with Logagent instead, which is super lightweight compared to Logstash. – Virendra Singh Jul 13 '18 at 07:56