
I'm trying to send data to Elasticsearch using Logagent. There doesn't seem to be any error while sending, but the index is never created on the Elasticsearch side: when I try to create a new index pattern for it in the Kibana GUI, the index does not exist. This is my logagent.conf right now:

input:
#  bro-start:
#    module: command
#    # store BRO logs in /tmp/bro in JSON format
#    command: mkdir /tmp/bro; cd /tmp/bro; /usr/local/bro/bin/bro -i eth0 -e 'redef LogAscii::use_json=T;'
#    sourceName: bro
#    restart: 1
  # read the BRO logs from the file system ...
  files:
      - '/usr/local/bro/logs/current/*.log'
parser:
  json:
    enabled: true
    transform: !!js/function >
      function (sourceName, parsed, config) {
        var src = sourceName.split('/')
        // generate Elasticsearch _type out of the log file sourceName
        // e.g. "dns" from /tmp/bro/dns.log
        if (src && src[src.length-1]) {
          parsed._type = src[src.length-1].replace(/\.log/g,'')
        }
        // store log file path in each doc
        parsed.logSource = sourceName
        // convert Bro timestamps to JavaScript timestamps
        if (parsed.ts) {
          parsed['@timestamp'] = new Date(parsed.ts * 1000)
        }
      }
output:
  stdout: false
  elasticsearch:
    module: elasticsearch
    url: http://10.10.10.10:9200
    index: bro_logs
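In case it helps anyone reading: the transform function itself can be sanity-checked outside Logagent. This is a standalone sketch of the same logic with a made-up sample path and timestamp, runnable with Node.js:

```javascript
// Standalone version of the transform function from the config above,
// so the field derivation can be checked without running Logagent.
function transform(sourceName, parsed) {
  var src = sourceName.split('/')
  // derive _type from the log file name, e.g. "dns" from .../dns.log
  if (src && src[src.length - 1]) {
    parsed._type = src[src.length - 1].replace(/\.log/g, '')
  }
  // store the log file path in each doc
  parsed.logSource = sourceName
  // Bro timestamps are in seconds; JavaScript Dates take milliseconds
  if (parsed.ts) {
    parsed['@timestamp'] = new Date(parsed.ts * 1000)
  }
  return parsed
}

var doc = transform('/usr/local/bro/logs/current/dns.log', { ts: 1500000000 })
console.log(doc._type)      // "dns"
console.log(doc.logSource)  // "/usr/local/bro/logs/current/dns.log"
```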

Maybe I have to create the index mappings manually? I don't know.
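One quick way to check whether the index exists at all, independent of Kibana, is to ask Elasticsearch directly (assuming the same host as in the config above):

```shell
# List all indices on the Elasticsearch node
curl 'http://10.10.10.10:9200/_cat/indices?v'

# Or check the target index specifically
curl 'http://10.10.10.10:9200/_cat/indices/bro_logs?v'
```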

Thank you for any advice or insight!

David Hoelzer
V. Zed

1 Answer


I found out that there actually was an error. I was trying to send credentials via a field called "auth" in the elasticsearch output, but no such field exists. Embedding the credentials in the URL works, though: url: https://USERNAME:PASSWORD@10.10.10.10:9200
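For reference, this is what the working output section looks like (host and credentials are placeholders):

```yaml
output:
  stdout: false
  elasticsearch:
    module: elasticsearch
    # no "auth" field exists; credentials go in the URL itself
    url: https://USERNAME:PASSWORD@10.10.10.10:9200
    index: bro_logs
```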

V. Zed