
I have my ELK stack deployed on an EC2 instance and a dockerized application running on a different instance. I am trying to use the gelf driver to collect the logs from the different services and send them to Logstash, but my current configuration doesn't work.

Here's my docker.yaml file and my Logstash conf file. For the gelf address I used the private IP of the instance where Logstash is running; is that what I should be using in this case? What am I missing?

version: '3'
services:
  app:
    build: .
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    links:
      - redis:redis
    depends_on:
      - redis
    logging:
      driver: gelf
      options:
        gelf-address: "udp://10.0.1.98:12201"
        tag: "dockerlogs"
  redis:
    image: "redis:alpine"
    expose:
      - "6379"
    logging:
      driver: gelf
      options:
        gelf-address: "udp://10.0.1.98:12201"
        tag: "redislogs"

This is my logstash conf:

input {
  beats {
    port => 5044
  }
  gelf {
    port:12201
    type=> "dockerLogs"
  }
}
output {
  elasticsearch {
    hosts => ["${ELK_IP}:9200"]
    index =>"logs-%{+YYYY.MM.dd}"
  }
}
Didi

1 Answer


Check your Docker version, and verify that your Logstash configuration syntax is correct: gelf input settings use the `option => value` form, so `port:12201` is invalid.
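For reference, a corrected gelf input block would look like this (same port and type as in the question, just with valid syntax):

```
input {
  gelf {
    port => 12201          # UDP listener for the Docker gelf driver
    type => "dockerLogs"
  }
}
```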

Docker resolves the gelf address through the host's network (the logging driver runs in the Docker daemon, not inside the container), so the address needs to be one the Docker host can actually reach, typically the external address of the Logstash server.
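In other words, the private IP from the question (10.0.1.98) only works if both instances are in the same network and the security group allows inbound UDP on 12201; otherwise the logging section needs an address reachable from the host, roughly:

```yaml
    logging:
      driver: gelf
      options:
        # must be resolvable/reachable from the Docker host, not the container
        gelf-address: "udp://<logstash-host-reachable-address>:12201"
        tag: "dockerlogs"
```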

Also, why not write directly to Elasticsearch? You are only shipping application logs and not applying any Logstash filters, so you aren't getting the benefits of Logstash here.
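Docker has no Elasticsearch logging driver, so "directly" usually means keeping the default json-file driver and shipping the container logs with Filebeat. A minimal sketch of a filebeat.yml, assuming Filebeat runs on the application host and `ELK_IP` stands for your Elasticsearch address as in the question:

```yaml
filebeat.inputs:
  - type: container            # reads Docker's json-file container logs
    paths:
      - /var/lib/docker/containers/*/*.log

output.elasticsearch:
  hosts: ["${ELK_IP}:9200"]
```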

see also: Using docker-compose with GELF log driver

ambergupta09