
I've configured the dead letter queue in logstash

logstash.yml:

 dead_letter_queue.enable: true

I didn't configure `path.dead_letter_queue`, so it uses the default path `path.data/dead_letter_queue`.
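For reference, this is roughly what the relevant logstash.yml section looks like. The commented-out path and the `max_bytes` value below are illustrative (they show the documented defaults), not part of my actual config:

```yaml
# Enable the dead letter queue for the elasticsearch output
dead_letter_queue.enable: true

# Optional: where DLQ segments are written
# (defaults to path.data/dead_letter_queue)
# path.dead_letter_queue: /usr/share/logstash/data/dead_letter_queue

# Optional: cap on each pipeline's DLQ size (default 1024mb)
dead_letter_queue.max_bytes: 1024mb
```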

Test case:

input {
    # kafka
}

output {
    # elasticsearch
}

I'm consuming a payload from Kafka and writing it into Elasticsearch. I sent a huge payload; by default Elasticsearch allows only 1000 fields per index mapping, so in this case it returns a 400 error:

"status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of total fields [1000] has been exceeded"}}}}
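(As an aside: if the goal were to index these large documents rather than dead-letter them, the per-index field limit can be raised via the index settings API. The index name `my-index` below is a placeholder, and 2000 is an arbitrary example value:

```shell
# Raise the total-fields mapping limit on an existing index
# (index name and limit value are placeholders)
curl -X PUT "localhost:9200/my-index/_settings" \
  -H 'Content-Type: application/json' \
  -d '{"index.mapping.total_fields.limit": 2000}'
```

That said, the question here is about why the rejected events don't show up in the DLQ.)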

Since it's a 400 response, it should go to the DLQ, right? But I'm not able to see any messages in that path, /usr/share/logstash/data/dead_letter_queue.

Did I miss anything in the DLQ config?

  • What do you see when running `du -h ./data` from your LOGSTASH_HOME folder? – Val Sep 15 '22 at 05:29
  • I could see this output. `4.0K ./data/queue` `4.0K ./data/dead_letter_queue 4.0K ./data/plugins/inputs/dead_letter_queue/main 8.0K ./data/plugins/inputs/dead_letter_queue 12K ./data/plugins/inputs 16K ./data/plugins 32K ./data` – dark ninja Sep 15 '22 at 07:28
  • 1
    So you have data in the DLQ. Now you need to use the [dlq input](https://www.elastic.co/guide/en/logstash/current/dead-letter-queues.html#processing-dlq-events) to process them – Val Sep 15 '22 at 07:49
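A minimal pipeline for draining the DLQ along the lines Val suggests might look like the following sketch. The `path` assumes the default location mentioned above, `pipeline_id => "main"` assumes the default pipeline name, and the stdout output is just one way to inspect the failed events:

```
input {
  dead_letter_queue {
    # Must point at the DLQ directory, not a pipeline subdirectory
    path => "/usr/share/logstash/data/dead_letter_queue"
    # Name of the pipeline whose DLQ to read (default is "main")
    pipeline_id => "main"
    # Remember which events have already been read across restarts
    commit_offsets => true
  }
}

output {
  # Print the failed events for inspection
  stdout { codec => rubydebug }
}
```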

0 Answers