
I have a RabbitMQ queue that ingests some docs that need to be transferred to Elasticsearch.

How do I set up Elasticsearch as a consumer, so that ES acts as a consumer of the docs queue?

Possible options:

  1. Using the RabbitMQ river?
  2. Using a RabbitMQ plugin? (How would I do this?)
  3. Other?

Can someone post an example?


2 Answers


On your ELK host, create a Logstash config file, e.g. /etc/logstash/conf.d/anyfile.conf:

input {
   rabbitmq {
      host => "rabbit.example.com"       # RabbitMQ broker to consume from
      queue => "my_queue_name"           # queue holding the docs
      exchange => "my_exchange_name"     # exchange the queue is bound to
      key => "my_logs"                   # routing key for the binding
      durable => true                    # match the durability of the existing queue
   }
}
output {
   elasticsearch {
      host => "elk.example.com"          # Elasticsearch node to index into
   }
}
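
To verify the pipeline end to end, publish a test message to the exchange and check that it shows up in Elasticsearch (by default the Logstash elasticsearch output writes to daily logstash-* indices). Below is a minimal publishing sketch using Python with pika; the library choice is an assumption, and the host/exchange/routing key values simply mirror the config above:

import json

import pika

# Connect to the same broker the Logstash rabbitmq input reads from
# (this assumes the exchange has already been declared on the broker).
connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbit.example.com"))
channel = connection.channel()

# Publish one JSON document with the routing key Logstash is bound to.
channel.basic_publish(
    exchange="my_exchange_name",
    routing_key="my_logs",
    body=json.dumps({"message": "hello from RabbitMQ"}),
)

connection.close()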

Since you're asking about the RabbitMQ river, here is an example, although note that rivers are deprecated as of ES 1.5, so the other solution by Prameswar Lal using Logstash is the preferred approach. This example assumes the RabbitMQ river plugin is installed on the Elasticsearch node.

curl -XPOST localhost:9200/_river/custom_river_name/_meta -d '{
    "type" : "rabbitmq",
    "rabbitmq" : {
        "host" : "localhost",
        "port" : 5672,
        "user" : "guest",
        "pass" : "guest",
        "vhost" : "/",
        "queue" : "elasticsearch",
        "exchange" : "elasticsearch",
        "routing_key" : "elasticsearch",
        "exchange_declare" : true,
        "exchange_type" : "direct",
        "exchange_durable" : true,
        "queue_declare" : true,
        "queue_bind" : true,
        "queue_durable" : true,
        "queue_auto_delete" : false,
        "heartbeat" : "30m",
        "qos_prefetch_size" : 0,
        "qos_prefetch_count" : 10,
        "nack_errors" : true
    },
    "index" : {
        "bulk_size" : 100,
        "bulk_timeout" : "10ms",
        "ordered" : false,
        "replication" : "default"
    }
}'
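
Note that the RabbitMQ river consumes messages whose bodies are in the Elasticsearch bulk API format (newline-delimited action and source lines). Below is a minimal producer sketch, again using Python with pika as an assumed client; the index, type, and document values are made up for illustration:

import pika

# The river expects _bulk-formatted bodies: an action line, then a source line,
# newline-delimited, with a trailing newline.
bulk_body = (
    '{"index": {"_index": "docs", "_type": "doc", "_id": "1"}}\n'
    '{"title": "hello from RabbitMQ"}\n'
)

# Exchange and routing key match the river settings above; with queue_declare,
# queue_bind and exchange_declare set to true, the river sets these up itself.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.basic_publish(exchange="elasticsearch", routing_key="elasticsearch", body=bulk_body)
connection.close()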