I am looking for a way to sync collections in MongoDB with Elasticsearch (ES). The goal is to keep MongoDB as the primary data source and use ES as a full-text search engine. (The business logic of my project is written in Python.)
Several approaches are available online:
- mongo-connector
- River plugin
- logstash-input-mongodb (Logstash plugin; see similar question)
- Transporter
However, most of these suggestions are several years old, and I could not find any solution that supports the current version of ES (7.4.0). Is anyone using such a setup? Do you have any suggestions?
I have also thought about dropping MongoDB as the primary data source and using ES alone for both storage and search, though I have read that ES should not be used as a primary data store.
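If no maintained tool exists, I could also imagine writing the sync myself in Python. Below is a rough sketch of what I have in mind; it assumes MongoDB runs as a replica set (change streams require one), pymongo and the official Elasticsearch Python client, and uses the same placeholder database/collection/index names as my config further down.

    # Minimal sync sketch (assumption: MongoDB runs as a replica set, since change
    # streams require one; "mydb", "mycollection" and "myindex" are placeholders).
    from pymongo import MongoClient
    from elasticsearch import Elasticsearch

    mongo = MongoClient("mongodb://<myserver>:27017")
    es = Elasticsearch(["http://localhost:9200"])

    collection = mongo["mydb"]["mycollection"]

    # Watch the collection and mirror every insert/update into the ES index,
    # using the MongoDB _id as the ES document id.
    with collection.watch(full_document="updateLookup") as stream:
        for change in stream:
            doc = change.get("fullDocument")
            if doc is None:  # e.g. delete events carry no fullDocument
                continue
            doc_id = str(doc.pop("_id"))
            # Nested ObjectIds/dates would still need conversion before indexing.
            es.index(index="myindex", id=doc_id, body=doc)

But I would prefer an existing, maintained solution over rolling my own.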
Edit
Thank you @gurdeep.sabarwal. I followed your approach, but I still cannot get MongoDB to sync to ES. My configuration looks like this:
input {
  jdbc {
    # jdbc_driver_library => "/usr/share/logstash/mongodb-driver-3.11.0-source.jar"
    jdbc_driver_library => "/usr/share/logstash/mongojdbc1.5.jar"
    # jdbc_driver_library => "/usr/share/logstash/mongodb-driver-3.11.1.jar"
    # jdbc_driver_class => "mongodb.jdbc.MongoDriver"
    # jdbc_driver_class => "Java::com.mongodb.MongoClient"
    jdbc_driver_class => "Java::com.dbschema.MongoJdbcDriver"
    # jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    # jdbc_driver_class => ""
    jdbc_connection_string => "jdbc:mongodb://<myserver>:27017/<mydb>"
    jdbc_user => "user"
    jdbc_password => "pw"
    statement => "db.getCollection('mycollection').find({})"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200/"]
    index => "myindex"
  }
}
This gets me a bit closer to my goal, but I get the following error:
Error: Java::com.dbschema.MongoJdbcDriver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Exception: LogStash::ConfigurationError
Since this did not work, I also tried the commented-out variants, but without success.