I am trying to set up Solr alongside a Postgres database that I use through the Flask-SQLAlchemy ORM. I found the pysolr library for this purpose, but it is not clear how to set up hooks within the SQLAlchemy models to update the Solr index. Are there any examples?
pysolr suggests inserting documents manually via solr.add, but it's not clear how you would keep the documents for different database tables separate in the index.
After doing some research I came up with the following approach, and I am wondering if it is the right way to go:
In the ORM models, hook the after_insert, after_update, and after_delete mapper events (and after_commit on the session), and add/update/remove the object's data in Solr from those handlers.
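Here is a minimal sketch of that step, assuming SQLAlchemy 1.4+. The `Note` model and the `FakeSolr` class are illustrative; `FakeSolr` only mimics the `add`/`delete` call shapes of a real `pysolr.Solr("http://localhost:8983/solr/mycore")` client so the wiring can be shown end to end.

```python
from sqlalchemy import Column, Integer, String, event
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Note(Base):
    __tablename__ = "notes"
    id = Column(Integer, primary_key=True)
    body = Column(String)

class FakeSolr:
    """Stand-in with pysolr's add/delete call shapes; replace with pysolr.Solr(...)."""
    def __init__(self):
        self.docs = {}
    def add(self, docs):
        for doc in docs:
            self.docs[doc["id"]] = doc
    def delete(self, id=None):
        self.docs.pop(id, None)

solr = FakeSolr()

# Mirror ORM writes into the index: these fire on flush, after the row
# is written, so target.id is already populated.
@event.listens_for(Note, "after_insert")
@event.listens_for(Note, "after_update")
def index_note(mapper, connection, target):
    solr.add([{"id": f"{Note.__tablename__}:{target.id}", "body": target.body}])

@event.listens_for(Note, "after_delete")
def unindex_note(mapper, connection, target):
    solr.delete(id=f"{Note.__tablename__}:{target.id}")
```

One caveat: these events fire at flush time, before the transaction commits, so a rollback can leave Solr out of sync; that is one reason to also listen on the session's after_commit.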
To segregate the data of different models, use the table name as a prefix in the "id" field of the Solr documents: solr_id = db_table_name + db_id.
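The scheme could be captured in a pair of helpers like these (hypothetical names; the `:` separator is a free choice, as long as it cannot occur in a table name):

```python
def make_solr_id(table_name: str, db_id: int) -> str:
    """Compose the Solr document id from table name and primary key."""
    return f"{table_name}:{db_id}"

def split_solr_id(solr_id: str) -> tuple[str, int]:
    """Recover (table_name, db_id) from a composite Solr id."""
    table_name, _, db_id = solr_id.partition(":")
    return table_name, int(db_id)
```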
When you do a search, fetch all the results, manually filter out the ones belonging to the required table, extract the database ids, look those ids up in the database, and use those rows as the results.
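That filter-and-extract step might look like this, assuming the "table:id" scheme above. The `results` list stands in for what pysolr's `solr.search(...)` yields (an iterable of stored-field dicts); the query against the ORM is shown as a comment:

```python
def ids_for_table(results, table_name):
    """Keep only hits from one table and recover their database ids."""
    prefix = table_name + ":"
    return [int(doc["id"][len(prefix):])
            for doc in results
            if doc["id"].startswith(prefix)]

# then, e.g.:
#   ids = ids_for_table(solr.search("body:hello"), "notes")
#   notes = session.query(Note).filter(Note.id.in_(ids)).all()
```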
Is there a better way to go about doing this? Thanks.