I have been investigating the experimental Akka Persistence Query module and am very interested in implementing a custom read journal for my application. The documentation describes two main flavors of queries: those that return the current state of the journal (e.g. CurrentPersistenceIdsQuery) and those that return a subscribable stream that emits events as they are committed to the journal via the write side of the application (e.g. AllPersistenceIdsQuery).
For my contrived application, I am using Postgres and Slick 3.1.1 to drive the guts of these queries. I can successfully stream database query results by doing something like:
import akka.stream.scaladsl.Source
import slick.driver.PostgresDriver.api._

override def allPersistenceIds = {
  val db = Database.forConfig("postgres")
  val metadata = TableQuery[Metadata]
  val query = for (m <- metadata) yield m.persistenceId
  Source.fromPublisher(db.stream(query.result))
}
However, the stream is signaled as complete as soon as the underlying Slick database action finishes. This doesn't seem to fulfill the requirement of a perpetually open stream that is capable of emitting new events.
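For what it's worth, the completion signal itself can be suppressed within the Akka Streams DSL by concatenating a source that never finishes, such as Source.maybe. This is only a sketch reusing the Metadata table and "postgres" config from my snippet above; note that while the stream stays open, it still won't surface events written after the initial query ran:

```scala
// Sketch only: keep the stream open by concatenating Source.maybe,
// which never emits and never completes on its own.
// Assumes the same Metadata table and "postgres" config as above.
import akka.NotUsed
import akka.stream.scaladsl.Source
import slick.driver.PostgresDriver.api._

def allPersistenceIds: Source[String, NotUsed] = {
  val db = Database.forConfig("postgres")
  val metadata = TableQuery[Metadata]
  val query = for (m <- metadata) yield m.persistenceId

  Source
    .fromPublisher(db.stream(query.result))
    .concat(Source.maybe) // stays open after the DB action completes,
                          // but never emits anything new by itself
}
```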
My questions are:
- Is there a way to do this purely using the Akka Streams DSL? That is, can I set up a flow that cannot be closed?
- I have done some exploring of how the LevelDB read journal works, and it seems to handle new events by having the read journal subscribe to the write journal. This seems reasonable, but I must ask: in general, is there a recommended approach for dealing with this requirement?
- The other approach I have considered is polling (e.g. periodically having my read journal query the DB and check for new events/ids). Could someone with more experience than I offer some advice?
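To make the polling idea concrete, here is a rough sketch of what I imagine it would look like: re-run the query on a timer with Source.tick and deduplicate across polls. The interval and the dedup-by-set approach are my own assumptions, not anything from the Akka docs:

```scala
// Hypothetical polling sketch: re-run the Slick query once per second and
// emit only persistence ids that have not been seen before.
// Assumes the same Metadata table and "postgres" config as above.
import scala.concurrent.duration._
import akka.NotUsed
import akka.stream.scaladsl.Source
import slick.driver.PostgresDriver.api._

def allPersistenceIds: Source[String, NotUsed] = {
  val db = Database.forConfig("postgres")
  val metadata = TableQuery[Metadata]
  val query = for (m <- metadata) yield m.persistenceId

  Source
    .tick(0.seconds, 1.second, ())                        // poll on a timer
    .mapAsync(parallelism = 1)(_ => db.run(query.result)) // re-run the query
    .statefulMapConcat { () =>                            // dedup across polls
      var seen = Set.empty[String]
      ids => {
        val fresh = ids.filterNot(seen)
        seen ++= fresh
        fresh.toList
      }
    }
    .mapMaterializedValue(_ => NotUsed)                   // hide the Cancellable
}
```

The obvious trade-offs are polling load versus latency, and the unbounded `seen` set; a real implementation would presumably track a high-water-mark offset column instead of keeping every id in memory.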
Thanks!