The cleanest solution is to use side effects off the main stream to maintain a cache of unfinished events, then merge that cache into the stream for each new subscriber.
from rx import Observable, Observer
from rx.subjects import Subject

class EventObserver(Observer):
    def __init__(self):
        super().__init__()
        self.cached_events = set()
        self.mirror = Subject()  # re-emits all values

    def on_next(self, value):
        self.mirror.on_next(value)  # stream to late observers
        if value[1] == 'stop':
            self.cached_events.discard(value[0])  # no-op if already gone
        else:
            self.cached_events.add(value[0])

    def on_error(self, e):
        self.mirror.on_error(e)  # + other error logic

    def on_completed(self):
        self.mirror.on_completed()  # + other completion logic

    def late_subscribe(self, subscriber):
        return Observable.merge(
            Observable.from_(list(self.cached_events)),
            self.mirror
        ).subscribe(subscriber)
Used as follows:
event_observer = EventObserver()
events.subscribe(event_observer)  # events is your source observable
# late subscription:
event_observer.late_subscribe(...)
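To check the caching logic without pulling in RxPY, here's a stdlib-only sketch of the same idea. `EventCache`, its callback-based subscribers, and the reconstructed `('key', 'start')` tuples are illustrative assumptions, not part of the answer's API:

```python
class EventCache:
    """Plain-Python sketch: cache unfinished event keys and replay
    them to late subscribers before streaming live values."""

    def __init__(self):
        self.cached = set()    # keys of events that haven't stopped
        self.subscribers = []  # live downstream callbacks

    def on_next(self, value):
        key, kind = value
        for cb in self.subscribers:   # mirror to late observers
            cb(value)
        if kind == 'stop':
            self.cached.discard(key)  # event finished; forget it
        else:
            self.cached.add(key)      # event in flight; remember it

    def late_subscribe(self, cb):
        # Replay unfinished events (reconstructed as 'start' tuples),
        # then stream new values live.
        for key in sorted(self.cached):
            cb((key, 'start'))
        self.subscribers.append(cb)


cache = EventCache()
cache.on_next(('a', 'start'))
cache.on_next(('b', 'start'))
cache.on_next(('a', 'stop'))

seen = []
cache.late_subscribe(seen.append)  # immediately receives ('b', 'start')
cache.on_next(('c', 'start'))      # then live values follow
```

A late subscriber sees only the still-unfinished `'b'` from the cache, then `'c'` as it arrives, which is exactly the merge that `late_subscribe` performs above.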
The rest of the answer explains why you'll probably prefer this over a reactive approach.
Reactive approach:
Here's the simplest solution I could think of, if you don't mind your late subscribers waiting until the next event. As you can see, it's not the prettiest.
pub_events = events.publish()  # in case your events aren't hot
replay_events = pub_events.replay()
replay_events.connect()  # both are connectables; connect when ready
pub_events.connect()

# late subscription:
replay_events.window(events.take(1)) \
    .scan(lambda is_first, o:
              o.reduce(lambda D, x: D.update({x[0]: x[1] == 'stop'}) or D, {})
               .flat_map(lambda D: Observable.from_([k for k, v in D.items() if not v]))
          if is_first is True else o,
          True) \
    .flat_map(lambda o: o)
The goal is to start the late subscription with a filtered list of unfinished events, built from a cache of all prior events. The biggest barrier is that ReplaySubject does not differentiate these cached events from new ones. The first step above tackles this by windowing on the next event, expecting the replayed stream to emit the cached events before then. Since your requirement sounds like an optimization rather than a correctness issue, the race condition here may not be a big deal.

There are at most two windows: one of the cached events, and one of the new events (if there are any), so scan exploits Python's dynamic typing a bit to check which window we're in: the seed True only survives until the first window is processed. If it's the cached-events window, we build a dictionary of event key → whether that event is "stopped". The last step is to inject the unstopped keys back into the stream with flat_map.
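Both the fold inside the first window and the scan seed trick can be checked with plain Python; the sample event tuples below are made up for illustration:

```python
from functools import reduce

# 1. The fold inside the first window: event key -> "has it stopped?",
#    then keep only the keys that haven't.
cached = [('a', 'start'), ('b', 'start'), ('a', 'stop'), ('c', 'start')]
stopped = reduce(lambda D, x: D.update({x[0]: x[1] == 'stop'}) or D, cached, {})
unfinished = sorted(k for k, v in stopped.items() if not v)

# 2. The scan seed trick: the accumulator starts as True, is replaced by
#    the transformed first window, and every later window passes through
#    untouched because the accumulator is no longer True.
def step(is_first, window):
    return ('transformed', window) if is_first is True else window

acc, out = True, []
for w in ['cached-window', 'live-window']:
    acc = step(acc, w)
    out.append(acc)
```

The `D.update(...) or D` idiom works because `dict.update` returns `None`, so the expression always evaluates to the (mutated) accumulator dictionary.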