There are probably many ways to do this. One possibility is to use PipedInputStream and PipedOutputStream.
The way this works is that you link an output stream to an input stream, so that everything you write to the output stream can be read from the linked input stream, effectively creating a pipe between the two of them.
PipedInputStream in = new PipedInputStream();
PipedOutputStream out = new PipedOutputStream(in);
There is one caveat, though: according to the documentation of piped streams, the writing and the reading must happen on separate threads, otherwise we may cause a deadlock.
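Just to illustrate that caveat, here is a minimal standalone sketch (not part of the actual scenario, the data written is made up) where a second thread does the writing and the current thread does the reading:

PipedInputStream in = new PipedInputStream();
PipedOutputStream out = new PipedOutputStream(in);

//write on a separate thread to avoid deadlocking the reader
new Thread(() -> {
    try {
        out.write("hello".getBytes());
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        try { out.close(); } catch (IOException ignored) { }
    }
}).start();

//read on the current thread; read() returns -1 once out is closed
int b;
while ((b = in.read()) != -1) {
    System.out.print((char) b);
}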
So, coming back to our reactive stream scenario, we can create a pipe (as mentioned above), subscribe to the Flux object, and write every piece of data we get from it to the piped output stream. Whatever we write there will be available for reading at the other side of the pipe, in the corresponding input stream. That input stream is the one you can share with your non-reactive method.
We just have to be extra careful to subscribe to the Flux on a separate thread, e.g. with subscribeOn(Schedulers.elastic()).
Here's a very basic implementation of such a subscriber:
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.util.Objects;

import org.reactivestreams.Subscription;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import reactor.core.publisher.BaseSubscriber;

class PipedStreamSubscriber extends BaseSubscriber<byte[]> {

    private final Logger logger = LoggerFactory.getLogger(this.getClass());

    private final PipedInputStream in;
    private PipedOutputStream out;

    PipedStreamSubscriber(PipedInputStream in) {
        Objects.requireNonNull(in, "The input stream must not be null");
        this.in = in;
    }

    @Override
    protected void hookOnSubscribe(Subscription subscription) {
        //change if you want to control back-pressure
        super.hookOnSubscribe(subscription);
        try {
            //connect the output stream to the input stream we were given
            this.out = new PipedOutputStream(in);
        } catch (IOException e) {
            //TODO throw a contextual exception here
            throw new RuntimeException(e);
        }
    }

    @Override
    protected void hookOnNext(byte[] payload) {
        try {
            //every chunk emitted by the Flux is written into the pipe
            out.write(payload);
        } catch (IOException e) {
            //TODO throw a contextual exception here
            throw new RuntimeException(e);
        }
    }

    @Override
    protected void hookOnComplete() {
        close();
    }

    @Override
    protected void hookOnError(Throwable error) {
        //TODO handle the error or at least log it
        logger.error("Failure processing stream", error);
        close();
    }

    @Override
    protected void hookOnCancel() {
        close();
    }

    private void close() {
        try {
            if (out != null) {
                //closing the output stream signals end-of-stream to the reader
                out.close();
            }
        } catch (IOException e) {
            //probably just ignore this one or simply log it
        }
    }
}
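As the comment in hookOnSubscribe hints, you could also bound the demand instead of requesting everything up front. This is just a rough sketch of what those two hooks might look like if you requested one chunk at a time (not something the scenario above requires):

@Override
protected void hookOnSubscribe(Subscription subscription) {
    try {
        this.out = new PipedOutputStream(in);
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
    //request a single chunk instead of an unbounded amount
    request(1);
}

@Override
protected void hookOnNext(byte[] payload) {
    try {
        out.write(payload);
        //ask for the next chunk only after this one has been written
        request(1);
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}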
And using this subscriber I could define a very simple utility method that turned a Flux<byte[]> into an InputStream, somewhat as follows:
static InputStream createInputStream(Flux<byte[]> flux) {
    PipedInputStream in = new PipedInputStream();
    flux.subscribeOn(Schedulers.elastic())
        .subscribe(new PipedStreamSubscriber(in));
    return in;
}
Notice that I was extra careful to close the output stream when the flow is done, when an error occurs, or when the subscription is cancelled; otherwise we run the risk of blocking on the read side, waiting for more input to arrive. Closing the output stream is what signals the end of the input stream at the other side of the pipe.
And now that InputStream can be consumed just like any regular stream, and therefore you can pass it around to your non-reactive method, e.g.
Flux<byte[]> jedi = Flux.just("Luke\n", "Obi-Wan\n", "Yoda\n").map(String::getBytes);

try (InputStream in = createInputStream(jedi)) {
    byte[] data = new byte[5];
    int size = 0;
    while ((size = in.read(data)) > 0) {
        System.out.printf("%s", new String(data, 0, size));
    }
}
The code above yields:
Luke
Obi-Wan
Yoda