I'm writing a client using AsyncHttpClient (AHC) v2.0beta (with Netty 4 as the provider) that streams audio in real time and needs to receive server data in real time too (while still streaming). Imagine an HTTP client streaming the microphone's output as the user speaks, while receiving the audio transcription as it happens. In short, it's bidirectional real-time communication over HTTP (chunked multipart request/response).
In order to do that, I had to hack AHC a bit. For instance, there is a blocking call waiting for input data in org.asynchttpclient.multipart.MultipartBody#read(ByteBuffer buffer), which is implemented on top of Netty's io.netty.handler.stream.ChunkedInput.
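For context, the blocking hack looks roughly like the following: the body's read() parks on a queue that the audio-capture thread feeds. This is a minimal stand-alone sketch, not the actual AHC code; the class and method names below (StreamingBody, feed, close) are illustrative, and the real MultipartBody carries multipart framing state that is omitted here.

```java
import java.nio.ByteBuffer;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical sketch of the blocking-read pattern: read() blocks until
// the producer (microphone thread) has offered a chunk.
class StreamingBody {
    private final BlockingQueue<byte[]> chunks = new LinkedBlockingQueue<>();
    private static final byte[] EOF = new byte[0];   // sentinel marking end of body

    // Producer side: capture thread hands over a chunk of encoded audio.
    void feed(byte[] chunk) throws InterruptedException { chunks.put(chunk); }

    // Producer side: signal that no more audio will arrive.
    void close() throws InterruptedException { chunks.put(EOF); }

    // Consumer side: called by the transfer machinery. Blocks the calling
    // thread until data is available; returns bytes written, or -1 at EOF.
    long read(ByteBuffer buffer) throws InterruptedException {
        byte[] chunk = chunks.take();                // blocks here
        if (chunk == EOF) return -1;
        buffer.put(chunk);
        return chunk.length;
    }
}
```

The key point is that take() blocks whatever thread invokes read() — which matters if that thread is the channel's I/O thread.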
This somewhat works. The problem is that my custom AsyncHandler does not get onBodyPartReceived() callbacks until the request has finished streaming. The receive events pile up, probably because Netty isn't reading while there is still content to write. Experimenting with the network stack, I noticed I could only receive server responses mid-stream when the client hit network contention while writing.
Can someone tell me whether this behavior is the result of my particular implementation (blocking in MultipartBody#read()) or an architectural constraint imposed by Netty's internal design?
As a side note, reading and writing happen on a single I/O thread, nioEventLoopGroup-X.
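That single-thread observation may be the crux: if the blocking read() runs on the same event-loop thread that would dispatch onBodyPartReceived(), the loop cannot process inbound data while it is parked, so inbound events queue up until the write side finishes. The constraint can be demonstrated without Netty at all, using a single-threaded executor as a stand-in for the event loop (all names below are illustrative):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Demonstrates why blocking inside a single-threaded event loop starves
// other tasks: the simulated inbound dispatch cannot run until the
// blocking "write" task releases the thread.
class SingleLoopStarvation {
    static String run() throws Exception {
        ExecutorService loop = Executors.newSingleThreadExecutor();
        CountDownLatch unblock = new CountDownLatch(1);
        StringBuilder order = new StringBuilder();

        // Task 1: simulates MultipartBody#read() blocking for audio data.
        loop.submit(() -> {
            try { unblock.await(); } catch (InterruptedException ignored) {}
            order.append("write-done;");
        });
        // Task 2: simulates dispatching onBodyPartReceived() for inbound data.
        loop.submit(() -> order.append("read-dispatched;"));

        unblock.countDown();                 // let the blocking task finish
        loop.shutdown();
        loop.awaitTermination(5, TimeUnit.SECONDS);
        return order.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(run());           // write-done;read-dispatched;
    }
}
```

Even though the "inbound" task was ready the whole time, it only runs after the blocking task completes, which matches the pile-up behavior described above.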