Given a raw audio file on an Android device, my task is as follows:
Stream the file (with a latency of up to 3 seconds) from the Android device to a Node.js server; the server then streams it on to another client (a browser), which plays it "live" through an audio component. It is important that each stream from the Android device also carries a timestamp and an id number, since multiple files may be sent at different times.
In addition, all data streamed through the server will be stored in MongoDB along with that timestamp and id, so that the browser can play it back at a later time.
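To make the metadata requirement concrete, here is a minimal sketch of the per-chunk document the Node.js server might store in MongoDB. The field names (`streamId`, `seq`, `payload`) and the base64 encoding are illustrative assumptions, not a fixed schema:

```javascript
// Sketch: build the document stored in MongoDB for one uploaded chunk.
// Assumes each upload carries the stream id, the capture timestamp from
// the Android client, and a sequence number so chunks can be reassembled
// in order on playback. Function and field names are hypothetical.
function makeChunkDoc(streamId, timestamp, seq, chunk) {
  if (!Number.isInteger(seq) || seq < 0) {
    throw new Error('bad sequence number');
  }
  return {
    streamId,                          // id of the recording session
    timestamp,                         // capture time sent by the client
    seq,                               // ordering within the stream
    payload: chunk.toString('base64'), // raw bytes, base64-encoded
    size: chunk.length,                // original byte length
  };
}
```

A later playback request can then query by `streamId`, sort by `seq`, and decode each `payload` back to bytes.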
There are several things to consider here.
Since the original audio file is in raw format, it must be converted into a playable format somewhere along the way to the browser. What I have working so far: the Android app (using Retrofit) sends chunks of the audio file as byte arrays to the server, and the server converts each chunk into a WAV file, which I can then technically stream, file by file, to the browser. But is this appropriate?
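For reference, the raw-to-WAV step the server performs amounts to prefixing the PCM bytes with a 44-byte RIFF header. A minimal sketch, assuming 16-bit little-endian mono PCM and a known sample rate (the function name and defaults are illustrative):

```javascript
// Sketch: wrap a raw PCM chunk in a standard 44-byte WAV (RIFF) header
// so a browser audio element can decode it. Assumes 16-bit LE PCM;
// sampleRate/channels/bitDepth are assumptions to be matched to the
// Android recorder's actual settings.
function wavFromPcm(pcm, sampleRate = 16000, channels = 1, bitDepth = 16) {
  const byteRate = sampleRate * channels * (bitDepth / 8);
  const blockAlign = channels * (bitDepth / 8);
  const header = Buffer.alloc(44);
  header.write('RIFF', 0);                  // RIFF chunk id
  header.writeUInt32LE(36 + pcm.length, 4); // total size minus 8 bytes
  header.write('WAVE', 8);
  header.write('fmt ', 12);                 // format subchunk
  header.writeUInt32LE(16, 16);             // fmt subchunk size (PCM)
  header.writeUInt16LE(1, 20);              // audio format: 1 = PCM
  header.writeUInt16LE(channels, 22);
  header.writeUInt32LE(sampleRate, 24);
  header.writeUInt32LE(byteRate, 28);
  header.writeUInt16LE(blockAlign, 32);
  header.writeUInt16LE(bitDepth, 34);
  header.write('data', 36);                 // data subchunk
  header.writeUInt32LE(pcm.length, 40);     // raw PCM byte count
  return Buffer.concat([header, pcm]);
}
```

Note that producing one standalone WAV file per chunk means each chunk carries its own header, which complicates gapless "live" playback on the browser side; that is part of why I am unsure this per-file approach is the right one.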
What is the correct protocol to achieve what I want here?