
Given a raw audio file on an Android device, my task is as follows:

Stream the file (with a latency of up to 3 seconds) from the Android device to a NodeJS server; the server will then stream it to another client (a browser), which can play it "live" through an audio component. It is important that a timestamp and an ID number are received with each stream from the Android device (assuming multiple files at different times).
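To make that requirement concrete, here is a rough sketch of the kind of per-chunk envelope I have in mind, assuming a message-based transport such as a WebSocket; the field names are my own placeholders, not an established format:

```typescript
// Hypothetical envelope for each audio chunk sent from the Android
// device. All field names here are placeholders, not a standard.
interface AudioChunkMessage {
  streamId: string;   // ID of the recording this chunk belongs to
  timestamp: number;  // capture time, e.g. milliseconds since epoch
  seq: number;        // sequence number so the server can reorder chunks
  pcm: string;        // base64-encoded raw PCM bytes for this chunk
}
```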

In addition, all data streamed through the server will be stored in MongoDB with that timestamp and ID, so that it can be played back later through the browser.
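For the storage side, this is roughly how I picture persisting each chunk in Node.js with the official mongodb driver; the connection string, database, collection, and field names are placeholders:

```typescript
import { MongoClient, Binary } from 'mongodb';

// Placeholder connection string for illustration only.
const client = new MongoClient('mongodb://localhost:27017');

// Store one document per incoming chunk; playback later queries by
// streamId and sorts by seq to reassemble the audio in order.
async function saveChunk(streamId: string, timestamp: number,
                         seq: number, pcm: Buffer): Promise<void> {
  await client.connect(); // no-op if already connected (driver v4+)
  await client.db('audio').collection('chunks').insertOne({
    streamId,
    timestamp,
    seq,
    data: new Binary(pcm),
  });
}
```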

There are several things to consider here.

As the original audio file is in raw format, it must be converted into a playable format somewhere along the way to the browser. What I currently have working is the Android app (using Retrofit) sending chunks of the audio file as byte arrays to the server. The server then converts each chunk into a WAV file, which I can then technically stream to the browser one file at a time. But is this appropriate?
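For comparison, here is a sketch of the alternative I am considering: instead of writing per-chunk WAV files, the server relays the raw bytes to listening browsers over a WebSocket (using the ws package) and leaves decoding to the client, e.g. via the Web Audio API. The port and function name are placeholders:

```typescript
import { WebSocketServer, WebSocket } from 'ws';

// Browsers connect here to receive the live stream.
const wss = new WebSocketServer({ port: 8080 });

// Called for every chunk the Android device uploads; broadcasts the
// raw bytes to every browser that is currently connected.
function broadcastChunk(pcm: Buffer): void {
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) {
      client.send(pcm);
    }
  }
}
```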

What is the correct protocol to achieve what I want?

  • This all looks very fine to me. Why are you in doubt? – blackapps Jan 10 '21 at 09:41
  • Firstly, from further research it appears Retrofit is not intended for continuously sending large amounts of data, so I am not sure what is proper for my needs. Secondly, is it possible to stream continuous tiny audio files to the client over the same socket so that on the receiving end they arrive as one continuous audio stream? Or alternatively, can you stream a file from the server while it is still being created? That would assume I concatenate all the incoming chunks of data on the server into one massive file as I stream it to the browser. – A S Jan 10 '21 at 18:25
