I need to send some video data from a drone's camera to a server and then on to a client web page. I am working on an Android app that interacts with the drone - it successfully gets the drone's camera output and renders it. I already have a Java server (using Java Lightweight HTTP Server) working with the client web page that could be used for this purpose. I have come up with two possible methods:
1. H.264 and Wowza
I have a callback method available from the drone's camera that delivers H.264-encoded byte arrays. These should be streamable via RTP; however, I haven't found any Android/Java libraries that support streaming from the source I have available (libstreaming only supports streaming from the device camera, as far as I know). If it is possible to stream the incoming H.264 byte arrays to Wowza, I could then work out how to access Wowza from the client and render the stream. This method would not need the existing server.
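To make the question concrete, here is a minimal sketch of what sending those byte arrays over RTP could look like, using only standard Java sockets: a 12-byte RTP header (RFC 3550) followed by one NAL unit per packet (RFC 6184 single-NAL-unit mode). All class and method names here (`RtpH264Sender`, `sendNalUnit`, the SSRC value) are my own illustrative placeholders, not part of any existing library:

```java
import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Sketch of an RTP sender for H.264 NAL units (RFC 3550 header,
// RFC 6184 single-NAL-unit payload mode).
public class RtpH264Sender {
    private static final int PAYLOAD_TYPE = 96;  // dynamic payload type, matched in the SDP
    private static final int SSRC = 0x12345678;  // placeholder stream identifier

    private final DatagramSocket socket;
    private final InetAddress host;
    private final int port;
    private int sequenceNumber = 0;

    public RtpH264Sender(String hostname, int port) throws IOException {
        this.socket = new DatagramSocket();
        this.host = InetAddress.getByName(hostname);
        this.port = port;
    }

    // Builds a 12-byte RTP header followed by the raw NAL unit bytes.
    byte[] buildPacket(byte[] nalUnit, long timestamp, boolean marker) {
        byte[] packet = new byte[12 + nalUnit.length];
        packet[0] = (byte) 0x80;  // version 2, no padding, no extension, no CSRCs
        packet[1] = (byte) ((marker ? 0x80 : 0x00) | PAYLOAD_TYPE);
        packet[2] = (byte) (sequenceNumber >> 8);
        packet[3] = (byte) sequenceNumber;
        packet[4] = (byte) (timestamp >> 24);
        packet[5] = (byte) (timestamp >> 16);
        packet[6] = (byte) (timestamp >> 8);
        packet[7] = (byte) timestamp;
        packet[8] = (byte) (SSRC >> 24);
        packet[9] = (byte) (SSRC >> 16);
        packet[10] = (byte) (SSRC >> 8);
        packet[11] = (byte) SSRC;
        System.arraycopy(nalUnit, 0, packet, 12, nalUnit.length);
        sequenceNumber = (sequenceNumber + 1) & 0xFFFF;
        return packet;
    }

    // Called from the drone camera callback with one NAL unit
    // (start code stripped) and a 90 kHz media timestamp.
    public void sendNalUnit(byte[] nalUnit, long timestamp90kHz) throws IOException {
        byte[] packet = buildPacket(nalUnit, timestamp90kHz, true);
        socket.send(new DatagramPacket(packet, packet.length, host, port));
    }
}
```

A real sender would also need FU-A fragmentation for NAL units larger than the MTU, SPS/PPS handling, and a matching SDP announcement on the Wowza side, which is exactly the part I'm unsure how to set up.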
2. Sampling Bitmaps - The camera output is rendered onto a TextureView in the Android app. This can be sampled to produce Bitmap images, which can be Base64-encoded and sent to the server. The server then passes the latest Bitmap on to the client, and the client renders it from the encoded Base64:
document.getElementById('drone_feed')
.setAttribute(
'src', 'data:image/png;base64,' + cameraBase64String
);
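For reference, the encoding step feeding that snippet is plain Base64 wrapping of the compressed frame bytes. Assuming the Android side has already produced PNG bytes (via `TextureView.getBitmap()` and `Bitmap.compress()`), building the data URI looks roughly like this; I'm using `java.util.Base64` so the sketch runs on a plain JVM, whereas on Android `android.util.Base64` is the usual equivalent, and `FrameEncoder` is just an illustrative name:

```java
import java.util.Base64;

public class FrameEncoder {
    // Wraps already-compressed PNG bytes in a data URI that the
    // browser can assign straight to an <img> src attribute.
    public static String toDataUri(byte[] pngBytes) {
        return "data:image/png;base64," + Base64.getEncoder().encodeToString(pngBytes);
    }
}
```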
I have successfully implemented the second method: the TextureView is sampled every 100 ms and the client fetches the current encoded image from the server every 100 ms. Unsurprisingly, this approach suffers from high latency and a low frame rate, but it does work.
So I have two questions:
1. Is the first method (H.264 and Wowza) possible, and if so, how do I go about streaming the incoming H.264 video buffers to Wowza?
2. If the first method is not possible, can I make the second method more efficient, or is there an alternative method I have not considered?