I have a webserver (Node) that accesses Sony cameras via their API. The webserver is currently set up on the same Wi-Fi network as the cameras, but will later run on its own external server. The cameras have been paired to my Wi-Fi router via WPS.
I can access both cameras at the same time and send requests to take photos, zoom, etc. However, my questions are:
What would be the best way to "stream" the liveview from the Sony camera to my webpage? I've tried both socket.io and watching files, with similar results. I need the best possible performance here, since the cameras won't be on the same network later on. Right now I save the image buffer to a file on the server, watch that file for changes, and emit an event to the webpage to reload the image.
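For reference, here is a minimal sketch of the socket.io variant I experimented with (function and event names are made up for illustration). Instead of parsing Sony's payload headers, it just scans the incoming liveview byte stream for JPEG SOI/EOI markers and hands back complete frames, which can then be emitted as binary over the socket rather than written to disk:

```javascript
// Hypothetical helper: split a growing liveview byte stream into JPEG frames.
// Scans for JPEG SOI (FF D8) / EOI (FF D9) markers; good enough for plain
// liveview JPEGs, though an EOI inside an embedded thumbnail would confuse it.
const SOI = Buffer.from([0xff, 0xd8]);
const EOI = Buffer.from([0xff, 0xd9]);

function extractJpegFrames(buf) {
  const frames = [];
  let offset = 0;
  for (;;) {
    const start = buf.indexOf(SOI, offset);
    if (start === -1) break;
    const end = buf.indexOf(EOI, start + SOI.length);
    if (end === -1) break; // last frame not complete yet, keep it in `rest`
    frames.push(buf.subarray(start, end + EOI.length));
    offset = end + EOI.length;
  }
  return { frames, rest: buf.subarray(offset) };
}

// Usage idea: accumulate chunks from the camera's liveview HTTP response and
// push each complete frame straight to the browser, skipping the file write:
//
//   pending = Buffer.concat([pending, chunk]);
//   const { frames, rest } = extractJpegFrames(pending);
//   pending = rest;
//   for (const f of frames) io.emit('liveview-frame', f); // io: socket.io server
```

On the browser side the frame can be turned into a blob URL and assigned to an `img` element.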
When I start the liveview from just one camera, it works pretty well (it stalls sometimes, but mostly comes back). However, when I have two liveviews running on the same page, the image updates pretty much stop right away; it is super laggy. Any ideas why?
Thanks!