
I have a webserver (node) that accesses Sony cameras via their API. The webserver is currently set up on the same wifi network as the cameras, but will later be moved to its own external server. The cameras have been paired to my wifi router via WPS.

I can access both cameras at the same time and send requests to take photos, zoom, etc. However, I have two questions:

  • What would be the best way to "stream" the liveview from the Sony camera to my webpage? I've tried both socket.io and watching files, with similar results. I need the best possible performance here, since the cameras will not be on the same network later on. Right now I'm saving the image buffer to a file on the server, watching that file for changes, and emitting an event to the webpage to reload the image.

  • When I start the liveview from just one camera, it works pretty well (it stops sometimes, but comes back most of the time). However, when I have two liveviews running on the same page, the image updates stall almost right away and everything is super laggy. Any ideas why?
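The file-watching pipeline described above adds disk I/O and a filesystem event per frame; a common alternative is to split the liveview byte stream into JPEG frames in memory and push each frame buffer straight to the page (e.g. over the socket.io connection already in use). A minimal sketch of the frame-splitting part, under the simplifying assumption that frames can be located by their JPEG SOI/EOI markers (Sony's liveview stream actually wraps each frame in its own packet headers, which a robust parser should read instead; all names here are illustrative):

```javascript
// Sketch: pull complete JPEG frames out of an accumulating liveview
// byte stream by scanning for the JPEG start-of-image (FF D8) and
// end-of-image (FF D9) markers. Marker scanning is naive (those byte
// pairs can occur inside compressed data), but it shows the idea of
// framing in memory instead of round-tripping through a file.
const SOI = Buffer.from([0xff, 0xd8]);
const EOI = Buffer.from([0xff, 0xd9]);

function extractFrames(buffer) {
  const frames = [];
  let start = buffer.indexOf(SOI);
  while (start !== -1) {
    const end = buffer.indexOf(EOI, start + 2);
    if (end === -1) break; // frame not complete yet; wait for more bytes
    frames.push(buffer.slice(start, end + 2));
    start = buffer.indexOf(SOI, end + 2);
  }
  // Return the complete frames plus the unconsumed tail, which the
  // caller prepends to the next network chunk.
  const rest = start === -1 ? Buffer.alloc(0) : buffer.slice(start);
  return { frames, rest };
}

module.exports = { extractFrames };
```

Each extracted frame buffer could then be emitted as a binary socket.io event and rendered on the page as a blob URL, skipping the disk entirely.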

Thanks!

nickelman
  • Sorry, I am just seeing this. I am actually wondering how you are able to make this work. Officially, the camera API cameras produce their own server that you have to connect to in order to send commands. How were you able to bypass this limitation? What type of cameras are you using? – pg316 Jan 31 '18 at 23:51
  • @Robert-Sony I'm using ILCE-QX1 cameras; via WPS I paired 4 cameras to my router and was then able to communicate with them all over the same network/access point :) – nickelman Mar 01 '18 at 20:11

0 Answers