
I have a project in which I need to stream webcam frames from a website (the client's browser) to a cloud server and get analytics results back.

The server will receive the frames live, process them with a lightweight algorithm, and return data (not an image/video).

In addition, the server needs to keep the frames and assemble all of them into a full mp4/webm video file, which will be processed with a much more complicated algorithm at the end of the live recording.
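For context, at the end of the session I imagine stitching the saved frames into a video by shelling out to ffmpeg on the server. This is only a sketch of what I have in mind: the helper name, the frame filename pattern, and the codec flags are my own assumptions, and it assumes ffmpeg is installed and each frame was saved as a numbered JPEG.

```javascript
// Hypothetical helper: build the ffmpeg argument list that turns a directory
// of numbered JPEG frames into an mp4 (x264, yuv420p for broad player support).
function buildFfmpegArgs(fps, framePattern, outputPath) {
  return [
    "-y",                         // overwrite output if it exists
    "-framerate", String(fps),    // playback rate of the input image sequence
    "-i", framePattern,           // e.g. "frames/%06d.jpg"
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p",
    outputPath,
  ];
}

// Usage on the Node server (commented out, since it actually runs ffmpeg):
// const { spawnSync } = require("child_process");
// spawnSync("ffmpeg", buildFfmpegArgs(15, "frames/%06d.jpg", "session.mp4"));
```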

What is the best approach for sending those frames and getting back the analytics results?

Is WebSocket/Socket.IO my best option, or is there a better solution?
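What I have in mind for the WebSocket route is roughly the following. The 12-byte header layout (sequence number + capture timestamp) is my own choice, so the server can reorder frames and timestamp them for the later video assembly; the function names are placeholders.

```javascript
// Prefix each JPEG frame with a small binary header:
//   bytes 0-3: uint32 frame sequence number
//   bytes 4-11: float64 capture time in ms
function packFrame(seq, timestampMs, jpegBytes) {
  const header = new ArrayBuffer(12);
  const view = new DataView(header);
  view.setUint32(0, seq);
  view.setFloat64(4, timestampMs);
  const out = new Uint8Array(12 + jpegBytes.length);
  out.set(new Uint8Array(header), 0);
  out.set(jpegBytes, 12);
  return out;
}

// Server side: split the header back off the payload.
function unpackFrame(buf) {
  const view = new DataView(buf.buffer, buf.byteOffset, 12);
  return {
    seq: view.getUint32(0),
    timestampMs: view.getFloat64(4),
    jpeg: buf.subarray(12),
  };
}

// In the browser, frames would be captured from the webcam <video> element
// via a canvas and sent over an open WebSocket (sketch, commented out):
//   canvas.getContext("2d").drawImage(videoEl, 0, 0);
//   canvas.toBlob(async (blob) => {
//     const jpeg = new Uint8Array(await blob.arrayBuffer());
//     ws.send(packFrame(seq++, performance.now(), jpeg));
//   }, "image/jpeg", 0.7);
```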

I looked at gRPC and WebRTC but didn't see anything that would help me with this; if there is a suitable solution or example for something close to this, I wasn't able to find it.

gRPC – is not supported in the browser.

gRPC-Web – does not support client-side streaming or bidirectional streaming.

WebRTC – I read that it is meant for client-to-client (peer-to-peer) rather than client-to-server. If client-to-server (and back) is possible, I would be glad to get an example or a guide that would help me.

1 Answer


I believe you can use WebRTC where one peer is the client and the other peer is your server. For this to work, you'd need an implementation of WebRTC on your server -- your server being a 'media server' here.

I found a guide that explains the approach well and lists some open source options: https://webrtc.ventures/2017/11/a-guide-to-webrtc-media-servers-open-source-options/. If your server is written in Node, you could use mediasoup, which can be imported as a Node library.
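Very roughly, the server side with mediasoup would start by creating a worker and a router that accepts video from browsers. This is only a sketch under my assumptions (mediasoup installed from npm; VP8 chosen because browser WebRTC stacks universally support it); the actual transport/producer wiring is more involved, so it is shown commented out.

```javascript
// Assumes: npm install mediasoup (Node server only).
// const mediasoup = require("mediasoup");

// Codec capability the router will negotiate with connecting browsers.
const mediaCodecs = [
  { kind: "video", mimeType: "video/VP8", clockRate: 90000 },
];

// Startup sketch (commented out, requires mediasoup to be installed):
// async function startMediaServer() {
//   const worker = await mediasoup.createWorker();
//   const router = await worker.createRouter({ mediaCodecs });
//   // ...then create a WebRtcTransport per connecting browser and consume
//   // its video producer, feeding frames to your analytics + recorder.
// }
```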

Please take what I've written with a grain of salt! I've only done light research in this problem domain.

Ben Brook