I'm developing a website for a one-on-one WebRTC video call on mobile browsers. I also support capturing photos from the local stream. For the photo capture I use the ImageCapture API where available, and draw the video frame to a canvas elsewhere.
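A minimal sketch of that capture path, where `localStream` and `videoElement` are placeholders for my actual objects:

```js
async function capturePhoto(localStream, videoElement) {
  const [track] = localStream.getVideoTracks();

  if ('ImageCapture' in window) {
    // ImageCapture path, where the browser supports it.
    const imageCapture = new ImageCapture(track);
    return imageCapture.takePhoto(); // resolves with a Blob
  }

  // Fallback: draw the current video frame to a canvas.
  const canvas = document.createElement('canvas');
  canvas.width = videoElement.videoWidth;
  canvas.height = videoElement.videoHeight;
  canvas.getContext('2d').drawImage(videoElement, 0, 0);
  return new Promise((resolve) => canvas.toBlob(resolve, 'image/jpeg'));
}
```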
The problem is that I want the call video to be HD (1280x720) to reduce the bandwidth needed for the call, while I need to capture the photos at Full HD (1920x1080).
What I do now is apply constraints just before capturing the photo to switch the resolution to Full HD, and switch it back to HD right after the capture. The problem is that sometimes, because of the resolution change, the photos come out unfocused, since the camera has to refocus after the switch.
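Roughly what I do today (simplified), using the `capturePhoto` helper sketched above:

```js
async function captureAtFullHd(localStream, videoElement) {
  const [track] = localStream.getVideoTracks();

  // Bump the track to Full HD for the photo, then drop it back to HD.
  await track.applyConstraints({ width: { ideal: 1920 }, height: { ideal: 1080 } });
  const photo = await capturePhoto(localStream, videoElement);
  await track.applyConstraints({ width: { ideal: 1280 }, height: { ideal: 720 } });

  return photo; // sometimes blurry, because the camera refocuses after the switch
}
```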
So if I want higher-resolution photos, I see two options, of which I have already tried the first one:
- I tried having two separate MediaStreamTracks: one at 1280x720 that I sent through the WebRTC connection, and one at 1920x1080 that I used to display the video locally and capture the photos. This worked well on most phones, but on some phones (specifically the iPhone 6s) one track had video while the second track didn't work at all and displayed nothing.
- If possible, I would rather use a single MediaStreamTrack at 1920x1080 and limit the video size within the connection itself, so that only the lower resolution is sent over the WebRTC connection (see the sketch after this list for what I have in mind).
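This is what I imagine for the second option: capture one 1920x1080 track, display and photograph it locally, and ask the RTCRtpSender to downscale what it encodes. It is only a sketch; I have not verified that `scaleResolutionDownBy` is honored on the mobile browsers I target, and `peerConnection` stands in for my existing RTCPeerConnection.

```js
async function addDownscaledTrack(peerConnection) {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { width: { ideal: 1920 }, height: { ideal: 1080 } },
  });
  const [track] = stream.getVideoTracks();
  const sender = peerConnection.addTrack(track, stream);

  // Ask the sender to scale the encoded video down before sending.
  const params = sender.getParameters();
  if (!params.encodings || params.encodings.length === 0) params.encodings = [{}];
  params.encodings[0].scaleResolutionDownBy = 1.5; // 1920x1080 -> 1280x720 on the wire
  await sender.setParameters(params);

  return stream; // used for local display and photo capture at full resolution
}
```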
So my question is: is it possible to use the higher-resolution video locally and limit the video size sent through the WebRTC connection, to reduce bandwidth usage?