I have a Logitech C920 webcam that presents an encoded H.264 capture pin (subtype: MEDIASUBTYPE_H264).

The H.264 pin supports the following resolutions:
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 640x480 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 160x90 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 160x120 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 176x144 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 320x180 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 320x240 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 352x288 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 432x240 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 640x360 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 800x448 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 800x600 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 864x480 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 960x720 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 1024x576 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 1280x720 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 1600x896 @ 30 fps
FORMAT_VideoInfo - subtype: MEDIASUBTYPE_H264 1920x1080 @ 30 fps
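For context, here is a trimmed sketch of the kind of IAMStreamConfig enumeration that produces a capability list like the one above (illustrative only, not my exact code; COM initialization, pin lookup and error handling omitted):

```cpp
#include <dshow.h>
#include <cstdio>
// link with strmiids.lib for IID_IAMStreamConfig

// Dump the capabilities advertised by a capture pin.
// pPin is assumed to be the camera's H.264 output pin,
// already obtained from the capture filter.
void ListH264Caps(IPin *pPin)
{
    IAMStreamConfig *pConfig = nullptr;
    if (FAILED(pPin->QueryInterface(IID_IAMStreamConfig, (void **)&pConfig)))
        return;

    int count = 0, size = 0;
    if (SUCCEEDED(pConfig->GetNumberOfCapabilities(&count, &size)) &&
        size == sizeof(VIDEO_STREAM_CONFIG_CAPS))
    {
        for (int i = 0; i < count; ++i)
        {
            VIDEO_STREAM_CONFIG_CAPS caps;
            AM_MEDIA_TYPE *pmt = nullptr;
            if (FAILED(pConfig->GetStreamCaps(i, &pmt, (BYTE *)&caps)) || !pmt)
                continue;

            if (pmt->formattype == FORMAT_VideoInfo &&
                pmt->cbFormat >= sizeof(VIDEOINFOHEADER))
            {
                VIDEOINFOHEADER *vih = (VIDEOINFOHEADER *)pmt->pbFormat;
                double fps = vih->AvgTimePerFrame
                           ? 10000000.0 / vih->AvgTimePerFrame : 0.0;
                printf("%ldx%ld @ %.0f fps\n",
                       vih->bmiHeader.biWidth, vih->bmiHeader.biHeight, fps);
            }

            // Manual DeleteMediaType() (avoids depending on the base classes).
            if (pmt->cbFormat) CoTaskMemFree(pmt->pbFormat);
            if (pmt->pUnk)     pmt->pUnk->Release();
            CoTaskMemFree(pmt);
        }
    }
    pConfig->Release();
}
```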
In my testing the camera always streams at about 3 Mbit/s regardless of the selected resolution. That is a bit too much, especially for the lower resolutions.
Any idea how to programmatically select the desired bitrate using DirectShow? Or maybe another API?
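For example, I was imagining something along these lines, but I have no idea whether the C920 driver actually exposes ICodecAPI on the pin (or on the capture filter) at all; this is purely a guess on my part, and the bitrate value is just an example:

```cpp
#include <dshow.h>      // pulls in strmif.h, which declares ICodecAPI
#include <initguid.h>   // so the CODECAPI_* GUIDs below get defined
#include <codecapi.h>   // CODECAPI_AVEncCommonMeanBitRate

// Hypothetical attempt: query ICodecAPI and set the mean bitrate.
// pPinOrFilter would be the H.264 output pin or the capture filter.
HRESULT TrySetMeanBitrate(IUnknown *pPinOrFilter, ULONG bitsPerSecond)
{
    ICodecAPI *pCodecApi = nullptr;
    HRESULT hr = pPinOrFilter->QueryInterface(__uuidof(ICodecAPI),
                                              (void **)&pCodecApi);
    if (FAILED(hr))
        return hr;  // interface not exposed here

    VARIANT v;
    VariantInit(&v);
    v.vt = VT_UI4;
    v.ulVal = bitsPerSecond;  // e.g. 500000 for ~0.5 Mbit/s
    hr = pCodecApi->SetValue(&CODECAPI_AVEncCommonMeanBitRate, &v);

    pCodecApi->Release();
    return hr;
}
```

If ICodecAPI is not the right route for this camera, a pointer to whatever interface or API the driver does expose for controlling the encoder would be much appreciated.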