From my camera I get a byte array, where every two bytes (two elements of the array) define one pixel. If I understand correctly, most current display devices only support 256 shades of gray, so for my image I only need the most significant byte (MSB) of each pixel. The problem is that this is streaming video, and creating a new array for the bitmap out of every second byte takes too long. Is it possible to output a "degraded" picture (16 bit -> 8 bit) using some existing classes?
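For reference, here is a minimal sketch of the per-frame conversion I am describing, assuming a Java/AWT environment, an MSB-first (big-endian) camera stream, and hypothetical frame dimensions; the copy loop over every second byte is the part that is too slow for streaming video.

```java
import java.awt.image.BufferedImage;

public class GrayFrameConverter {
    // Hypothetical frame dimensions -- substitute the camera's actual resolution.
    static final int WIDTH = 640;
    static final int HEIGHT = 480;

    // Current per-frame approach: copy the MSB of each 16-bit pixel into a new
    // 8-bit array, then wrap it in an 8-bit grayscale image. The copy loop is
    // the step that takes too long for streaming video.
    static BufferedImage toGray8(byte[] raw16) {
        byte[] msb = new byte[WIDTH * HEIGHT];
        for (int i = 0; i < msb.length; i++) {
            // Assumes big-endian pixels (high byte first); use raw16[2 * i + 1]
            // if the camera delivers little-endian data.
            msb[i] = raw16[2 * i];
        }
        BufferedImage img = new BufferedImage(WIDTH, HEIGHT, BufferedImage.TYPE_BYTE_GRAY);
        img.getRaster().setDataElements(0, 0, WIDTH, HEIGHT, msb);
        return img;
    }
}
```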
Edit: If there is a class that can create a grayscale bitmap with 16 bits per pixel, that would also interest me.
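To clarify the edit, this is a sketch (again assuming Java's BufferedImage) of what I mean by a 16-bit-per-pixel grayscale bitmap, using TYPE_USHORT_GRAY; the byte pairs still have to be repacked into shorts, so I am not sure it avoids the per-frame copy either.

```java
import java.awt.image.BufferedImage;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class Gray16Frame {
    // Build a 16-bit grayscale image directly from the raw two-byte pixels.
    // Width/height are supplied by the caller; byte order is assumed big-endian.
    static BufferedImage toGray16(byte[] raw16, int width, int height) {
        short[] pixels = new short[width * height];
        ByteBuffer.wrap(raw16).order(ByteOrder.BIG_ENDIAN).asShortBuffer().get(pixels);
        BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_USHORT_GRAY);
        img.getRaster().setDataElements(0, 0, width, height, pixels);
        return img;
    }
}
```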