From my understanding of digital camera spec sheets, each output color pixel is made out of 4 real pixels on the CCD (a Bayer pattern: one red, one blue, and two green photosites). However, when reading from a camera, for example with OpenCV, one gets an NxMx3 array of pixels, and the two green photosites have already been averaged into a single green value.
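For reference, this is roughly how I read frames (a minimal sketch, assuming a standard webcam at index 0 and the default OpenCV backend):

```python
import cv2

# Open the default camera (index 0); the backend already delivers
# demosaiced frames, so the individual photosite values never show up here.
cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise RuntimeError("Could not open camera 0")

ok, frame = cap.read()
cap.release()
if not ok:
    raise RuntimeError("Failed to grab a frame")

# Shape is (N, M, 3): one B, G, R value per output pixel, i.e. the two
# green photosites of each 2x2 Bayer cell have already been combined.
print(frame.shape, frame.dtype)
```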
From what I understand, OpenCV lets you convert RGB images to grayscale, but I couldn't find a way of getting the raw values from the CCD. Of course, it could be that there is a lower-level limitation (i.e. the conversion to a color space happens in the camera's electronics and not on the computer), or that some interpolation is involved, so that the sensor really provides NxM raw photosite values rather than NxMx4, and each output pixel is interpolated from its neighbours.
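The closest I've found is converting after the fact, plus a capture property that is supposed to disable the color conversion; whether it actually yields Bayer data seems to depend on the backend and driver (a sketch of what I tried; `CAP_PROP_CONVERT_RGB` may simply be ignored by some backends):

```python
import cv2

cap = cv2.VideoCapture(0)

# Grayscale is easy, but it is computed from the already-demosaiced BGR frame.
ok, frame = cap.read()
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Ask the backend not to convert to RGB; depending on the driver this may
# return the unconverted buffer (e.g. YUV or Bayer), or have no effect at all.
cap.set(cv2.CAP_PROP_CONVERT_RGB, 0)
ok, raw = cap.read()
print(gray.shape, raw.shape if ok else None)

cap.release()
```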
Is there any way of getting the raw sensor data from a camera with OpenCV, or is that information only available in the RAW files produced by commercial cameras?