
The problem: I receive a YUV_420_888 image from an Android device as a byte buffer (the image plane buffers simply concatenated). I know the dimensions of the image and need to display it in my GUI.

What I have so far: At the moment I can only display the grayscale Y plane, using the following function:

private BitmapImage GetImageFromBuffer(byte[] imgBuffer)
{
    // Only the Y (luma) plane ends up in this grayscale image; the UV data is ignored.
    Image<Gray, byte> emguImg = new Image<Gray, byte>(1280, 720);
    emguImg.Bytes = imgBuffer;

    var img = new BitmapImage();
    using (MemoryStream ms = new MemoryStream(emguImg.ToJpegData()))
    {
        img.BeginInit();
        // OnLoad decodes the image during EndInit, so the stream can be disposed afterwards.
        img.CacheOption = BitmapCacheOption.OnLoad;
        img.CreateOptions = BitmapCreateOptions.PreservePixelFormat;
        img.StreamSource = ms;
        img.EndInit();
    }
    return img;
}

I have also tested similar code using the Image.ToBitmap() function and copying the intermediate Bitmap to the memory stream, which serves as the source for the BitmapImage. In any case, I would like to create a BitmapImage or BitmapSource (or any type I can use to display on the GUI) from the incoming byte[]. As far as I could read up on it, I have to create a Mat instance from the byte array, convert it to RGB and then save it to a displayable format.

Roland Deschain

1 Answer


The following code seems to do the trick:

private BitmapImage GetImageFromBuffer(byte[] imgBuffer)
{
    unsafe
    {
        fixed (byte* p = imgBuffer)
        {
            IntPtr ptr = (IntPtr)p;
            // 1080 rows = 720 * 3 / 2: the full-resolution Y plane (720 rows)
            // followed by the interleaved UV data (360 rows), viewed as one single-channel Mat.
            Mat yuvMat = new Mat(1080, 1280, Emgu.CV.CvEnum.DepthType.Cv8U, 1, ptr, 1280);
            Mat rgbMat = new Mat();
            CvInvoke.CvtColor(yuvMat, rgbMat, Emgu.CV.CvEnum.ColorConversion.Yuv420Sp2Rgb);

            var img = new BitmapImage();
            using (MemoryStream ms = new MemoryStream())
            {
                rgbMat.Bitmap.Save(ms, ImageFormat.Bmp);
                ms.Position = 0; // rewind so the decoder reads the BMP data from the start

                img.BeginInit();
                img.CacheOption = BitmapCacheOption.OnLoad; // decode now, so the stream can be disposed
                img.CreateOptions = BitmapCreateOptions.PreservePixelFormat;
                img.StreamSource = ms;
                img.EndInit();
            }
            return img;
        }
    }
}

Maybe someone can confirm whether this is the way to go, or suggest improvements.
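One possible refinement, sketched below under a few assumptions: since rgbMat already holds the converted pixels, the System.Drawing.Bitmap and the BMP round-trip can be skipped by wrapping the Mat's data in a WPF BitmapSource directly. This is an untested sketch, not a drop-in replacement for the code above; it relies on Emgu CV's Mat.DataPointer and Mat.Step properties and WPF's BitmapSource.Create overload that copies from an IntPtr, and GetBitmapSourceFromBuffer, width and height are illustrative names.

// Sketch: YUV420 (NV21-style) byte[] -> BitmapSource without System.Drawing.
// Needs Emgu.CV, System.Windows.Media and System.Windows.Media.Imaging.
private BitmapSource GetBitmapSourceFromBuffer(byte[] imgBuffer, int width, int height)
{
    unsafe
    {
        fixed (byte* p = imgBuffer)
        {
            // Same layout trick as above: height * 3 / 2 rows of single-channel data.
            using (Mat yuvMat = new Mat(height * 3 / 2, width,
                       Emgu.CV.CvEnum.DepthType.Cv8U, 1, (IntPtr)p, width))
            using (Mat rgbMat = new Mat())
            {
                CvInvoke.CvtColor(yuvMat, rgbMat, Emgu.CV.CvEnum.ColorConversion.Yuv420Sp2Rgb);

                // BitmapSource.Create copies the pixel data, so the Mats can be
                // disposed right away; Rgb24 matches the 3-channel 8-bit output.
                return BitmapSource.Create(
                    width, height, 96, 96,
                    System.Windows.Media.PixelFormats.Rgb24, null,
                    rgbMat.DataPointer, rgbMat.Step * height, rgbMat.Step);
            }
        }
    }
}

If the colours come out swapped, switching to the BGR variant of the conversion together with PixelFormats.Bgr24 would be the thing to try.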

Roland Deschain
  • Hey there, based on the code in your question it looks like your image resolution is 720p, but in your answer here you use 1080 and 1280 as "rows" and "cols." Is that based on a different image resolution, or is there some calculation you did to get those numbers? – Eternal Ambiguity Jul 03 '21 at 16:31
  • Apologies, I found the answer: apparently one multiplies the image height by 1.5 to get the "1080" value in the function. – Eternal Ambiguity Jul 03 '21 at 16:44
  • Exactly, this is due to the way the YUV image format is laid out :) Here is a nice illustration that shows why you need to multiply by 1.5 (see also the quick size check below): https://stackoverflow.com/questions/27822017/planar-yuv420-data-layout – Roland Deschain Jul 04 '21 at 09:07
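To make that concrete, here is a quick size check for the 1280x720 frame used above (illustrative numbers only):

// YUV 4:2:0 stores a full-resolution Y plane plus U and V at quarter resolution each.
int width  = 1280, height = 720;
int ySize  = width * height;   // 921,600 luma bytes
int uvSize = ySize / 2;        // 460,800 chroma bytes (U and V together)
int total  = ySize + uvSize;   // 1,382,400 bytes in the whole buffer
int rows   = total / width;    // 1,080 rows when viewed as a width-wide single-channel Mat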