
I have a small piece of C# code that uses a Kinect to detect up to 4 glyphs and draws a polygon between them on a canvas, as seen here:

[Image: polygon drawn between the detected glyphs on the canvas]

I've tried to follow along with this in order to implement 2D augmented reality and project an image within the created polygon. I've read in a source image and tried to apply the BackwardQuadrilateralTransformation to it, but I can't seem to display the transformed image. I'm probably using the wrong method; I've tried converting the image and painting it onto a canvas with no luck. I'm not sure whether I'm just massively misunderstanding the method, or maybe it isn't possible. Any help would be greatly appreciated, and I can supply more sample code if required.

    private void GlyphBackQuad(List<IntPoint> quadpoints)
    {
        // Read the sample image and wrap it as an AForge UnmanagedImage
        Bitmap srcImage = new Bitmap(/* my sample image filepath */);
        UnmanagedImage sourceImage = UnmanagedImage.FromManagedImage(srcImage);
        // Transform the image into the quadrilateral defined by the glyph corners
        BackwardQuadrilateralTransformation filter = new BackwardQuadrilateralTransformation(sourceImage, quadpoints);
        filter.Apply(sourceImage);

        // Convert back to a managed bitmap and paint it onto the canvas
        Bitmap bmp = sourceImage.ToManagedImage();
        ImageBrush ib = new ImageBrush();
        ib.ImageSource = ConvertDrawingImage2MediaImageSource(bmp);
        PolyCanvas.Background = ib;
    }
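
For context, `ConvertDrawingImage2MediaImageSource` isn't shown above; it's a standard `System.Drawing.Bitmap` to WPF `ImageSource` conversion, roughly along these lines (simplified sketch, the exact implementation may differ):

    private System.Windows.Media.ImageSource ConvertDrawingImage2MediaImageSource(Bitmap bitmap)
    {
        using (var stream = new System.IO.MemoryStream())
        {
            // Encode the GDI+ bitmap into a stream that WPF can decode
            bitmap.Save(stream, System.Drawing.Imaging.ImageFormat.Png);
            stream.Position = 0;

            var image = new System.Windows.Media.Imaging.BitmapImage();
            image.BeginInit();
            image.CacheOption = System.Windows.Media.Imaging.BitmapCacheOption.OnLoad;
            image.StreamSource = stream;
            image.EndInit();
            image.Freeze(); // makes the image usable from other threads (e.g. a Kinect frame handler)
            return image;
        }
    }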

After playing around with the code further, I think I have developed a partial solution. It still needs some work, but hopefully this is more helpful for people to read/debug.

    private void GlyphBackQuad(List<IntPoint> quadpoints, Bitmap bmp)
    {
        // Read in bitmap source image and clone it to the same format as destination
        Bitmap srcImage = AForge.Imaging.Image.Clone(new Bitmap(/* my sample filepath */), System.Drawing.Imaging.PixelFormat.Format24bppRgb);
        // Lock the destination frame so it can be wrapped as an unmanaged image
        System.Drawing.Imaging.BitmapData bitmapData = bmp.LockBits(new System.Drawing.Rectangle(0, 0, bmp.Width, bmp.Height), System.Drawing.Imaging.ImageLockMode.ReadWrite, System.Drawing.Imaging.PixelFormat.Format24bppRgb);
        // Convert to unmanaged image
        UnmanagedImage unmanagedImage = new UnmanagedImage(bitmapData);
        // Transform the source image into the quadrilateral defined by the glyph corners
        BackwardQuadrilateralTransformation filter = new BackwardQuadrilateralTransformation(srcImage, quadpoints);
        filter.ApplyInPlace(unmanagedImage);
        // Convert back to a managed image, release the lock on the original bitmap, and save
        Bitmap managedImage = unmanagedImage.ToManagedImage();
        bmp.UnlockBits(bitmapData);
        managedImage.Save(/* my save filepath */, System.Drawing.Imaging.ImageFormat.Png);
    }
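
To get the transformed frame back onto `PolyCanvas` rather than saving it to disk, I'm aiming for something roughly like the sketch below (`ShowTransformedFrame` is just an illustrative name, and the `Dispatcher` call is only there in case the Kinect frame arrives on a background thread):

    private void ShowTransformedFrame(Bitmap managedImage)
    {
        PolyCanvas.Dispatcher.Invoke(new Action(() =>
        {
            // Reuse the bitmap-to-ImageSource conversion from the first snippet
            ImageBrush ib = new ImageBrush();
            ib.ImageSource = ConvertDrawingImage2MediaImageSource(managedImage);
            PolyCanvas.Background = ib;
        }));
    }

This is the part that still isn't working for me; the background doesn't seem to update after the conversion.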
  • Why do you need to pass the `sourceImage` twice? What does the result look like? – Jeroen van Langen Oct 03 '13 at 17:20
  • I think you need to `Apply()` a new image (or something like your sprite) that has to be transformed into the quad. – Jeroen van Langen Oct 03 '13 at 17:28
  • @JeroenvanLangen I think you were right that I was using `Apply()` in the wrong way. I've managed to capture bitmaps with the sample overlaid/skewed properly, but I can't seem to update my video source after converting a managed bitmap to an image source. – Cryptomnesia Oct 04 '13 at 13:08

0 Answers