
I am trying to figure out why argb32_image_mark_rgb24 accounts for roughly 25% of execution time in the Mac FreeRDP client. It is called from CGContextDrawImage, which I invoke in my drawRect: method. The drawRect: code looks like this:

CGContextRef cgContext = [[NSGraphicsContext currentContext] graphicsPort];
CGImageRef cgImage = CGBitmapContextCreateImage(self->bitmap_context);
/* Clip to the dirty rect, then draw the whole surface scaled to the view bounds */
CGContextClipToRect(cgContext, CGRectMake(rect.origin.x, rect.origin.y, rect.size.width, rect.size.height));
CGContextDrawImage(cgContext, CGRectMake(0, 0, [self bounds].size.width, [self bounds].size.height), cgImage);
CGImageRelease(cgImage);

The bitmap context is created like this:

CGContextRef mac_create_bitmap_context(rdpContext* context)
{
    CGContextRef bitmap_context;
    rdpGdi* gdi = context->gdi;

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    if (gdi->dstBpp == 16)
    {
        /* 16bpp: RGB555, little-endian, top bit ignored (x1r5g5b5) */
        bitmap_context = CGBitmapContextCreate(gdi->primary_buffer,
                               gdi->width, gdi->height, 5, gdi->width * 2, colorSpace,
                               kCGBitmapByteOrder16Little | kCGImageAlphaNoneSkipFirst);
    }
    else
    {
        /* 32bpp: xRGB, little-endian (BGRX byte order in memory) */
        bitmap_context = CGBitmapContextCreate(gdi->primary_buffer,
                               gdi->width, gdi->height, 8, gdi->width * 4, colorSpace,
                               kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst);
    }

    CGColorSpaceRelease(colorSpace);

    return bitmap_context;
}

gdi->primary_buffer is a software buffer into which RDP drawing operations are rendered. Right now, the RDP rendering library supports RGB565, RGB555, and most variants of 32bpp.

From what I understand of the API documentation, CGBitmapContextCreate() creates an object that wraps my software buffer but does not copy it right away; the pixels are only copied when CGContextDrawImage() is called.

I would like to understand the following:

What does argb32_image_mark_rgb24 do, exactly? Is it converting ARGB32 pixels to 3-byte RGB24 pixels? If so, would adding 3-byte RGB24 as a software buffer format let me avoid what appears to be a costly per-frame conversion? (My guess at the kind of repacking involved is sketched below.)
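If the name means what I think it means, each pixel would need to be repacked something like this. This is purely an illustration of the suspected work, not Apple's actual code, and it assumes little-endian BGRX byte order in the source:

#include <stddef.h>
#include <stdint.h>

/* Hypothetical illustration: repack 32bpp xRGB (BGRX in memory, little-endian)
 * into packed 24bpp RGB. One pass over every pixel, every frame. */
static void repack_xrgb32_to_rgb24(const uint8_t* src, uint8_t* dst,
                                   size_t width, size_t height)
{
    for (size_t i = 0; i < width * height; i++)
    {
        dst[0] = src[2]; /* R */
        dst[1] = src[1]; /* G */
        dst[2] = src[0]; /* B */
        src += 4;        /* skip the unused X byte */
        dst += 3;
    }
}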

Otherwise, how could I change the current calls, which set a clipping rectangle and then draw the whole surface, into a call that copies from a source rectangle to a destination rectangle? CGContextDrawImage() only takes one rectangle, not two.
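The closest thing I have found is cropping with CGImageCreateWithImageInRect() and drawing the crop into the destination rect. A sketch, where srcRect and dstRect are placeholders; I don't know whether this avoids the conversion:

CGImageRef fullImage = CGBitmapContextCreateImage(self->bitmap_context);
/* Crop without copying pixel data, then draw the crop into the destination rect */
CGImageRef cropped = CGImageCreateWithImageInRect(fullImage, srcRect);
CGContextDrawImage(cgContext, dstRect, cropped);
CGImageRelease(cropped);
CGImageRelease(fullImage);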

Thank you!

awakecoding

1 Answer


Converting a whole screen of pixels from one format to another every frame is definitely going to suck down some CPU.

I’m a bit surprised that the destination context is 24-bit, though. It’s been a while since I’ve been down at that level, but in MY day the final screen was usually 32-bit, with 8 bits just ignored but there for laughs.

If you log the depth and color channels and all that of the graphicsPort, what do you see?
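(Something like this, assuming you're inside an NSView; the NSWindowDepth query functions are part of AppKit:)

NSWindowDepth depth = [[self window] depthLimit];
if (depth == 0) /* 0 means "use the default depth limit" */
    depth = [NSWindow defaultDepthLimit];
NSLog(@"bits/sample: %ld, bits/pixel: %ld, colorspace: %@",
      (long)NSBitsPerSampleFromDepth(depth),
      (long)NSBitsPerPixelFromDepth(depth),
      NSColorSpaceFromDepth(depth));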

Are you in an odd kind of window? What hardware is this on?

There are some tricks you can do in OpenGL where you fill in a texture and then blat that to the screen, but, again, you’d want to make sure the pixel format matches what OpenGL can do natively. My understanding is it can handle some pretty funky formats, though, and it’s a super-fast path.
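(Sketch only, using legacy desktop GL. GL_BGRA with GL_UNSIGNED_INT_8_8_8_8_REV is the combination Apple has long documented as a fast upload path for little-endian xRGB data; width, height, and pixels stand in for the RDP buffer:)

#include <OpenGL/gl.h>

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
/* No swizzling on upload: BGRA + 8_8_8_8_REV matches the buffer's byte order */
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA, width, height, 0,
             GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, pixels);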

Maybe check out CGLTexImageIOSurface2D() and IOSurfaceRef. There’s some example code here.
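(A rough sketch of the moving parts, assuming a 32bpp buffer; cglContext, width, and height are placeholders:)

#include <IOSurface/IOSurface.h>
#include <OpenGL/OpenGL.h>
#include <OpenGL/CGLIOSurface.h>
#include <OpenGL/gl.h>

/* Create a GPU-visible surface the renderer can write into directly */
static IOSurfaceRef create_surface(int width, int height)
{
    int bpe = 4; /* bytes per BGRX pixel */
    CFNumberRef w = CFNumberCreate(NULL, kCFNumberIntType, &width);
    CFNumberRef h = CFNumberCreate(NULL, kCFNumberIntType, &height);
    CFNumberRef b = CFNumberCreate(NULL, kCFNumberIntType, &bpe);
    const void* keys[] = { kIOSurfaceWidth, kIOSurfaceHeight, kIOSurfaceBytesPerElement };
    const void* vals[] = { w, h, b };
    CFDictionaryRef props = CFDictionaryCreate(NULL, keys, vals, 3,
                                &kCFTypeDictionaryKeyCallBacks,
                                &kCFTypeDictionaryValueCallBacks);
    IOSurfaceRef surface = IOSurfaceCreate(props);
    CFRelease(props);
    CFRelease(w);
    CFRelease(h);
    CFRelease(b);
    return surface;
}

/* Then bind the surface to a rectangle texture in your CGL context: */
CGLTexImageIOSurface2D(cglContext, GL_TEXTURE_RECTANGLE_ARB, GL_RGBA,
                       width, height, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV,
                       surface, 0);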

Wil Shipley