Xcode has the ability to capture OpenGL ES frames from the iPad, and that's great! I would like to extend this functionality and capture an entire OpenGL ES movie of my application. Is there a way to do that? If it's not possible with Xcode, how can I do it without much effort or big changes to my code? Thank you very much!
- "Xcode has the ability to capture OpenGL ES frames from the iPad" - Are you referring to the new OpenGL ES debugger? That's not really a frame grabber, and you can't extend it to do anything beyond what it currently does. – Brad Larson Apr 16 '12 at 02:26
- Yes, I'm talking about the GL ES debugger. Strange to learn that it doesn't use a frame grabber. Does it simulate all the code, then? Anyway, thank you! – tkcast Apr 24 '12 at 23:06
- It interacts with the OpenGL ES driver to pull render and depth buffer information at various breakpoints you set, along with state and other settings. It's not intended as a capture tool, but as a debugger. You're not going to be able to use it to record your application. – Brad Larson Apr 25 '12 at 16:14
- OK, so isn't there any other way to record my app? – tkcast Apr 26 '12 at 20:09
1 Answer
I use a very simple technique that requires only a few lines of code.
You can capture each OpenGL ES frame into a UIImage with the following method:
- (UIImage *)captureScreen {
    NSInteger dataLength = framebufferWidth * framebufferHeight * 4;

    // Allocate buffers: one for the raw glReadPixels output, one for the flipped copy.
    GLuint *buffer = (GLuint *)malloc(dataLength);
    GLuint *resultsBuffer = (GLuint *)malloc(dataLength);

    // Read the current framebuffer contents.
    glReadPixels(0, 0, framebufferWidth, framebufferHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    // OpenGL's origin is bottom-left, UIImage's is top-left, so flip the rows vertically.
    for (int y = 0; y < framebufferHeight; y++) {
        for (int x = 0; x < framebufferWidth; x++) {
            resultsBuffer[x + y * framebufferWidth] = buffer[x + (framebufferHeight - 1 - y) * framebufferWidth];
        }
    }
    free(buffer);

    // Wrap the flipped pixels in a data provider; releaseScreenshotData frees them later.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, resultsBuffer, dataLength, releaseScreenshotData);

    // Describe the pixel layout: 8 bits per channel, 32 bits per pixel, tightly packed rows.
    const int bitsPerComponent = 8;
    const int bitsPerPixel = 4 * bitsPerComponent;
    const int bytesPerRow = 4 * framebufferWidth;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    // Treat the fourth byte as padding; glReadPixels output is RGBA in memory.
    CGBitmapInfo bitmapInfo = kCGImageAlphaNoneSkipLast | kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // Make the CGImage from the pixel data.
    CGImageRef imageRef = CGImageCreate(framebufferWidth, framebufferHeight, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
    CGColorSpaceRelease(colorSpaceRef);
    CGDataProviderRelease(provider);

    // Then make the UIImage from that.
    UIImage *image = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return image;
}
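The data provider above is created with a release callback named releaseScreenshotData, which isn't shown in the answer. A minimal sketch that simply frees the flipped pixel buffer once Core Graphics is done with it could look like this:
// Release callback passed to CGDataProviderCreateWithData above.
// Core Graphics calls it when the provider no longer needs the bytes,
// so this is where the flipped pixel buffer gets freed.
void releaseScreenshotData(void *info, const void *data, size_t size) {
    free((void *)data);
}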
Then you will capture each frame in your main loop:
- (void)onTimer {
    // Compute and render the new frame.
    [self update];

    // Recording: save the rendered frame as a numbered JPEG.
    if (recordingMode == RecordingModeMovie) {
        recordingFrameNum++;
        UIImage *image = [self captureScreen];
        NSString *fileName = [NSString stringWithFormat:@"%d.jpg", (int)recordingFrameNum];
        [UIImageJPEGRepresentation(image, 1.0) writeToFile:[basePath stringByAppendingPathComponent:fileName] atomically:NO];
    }
}
At the end you will have tons of JPEG files, which can easily be assembled into a movie with Time Lapse Assembler.
If you want a smooth 30 FPS movie, hard-code your simulation time step to 1 / 30.0 seconds per frame instead of using the real elapsed time.
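A minimal sketch of what that could look like, assuming your update method drives the simulation with a delta time (stepSimulationBy: and realElapsedTime are hypothetical names standing in for your own code):
- (void)update {
    NSTimeInterval dt;
    if (recordingMode == RecordingModeMovie) {
        // Fixed step while recording, so every captured frame covers exactly 1/30 s.
        dt = 1.0 / 30.0;
    } else {
        // Normal play: use the real elapsed time (hypothetical helper).
        dt = [self realElapsedTime];
    }
    // Hypothetical entry point into your simulation/render update.
    [self stepSimulationBy:dt];
}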

– Split
- One addition is that you could easily use AVAssetWriter to assemble the JPEGs into a movie. – wfbarksdale Feb 10 '15 at 08:10
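For reference, a rough sketch of that approach (not from the original answer): it feeds pixel buffers to an AVAssetWriter at 30 FPS, which avoids writing intermediate JPEGs to disk. pixelBufferFromImage: is a hypothetical helper you would write to convert each captured UIImage into a CVPixelBufferRef, and the size keys reuse the framebufferWidth/framebufferHeight values from the answer.
#import <AVFoundation/AVFoundation.h>

- (void)writeFrames:(NSArray *)frames toURL:(NSURL *)outputURL {
    NSError *error = nil;
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                     fileType:AVFileTypeQuickTimeMovie
                                                        error:&error];
    NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                AVVideoWidthKey  : @(framebufferWidth),
                                AVVideoHeightKey : @(framebufferHeight) };
    AVAssetWriterInput *input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                   outputSettings:settings];
    AVAssetWriterInputPixelBufferAdaptor *adaptor =
        [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                                                         sourcePixelBufferAttributes:nil];
    [writer addInput:input];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];

    for (NSUInteger i = 0; i < [frames count]; i++) {
        // Wait until the writer can accept more data (crude back-pressure handling).
        while (!input.readyForMoreMediaData) {
            [NSThread sleepForTimeInterval:0.01];
        }
        // Hypothetical helper: convert a captured UIImage into a CVPixelBufferRef.
        CVPixelBufferRef pixelBuffer = [self pixelBufferFromImage:[frames objectAtIndex:i]];
        // Place frame i at i/30 s on the movie timeline (30 FPS).
        [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:CMTimeMake(i, 30)];
        CVPixelBufferRelease(pixelBuffer);
    }

    [input markAsFinished];
    [writer finishWritingWithCompletionHandler:^{
        // The movie is ready at outputURL.
    }];
}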