
Is there a way to capture CAEmitterCells (generated using a CAEmitterLayer) when capturing the iOS device screen?
UIGetScreenImage() works, but since it's a private API I'm not allowed to use it.
UIGraphicsBeginImageContext doesn't seem to work either; the particles are simply omitted from the resulting image.
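For reference, this is the standard snapshot approach I tried (a minimal sketch; viewToCapture is a placeholder name for the view being captured):

// Standard layer-snapshot approach; the CAEmitterLayer's particles are
// missing from the image this produces. viewToCapture is a placeholder.
UIGraphicsBeginImageContextWithOptions(viewToCapture.bounds.size, NO, 0.0);
[viewToCapture.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();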

EDIT: Here is the code I'm currently using to capture the view. I'm actually recording a 30-second-long video of the screen, using the code provided by aroth here. It works by capturing 25 images per second of itself (it's a UIView subclass) and its subviews (in our case, including the UIView whose layer is the CAEmitterLayer), and uses AVAssetWriter to compose the recording.

It's quite long, so I'll just place the relevant lines here. I converted the code to ARC using the conversion tool in Xcode, so it may differ slightly from the original in terms of memory management.

- (CGContextRef) createBitmapContextOfSize:(CGSize) size {
    CGContextRef    context = NULL;
    CGColorSpaceRef colorSpace;
    size_t          bitmapByteCount;
    size_t          bitmapBytesPerRow;

    bitmapBytesPerRow   = (size_t)size.width * 4;   // 4 bytes per pixel
    bitmapByteCount     = bitmapBytesPerRow * (size_t)size.height;
    colorSpace = CGColorSpaceCreateDeviceRGB();
    //bitmapData is an ivar; free any buffer left over from a previous call
    if (bitmapData != NULL) {
        free(bitmapData);
    }
    bitmapData = malloc(bitmapByteCount);
    if (bitmapData == NULL) {
        CGColorSpaceRelease(colorSpace);
        fprintf(stderr, "Memory not allocated!");
        return NULL;
    }

    context = CGBitmapContextCreate(bitmapData,
                                    size.width,
                                    size.height,
                                    8,      // bits per component
                                    bitmapBytesPerRow,
                                    colorSpace,
                                    kCGImageAlphaNoneSkipFirst);
    CGColorSpaceRelease(colorSpace);

    if (context == NULL) {
        free(bitmapData);
        bitmapData = NULL;  //avoid a dangling pointer (and a double free on the next call)
        fprintf(stderr, "Context not created!");
        return NULL;
    }
    CGContextSetAllowsAntialiasing(context, NO);

    return context;
}

//static int frameCount = 0;            //debugging
- (void) drawRect:(CGRect)rect {
    NSDate* start = [NSDate date];
    CGContextRef context = [self createBitmapContextOfSize:self.frame.size];
    if (context == NULL) {
        return;
    }

    //Core Graphics bitmap contexts use a flipped coordinate system relative
    //to UIKit, so flip vertically or the image renders upside-down and mirrored
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, self.frame.size.height);
    CGContextConcatCTM(context, flipVertical);

    [self.layer renderInContext:context];

    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage* background = [UIImage imageWithCGImage: cgImage];
    CGImageRelease(cgImage);

    self.currentScreen = background;

    //debugging
    //if (frameCount < 40) {
    //      NSString* filename = [NSString stringWithFormat:@"Documents/frame_%d.png", frameCount];
    //      NSString* pngPath = [NSHomeDirectory() stringByAppendingPathComponent:filename];
    //      [UIImagePNGRepresentation(self.currentScreen) writeToFile: pngPath atomically: YES];
    //      frameCount++;
    //}

    //NOTE:  to record a scrollview while it is scrolling you need to implement your UIScrollViewDelegate such that it calls
    //       'setNeedsDisplay' on the ScreenCaptureView.
    if (_recording) {
        float millisElapsed = [[NSDate date] timeIntervalSinceDate:startedAt] * 1000.0;
        [self writeVideoFrameAtTime:CMTimeMake((int)millisElapsed, 1000)];
    }

    float processingSeconds = [[NSDate date] timeIntervalSinceDate:start];
    float delayRemaining = (1.0 / self.frameRate) - processingSeconds;

    CGContextRelease(context);

    //redraw at the specified framerate
    [self performSelector:@selector(setNeedsDisplay) withObject:nil afterDelay:delayRemaining > 0.0 ? delayRemaining : 0.01];
}
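For context, writeVideoFrameAtTime: is part of aroth's code that I omitted above. A minimal sketch of what it does, assuming avAdaptor is an AVAssetWriterInputPixelBufferAdaptor ivar and videoWriterInput is its AVAssetWriterInput (both set up when recording starts):

//Hedged sketch of the omitted writeVideoFrameAtTime: method; avAdaptor
//and videoWriterInput are assumed ivars configured when recording begins.
- (void) writeVideoFrameAtTime:(CMTime)time {
    if (![videoWriterInput isReadyForMoreMediaData]) {
        return; //the writer is busy; drop this frame
    }
    CGImageRef cgImage = [self.currentScreen CGImage];

    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(NULL, avAdaptor.pixelBufferPool, &pixelBuffer);
    if (pixelBuffer == NULL) {
        return;
    }

    //draw the captured frame into the pixel buffer
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    void *data = CVPixelBufferGetBaseAddress(pixelBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(data,
                                                 CGImageGetWidth(cgImage),
                                                 CGImageGetHeight(cgImage),
                                                 8,
                                                 CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                 colorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context,
                       CGRectMake(0, 0, CGImageGetWidth(cgImage), CGImageGetHeight(cgImage)),
                       cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    [avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
    CVPixelBufferRelease(pixelBuffer);
}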

Really hope this helps. Thanks for your support!

Zoltán Matók

1 Answer


Did you try using -[CALayer renderInContext:]?
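For example, something like this (a minimal sketch, rendering the key window's backing layer):

//render the whole window's layer hierarchy into an image context
UIWindow *window = [[UIApplication sharedApplication] keyWindow];
UIGraphicsBeginImageContextWithOptions(window.bounds.size, NO, 0.0);
[window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();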

nielsbot
  • I think he wants the whole screen image. Ah, so maybe you mean to render the view he wants (which probably contains the CAEmitterLayer). That sounds promising. – David H Aug 08 '12 at 18:07
  • thinking about it further, `UIWindow` is also a `UIView`, and hence has a backing layer--you could render that perhaps. – nielsbot Aug 08 '12 at 19:50
  • My thought was if he can get all else in one image, he might be able to get the CAEmitterLayer to render to a transparent image, then overlay that on the first image to get a composite (see the sketch after these comments). You have to jump through hoops to get a CATiledLayer to render (ask me how I know this :-) – David H Aug 08 '12 at 20:16
  • @nielsbot Indeed I did, I updated my question so that it is more clear what I'm aiming for. I did try the UIWindow rendering, didn't help. – Zoltán Matók Aug 08 '12 at 20:50
  • @David H Could you elaborate on rendering the CAEmitterLayer to a transparent image? Specifically, how to do it? :) Thanks for the ideas by the way! – Zoltán Matók Aug 08 '12 at 20:53
  • See this: http://stackoverflow.com/q/3454356/96716 The question is whether it will be transparent or not where there are no emissions. I'm going to guess yes. – David H Aug 08 '12 at 21:25
  • @David H The particles are simply not recorded when using [CALayer renderInContext:], so I cannot capture them this way. – Zoltán Matók Aug 14 '12 at 17:56
  • Yeah, I understand. I tried every trick I could think of to make this work - I even created my own project and tried drawInContext while the emitter was active. The only (and a real stretch) method would be to try to use OpenGL to read the bits from the screen, if it's even possible on iOS (I did this on OS X a few years ago, but my guess is Apple doesn't open this path up on the phone). I saw this was expiring and tried some more tricks this morning, to no avail :-( – David H Aug 14 '12 at 18:42
  • It's possible the particle emitter works at an even lower level than the standard Core Animation render pipe.. Even the note in CALayer.h says "WARNING: currently this method does not implement the full CoreAnimation composition model, use with caution." You could write your own emitter layer instead perhaps.. – nielsbot Aug 14 '12 at 20:01
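Following up on the compositing idea discussed above, a hedged sketch of overlaying the emitter layer on a base snapshot (baseImage and emitterLayer are placeholders; per this thread, renderInContext: still skips the live particles, so this illustrates the idea rather than a working fix):

//Sketch of the compositing idea: render the emitter layer alone into a
//transparent context on top of the opaque background snapshot.
//baseImage and emitterLayer are placeholders for the app's own objects.
UIGraphicsBeginImageContextWithOptions(baseImage.size, NO, 0.0);
[baseImage drawAtPoint:CGPointZero];                          //background snapshot
[emitterLayer renderInContext:UIGraphicsGetCurrentContext()]; //overlay (transparent where there are no emissions)
UIImage *composite = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();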