
I've built my own custom camera class using AVCaptureSession and a lot of the code from Apple's AVCam demo app. The method I use to capture the image is essentially verbatim from Apple's app, but when I snap a picture I get a "Received memory warning" in my console. It doesn't happen every time, but almost always on the first picture. This is the code in question:

- (void)capImage { //method to capture image from AVCaptureSession video feed
    dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.

        UIDeviceOrientation orientation = [[UIDevice currentDevice] orientation];
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:(AVCaptureVideoOrientation)orientation];

        // Flash set to Auto for Still Capture
        //[AVCamViewController setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];

        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

            if (imageDataSampleBuffer)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
                [self processImage:[UIImage imageWithData:imageData]];
            }
        }];
    });
}
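
For reference, here is a minimal sketch of the same completion handler, reworked so the JPEG data is only decoded into one UIImage and the work is wrapped in an @autoreleasepool (the local `connection` variable is just for readability, and `processImage:` is my own method from above). This is only an illustration of where the large allocations are, not something I've confirmed fixes the warning:

AVCaptureConnection *connection = [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];

[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

    if (!imageDataSampleBuffer) {
        return;
    }

    @autoreleasepool {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];

        // Decode the JPEG once and reuse the same UIImage for both the
        // asset-library write and my own processing.
        UIImage *image = [[UIImage alloc] initWithData:imageData];

        [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage]
                                                         orientation:(ALAssetOrientation)[image imageOrientation]
                                                     completionBlock:nil];

        [self processImage:image];
    }
}];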

Any ideas why this would be happening?

vinylDeveloper
