
I am writing an application for taking long-exposure images.

I used func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) to get a CMSampleBuffer so I can apply a CIFilter using CILightenBlendMode.
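
For reference, a rough sketch of how an AVCaptureVideoDataOutput can be wired up to deliver frames to that delegate. The queue label, session preset and the alwaysDiscardsLateVideoFrames setting below are illustrative, not necessarily my exact setup:

import AVFoundation

// Assumes this lives in the object that implements AVCaptureVideoDataOutputSampleBufferDelegate.
// In practice, keep a strong reference to the session (e.g. as a property).
func setupCaptureSession() {
    let session = AVCaptureSession()
    session.sessionPreset = AVCaptureSessionPresetHigh

    let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    if let input = try? AVCaptureDeviceInput(device: device) {
        session.addInput(input)
    }

    let output = AVCaptureVideoDataOutput()
    // Late frames are discarded rather than queued up behind a slow delegate callback.
    output.alwaysDiscardsLateVideoFrames = true
    // Sample buffers are delivered on a dedicated serial queue, off the main thread.
    output.setSampleBufferDelegate(self, queue: dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL))
    session.addOutput(output)

    session.startRunning()
}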

The problem is that blending takes too long and causes frames to drop. I tried copying the buffer:

var copiedBuffer:CMSampleBuffer?
CMSampleBufferCreateCopy(nil, sampleBuffer, &copiedBuffer)
blendImages(copiedBuffer!)

But that didn't help; frames still drop.

Complete Code:

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {

    if(CameraService.longExposureRunning){
        var copiedBuffer:CMSampleBuffer?
        CMSampleBufferCreateCopy(nil, sampleBuffer, &copiedBuffer)
        blendImages(copiedBuffer!)
    }
}

func captureOutput(captureOutput: AVCaptureOutput!, didDropSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    print("Dropped")

}


func blendImages(buffer:CMSampleBuffer){

    let priority = DISPATCH_QUEUE_PRIORITY_DEFAULT
    dispatch_async(dispatch_get_global_queue(priority, 0)){
        let pixelBuffer = CMSampleBufferGetImageBuffer(buffer)

        let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)

        if let backgroundImage = self.lastImage{
            let blendEffect = CIFilter(name: "CILightenBlendMode")
            blendEffect?.setValue(backgroundImage, forKey: kCIInputBackgroundImageKey)
            blendEffect?.setValue(cameraImage, forKey: kCIInputImageKey)
            self.lastImage = blendEffect?.outputImage
            print("Blending")
        }else{
            self.lastImage = cameraImage
        }

        let filteredImage = UIImage(CIImage: self.lastImage!)
        dispatch_async(dispatch_get_main_queue())
        {
            self.imageView.image = filteredImage
        }
    }
}
ferdyyy

2 Answers

1

I suspect that CoreImage is concatenating all your frames into one huge kernel. You may find a CIImageAccumulator helps, but I can get your code working by forcing Core Image to render the chain and start over with each frame.
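
If you want to experiment with CIImageAccumulator, a minimal sketch might look something like this (the extent is a placeholder and would need to match your buffer size; this is not the code I tested below):

import UIKit

// Sketch only: create the accumulator once, sized to the incoming frames.
let accumulator = CIImageAccumulator(extent: CGRect(x: 0, y: 0, width: 1920, height: 1080), format: kCIFormatARGB8)

func accumulate(cameraImage: CIImage) {
    let blend = CIFilter(name: "CILightenBlendMode")!
    blend.setValue(accumulator.image(), forKey: kCIInputBackgroundImageKey)
    blend.setValue(cameraImage, forKey: kCIInputImageKey)
    // setImage() renders and flattens the chain on every frame,
    // so it never grows beyond a single blend.
    accumulator.setImage(blend.outputImage!)
    // accumulator.image() now holds the running lighten blend of all frames so far.
}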

I've changed the type of your lastImage variable to an optional UIImage and added a constant named context, which is a CIContext. With those in place, this works beautifully:

Use let context: CIContext = CIContext(options: [kCIContextUseSoftwareRenderer: false]) for GPU rendering instead of CPU rendering.

func blendImages(buffer: CMSampleBuffer) {

    let priority = DISPATCH_QUEUE_PRIORITY_DEFAULT
    dispatch_async(dispatch_get_global_queue(priority, 0)) {
        let pixelBuffer = CMSampleBufferGetImageBuffer(buffer)

        let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)

        if let backgroundImage = self.lastImage {
            let blendEffect = CIFilter(name: "CILightenBlendMode")!

            blendEffect.setValue(
                CIImage(image: backgroundImage),
                forKey: kCIInputBackgroundImageKey)

            blendEffect.setValue(
                cameraImage,
                forKey: kCIInputImageKey)

            let imageRef = self.context.createCGImage(
                blendEffect.outputImage!,
                fromRect: blendEffect.outputImage!.extent)

            self.lastImage = UIImage(CGImage: imageRef)
            print("Blending")
        } else {
            let imageRef = self.context.createCGImage(
                cameraImage,
                fromRect: cameraImage.extent)

            self.lastImage = UIImage(CGImage: imageRef)
        }

        let filteredImage = self.lastImage
        dispatch_async(dispatch_get_main_queue()) {
            self.imageView.image = filteredImage
        }
    }
}

Funky effect!

Simon

Flex Monkey
  • Thanks I will check on that tomorrow :) – ferdyyy Apr 25 '16 at 08:22
  • So I implemented your method. It is still dropping a lot of frames in my app, but it is dropping fewer frames now; I am getting about every 20th frame to process. How did you create the CIContext? – ferdyyy Apr 26 '16 at 06:59
  • On my iPad Pro, I had no drops at all. Maybe worth considering using a GPU-based target and displaying with a GLKit `GLKView`. In my tests I used a CPU context: `let context = CIContext()`. – Flex Monkey Apr 27 '16 at 04:48
  • Ah okay, I was always running the app on an iPhone 6s and had drops. But I changed it to GPU rendering now; `let context: CIContext = CIContext(options: [kCIContextUseSoftwareRenderer: false])` is the key. – ferdyyy Apr 27 '16 at 07:24
  • Although when trying to process 4K images it will drop frames again on my iPhone 6s. – ferdyyy Apr 27 '16 at 08:14
  • Hi, if you look at my Core Image Helpers repo (https://github.com/FlexMonkey/CoreImageHelpers) I have an alternative to `UIImageView` for displaying `CIImage` using GLKit - all GPU based. This will offer better performance than rendering to a `CGImage` and *may* help you out. (*may* being the operative word :) ) – Flex Monkey Apr 27 '16 at 08:35
  • Hi, I'm 99% certain that the default is `[kCIContextUseSoftwareRenderer:false]` without having to set it explicitly. – Flex Monkey May 03 '16 at 04:56
  • I am sure it is, too. But it made a huge difference when I set it. – ferdyyy May 03 '16 at 06:25
  • I can't argue with the evidence! Glad you got it working - it's a really interesting effect. – Flex Monkey May 03 '16 at 12:59
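
For illustration, a rough sketch of the GLKit-based display approach mentioned in the comments above. This is not code from the CoreImageHelpers repo; the class name and details are assumptions:

import GLKit
import OpenGLES

// A GLKView that draws a CIImage entirely on the GPU, avoiding the
// CGImage/UIImage round trip on every frame.
class CIImageDisplayView: GLKView {
    lazy var ciContext: CIContext = CIContext(EAGLContext: self.context)

    // Set this on the main thread.
    var image: CIImage? {
        didSet { setNeedsDisplay() }
    }

    override init(frame: CGRect) {
        super.init(frame: frame, context: EAGLContext(API: .OpenGLES2))
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func drawRect(rect: CGRect) {
        glClearColor(0, 0, 0, 1)
        glClear(GLbitfield(GL_COLOR_BUFFER_BIT))

        guard let image = image else { return }

        // drawableWidth/Height are in pixels, while bounds are in points.
        ciContext.drawImage(image,
                            inRect: CGRect(x: 0, y: 0, width: drawableWidth, height: drawableHeight),
                            fromRect: image.extent)
    }
}
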
0

The most obvious thing I can think of is to check how you set up your outputs.

Make sure that you have expectsMediaDataInRealTime set to true on your AVAssetWriterInput.

https://developer.apple.com/library/tvos/documentation/AVFoundation/Reference/AVAssetWriterInput_Class/index.html#//apple_ref/occ/instp/AVAssetWriterInput/expectsMediaDataInRealTime
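
For example, the flag is set where the writer input is created (the output settings below are just illustrative):

import AVFoundation

let videoSettings: [String: AnyObject] = [
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: 1920,
    AVVideoHeightKey: 1080
]

let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
// Tells the input to tune for live capture instead of offline processing,
// so sample buffers aren't batched up.
writerInput.expectsMediaDataInRealTime = true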

Tim Bull
  • I am actually not using an AVAssetWriterInput. I'm not quite sure if it would help me in this case, but I may be wrong. How would you use it for blending the images later on? – ferdyyy Apr 26 '16 at 07:07
  • AVAssetWriterInput will also vend CMSampleBuffers, but when it comes to processing them, it can be really "chunky" if expectsMediaDataInRealTime isn't set to true, as in that case it optimises for processing off disk instead of in real time. It's just a thought - I have seen output videos hitching a lot where I'm using a real-time input but forgot to set that option. – Tim Bull Apr 26 '16 at 15:49