
I'm trying to follow some code provided by Apple to retrieve an image mask from portrait mode photos using some new classes and objects introduced in iOS 12. The code is here:

https://developer.apple.com/documentation/avfoundation/avportraiteffectsmatte/extracting_portrait_effects_matte_image_data_from_a_photo

func portraitEffectsMatteImageAt(_ path: String) -> UIImage? {
    let bundlePath = Bundle.main.bundlePath

    // Check that the image at given path contains auxiliary PEM data:
    guard let fileURL = NSURL(fileURLWithPath: bundlePath).appendingPathComponent(path),
        let source = CGImageSourceCreateWithURL(fileURL as CFURL, nil),
        let auxiliaryInfoDict = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypePortraitEffectsMatte) as? [AnyHashable: Any],
        let matteData = try? AVPortraitEffectsMatte(fromDictionaryRepresentation: auxiliaryInfoDict),
        let matteCIImage = CIImage(portaitEffectsMatte: matteData)
    else {
        return nil
    }
    return UIImage(ciImage: matteCIImage)
}

My only change is to point fileURL at a JPG in my bundle instead:

guard let fileURL = Bundle.main.url(forResource: "custom00", withExtension: "jpg")

However, stepping through the code shows that auxiliaryInfoDict is assigned nil. I imported these JPGs from a previous project that used older techniques to create depth masks (https://www.raywenderlich.com/314-image-depth-maps-tutorial-for-ios-getting-started), so the JPG files should be fine.
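
For what it's worth, a quick way to see which auxiliary data types a file actually carries is to probe each one directly. Here is a minimal sketch assuming the same custom00.jpg in the main bundle (the helper name is just for illustration):

import Foundation
import ImageIO

// Probe a bundled image for the auxiliary data types used by the depth/matte APIs.
func auxiliaryDataTypes(forResource name: String, withExtension ext: String) -> [String] {
    guard let url = Bundle.main.url(forResource: name, withExtension: ext),
          let source = CGImageSourceCreateWithURL(url as CFURL, nil) else {
        return []
    }
    let candidates: [CFString] = [
        kCGImageAuxiliaryDataTypePortraitEffectsMatte, // iOS 12+
        kCGImageAuxiliaryDataTypeDisparity,            // iOS 11+
        kCGImageAuxiliaryDataTypeDepth                 // iOS 11+
    ]
    // Keep only the types for which the image actually contains auxiliary data.
    return candidates.compactMap { type in
        CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, type) != nil ? (type as String) : nil
    }
}

// Lists which depth-related auxiliary types custom00.jpg contains.
print(auxiliaryDataTypes(forResource: "custom00", withExtension: "jpg"))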

Does anyone have a working sample project? Thanks

user339946

2 Answers


You can load the portrait effects matte saved in a photo only if that photo has a portrait effects matte saved in it. That sounds like a tautology, so let me expand it:

  • If you captured the image yourself using AVCapturePhotoOutput, you get a portrait effects matte if and only if:

    • It's available/supported on the current capture device and configuration. Portrait effects require depth capture, so you need to have selected the back dual camera or the front TrueDepth camera (on a device so equipped) and enabled depth delivery.

    • You request it. Set isPortraitEffectsMatteDeliveryEnabled in your photo settings before capture (there's a rough configuration sketch after this list).

    • The device can generate one. Portrait effects mattes come from a machine learning model trained to recognize human features. If there's no identifiable person in your photo, you don't get a matte. (Sorry, pet lovers.)

    • You don't opt out of saving it. You can turn off embedsPortraitEffectsMatteInPhoto, or use AVCapturePhotoFileDataRepresentationCustomizer to replace/remove a photo's matte (or other elements) after capture and before saving. Obviously, if you want the matte, don't get rid of it.

  • The same goes for images saved by any third-party app that uses the camera capture APIs. (That is, you can read mattes from images saved by other apps if they followed the above steps, just the same as you would if you were trying to capture an image with a matte.) See Configuring Camera Capture to Collect a Portrait Effects Matte.

  • If you captured a photo using Apple's built-in Camera app, it needs to be a Portrait Mode photo (back dual camera or front TrueDepth camera) captured on iOS 12.
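
To make the capture-side requirements above concrete, here is a rough configuration sketch (iOS 12+). Device selection and error handling are pared down to early returns, the function names are just illustrative, and the AVCapturePhotoCaptureDelegate, permission handling, and preview layer are assumed to exist elsewhere:

import AVFoundation

// Rough sketch: configure a session and photo output for portrait effects matte delivery.
func configureMatteCapture() -> (session: AVCaptureSession, output: AVCapturePhotoOutput)? {
    let session = AVCaptureSession()
    session.beginConfiguration()
    session.sessionPreset = .photo

    // A depth-capable camera is required: front TrueDepth or back dual camera.
    guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front)
            ?? AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back),
          let input = try? AVCaptureDeviceInput(device: device),
          session.canAddInput(input) else { return nil }
    session.addInput(input)

    let output = AVCapturePhotoOutput()
    guard session.canAddOutput(output) else { return nil }
    session.addOutput(output)

    // Depth delivery must be enabled before matte delivery can be enabled.
    output.isDepthDataDeliveryEnabled = output.isDepthDataDeliverySupported
    output.isPortraitEffectsMatteDeliveryEnabled = output.isPortraitEffectsMatteDeliverySupported

    session.commitConfiguration()
    return (session, output)
}

// Per-capture settings: request the matte and keep it embedded in the saved photo.
func matteCaptureSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = output.isDepthDataDeliveryEnabled
    settings.isPortraitEffectsMatteDeliveryEnabled = output.isPortraitEffectsMatteDeliveryEnabled
    settings.embedsPortraitEffectsMatteInPhoto = true // don't opt out of saving it
    return settings
}

Pass matteCaptureSettings(for: output) to capturePhoto(with:delegate:), and read the matte from AVCapturePhoto's portraitEffectsMatte property in photoOutput(_:didFinishProcessingPhoto:error:).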

rickster

It looks like this API works for front camera photos; however, it does not yet work for rear camera photos.

user339946