
I'm trying to do front-camera face tracking with Metal / SceneKit as the renderer (yes, reinventing Snapchat).

I'm using Firebase's ML Kit for the face tracking.

The issue I have is that setting the AVCaptureConnection's videoOrientation causes no faces to be detected:

public func captureOutput(_ captureOutput: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    do {
        var textures: [MTLTexture]!
        delegate?.metalCameraSession(self, didSendRawSampleBufferForFaceDetection: sampleBuffer)
        connection.videoOrientation = .portrait // setting portrait here causes no faces to be detected
        connection.isVideoMirrored = true
        guard let imageBuffer = sampleBuffer.imageBuffer else {
            throw MetalCameraSessionError.failedToGetImageBuffer
        }
        switch pixelFormat {
        case .rgb:
            let textureRGB = try texture(sampleBuffer: imageBuffer, textureCache: textureCache)
            textures = [textureRGB]
        case .yCbCr:
            let textureY = try texture(sampleBuffer: imageBuffer, textureCache: textureCache, planeIndex: 0, pixelFormat: .r8Unorm)
            let textureCbCr = try texture(sampleBuffer: imageBuffer, textureCache: textureCache, planeIndex: 1, pixelFormat: .rg8Unorm)
            textures = [textureY, textureCbCr]
        }

        let timestamp = try self.timestamp(sampleBuffer: sampleBuffer)

        delegate?.metalCameraSession(self, didReceiveFrameAsTextures: textures, withTimestamp: timestamp, andImageBuffer: sampleBuffer)

    }
    catch let error as MetalCameraSessionError {
        self.handleError(error)
    }
    catch {
        /**
         * We only throw `MetalCameraSessionError` errors.
         */
    }
}
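
The delegate hands that raw sample buffer to the ML Kit face detector. A simplified sketch of that hand-off (the class and method names are placeholders, the detector options are trimmed, and the orientation value shown is just one of the cases I have tried):

import CoreMedia
import FirebaseMLVision

final class FaceDetectionHandler {
    private lazy var faceDetector = Vision.vision().faceDetector(options: VisionFaceDetectorOptions())

    // Called from metalCameraSession(_:didSendRawSampleBufferForFaceDetection:).
    func detectFaces(in sampleBuffer: CMSampleBuffer) {
        // Tell ML Kit how the buffer is oriented rather than rotating the buffer itself.
        let metadata = VisionImageMetadata()
        metadata.orientation = .leftBottom // one of several VisionDetectorImageOrientation cases tried

        let image = VisionImage(buffer: sampleBuffer)
        image.metadata = metadata

        // `process(_:completion:)`; older ML Kit releases expose this as `detect(in:completion:)`.
        faceDetector.process(image) { faces, error in
            guard error == nil, let faces = faces, !faces.isEmpty else {
                return // nothing comes back once videoOrientation is forced to .portrait
            }
            // Forward the detected faces to the SceneKit / Metal side here.
        }
    }
}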

If I do not set the connection's videoOrientation to portrait, detection works fine, but the resulting MTLTexture is rotated and stretched.
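
For comparison, applying the same orientation once at session-configuration time rather than per frame would look roughly like this (captureSession and videoOutput are placeholder names; sketched only to show where the call could alternatively live):

import AVFoundation

let captureSession = AVCaptureSession()
let videoOutput = AVCaptureVideoDataOutput()

captureSession.beginConfiguration()
if captureSession.canAddOutput(videoOutput) {
    captureSession.addOutput(videoOutput)
}
// Camera input setup omitted for brevity; the connection only exists once
// both an input and this output are attached to the session.
if let connection = videoOutput.connection(with: .video) {
    // Same settings as in captureOutput above, applied once up front.
    if connection.isVideoOrientationSupported {
        connection.videoOrientation = .portrait
    }
    if connection.isVideoMirroringSupported {
        connection.isVideoMirrored = true
    }
}
captureSession.commitConfiguration()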

Textures are created as follows:

private func texture(sampleBuffer: CVImageBuffer?, textureCache: CVMetalTextureCache?, planeIndex: Int = 0, pixelFormat: MTLPixelFormat = .bgra8Unorm) throws -> MTLTexture {
    guard let sampleBuffer = sampleBuffer else {
        throw MetalCameraSessionError.missingSampleBuffer
    }
    guard let textureCache = textureCache else {
        throw MetalCameraSessionError.failedToCreateTextureCache
    }


    let isPlanar = CVPixelBufferIsPlanar(sampleBuffer)
    let width = isPlanar ? CVPixelBufferGetWidthOfPlane(sampleBuffer, planeIndex) : CVPixelBufferGetWidth(sampleBuffer)
    let height = isPlanar ? CVPixelBufferGetHeightOfPlane(sampleBuffer, planeIndex) : CVPixelBufferGetHeight(sampleBuffer)

    var imageTexture: CVMetalTexture?

    let result = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, sampleBuffer, nil, pixelFormat, width, height, planeIndex, &imageTexture)

    guard
        let unwrappedImageTexture = imageTexture,
        let texture = CVMetalTextureGetTexture(unwrappedImageTexture),
        result == kCVReturnSuccess
        else {
            throw MetalCameraSessionError.failedToCreateTextureFromImage
    }

    return texture
}

This remains the case (no face detection) even if I send the sample buffer to the face detector before changing the videoOrientation and creating the Metal texture.
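
One direction I can see (not yet verified) is to leave the connection's videoOrientation alone, so ML Kit keeps detecting faces, and instead compensate on the Metal side by remapping the texture coordinates of the full-screen quad that draws the camera texture. A minimal sketch, assuming the quad is rendered as a triangle strip of interleaved x, y, u, v vertices (the exact corner-to-UV mapping may need flipping to account for the front camera's mirroring):

import Metal

func makeRotatedQuadVertexBuffer(device: MTLDevice) -> MTLBuffer? {
    // Full-screen triangle strip: x, y, u, v per vertex.
    // The texture coordinates are transposed so the landscape-oriented camera
    // texture is drawn upright in a portrait view without touching the
    // AVCaptureConnection at all.
    let vertices: [Float] = [
        -1.0, -1.0,   1.0, 1.0,   // bottom-left of screen  -> bottom-right of texture
         1.0, -1.0,   1.0, 0.0,   // bottom-right of screen -> top-right of texture
        -1.0,  1.0,   0.0, 1.0,   // top-left of screen     -> bottom-left of texture
         1.0,  1.0,   0.0, 0.0    // top-right of screen    -> top-left of texture
    ]
    return device.makeBuffer(bytes: vertices,
                             length: vertices.count * MemoryLayout<Float>.stride,
                             options: [])
}

Even so, I would still like to understand why changing the videoOrientation stops ML Kit from detecting faces in the first place.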

  • I'm not familiar with MTLTexture, but have you tried ML Kit's Quick Start app which shows how to do face detection in a video stream using the front camera: https://github.com/firebase/quickstart-ios/tree/master/mlvision. In particular, this class: https://github.com/firebase/quickstart-ios/blob/master/mlvision/MLVisionExample/CameraViewController.swift – Dong Chen Jun 19 '18 at 17:38
  • Hi Doug, yes I have, thank you. That works as long as the AVCaptureConnection is not set to portrait; once that change is made, ML Kit no longer detects faces regardless of the orientation enum I set in the face detector. – Liam Walsh Jun 20 '18 at 10:00
