For an object recognition app with CoreML and Vision I downloaded a sample app from Apple (https://developer.apple.com/documentation/vision/recognizing_objects_in_live_capture). Unfortunately, the sample only seems to work in portrait mode, but I need to use it in landscape.
In the standard configuration my objects are detected, but the camera preview layer is rotated by 90°. After I rotate the preview layer, the objects are still recognized, though noticeably less accurately than before. This bugs me and is not acceptable for me or my client.
I tried to rotate the preview layer like this:
previewLayer.connection?.videoOrientation = .landscapeRight
And tried to rotate the video output like this:
videoDataOutput.connection(with: .video)?.videoOrientation = .landscapeLeft
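For context, here is roughly where both calls sit in my setup code. This is a condensed sketch, not the full sample: the structure loosely follows the sample's setupAVCapture, and the exact placement of the two orientation changes is illustrative.

import AVFoundation

final class CaptureSetupSketch: NSObject {
    let session = AVCaptureSession()
    let videoDataOutput = AVCaptureVideoDataOutput()
    var previewLayer: AVCaptureVideoPreviewLayer!

    // Condensed sketch of the sample's capture setup plus my two changes.
    func setupAVCapture() {
        session.beginConfiguration()
        if let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
           let input = try? AVCaptureDeviceInput(device: device),
           session.canAddInput(input) {
            session.addInput(input)
        }
        if session.canAddOutput(videoDataOutput) {
            session.addOutput(videoDataOutput)
        }
        session.commitConfiguration()

        previewLayer = AVCaptureVideoPreviewLayer(session: session)

        // Change 1: rotate the on-screen preview into landscape.
        previewLayer.connection?.videoOrientation = .landscapeRight

        // Change 2: rotate the buffers handed to the video output as well.
        videoDataOutput.connection(with: .video)?.videoOrientation = .landscapeLeft
    }
}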
After this rotation, the bounding boxes are no longer positioned correctly, and moving the camera makes them move even more erratically. It seems to be related to the following function:
public func exifOrientationFromDeviceOrientation() -> CGImagePropertyOrientation {
    let curDeviceOrientation = UIDevice.current.orientation
    let exifOrientation: CGImagePropertyOrientation
    switch curDeviceOrientation {
    case UIDeviceOrientation.portraitUpsideDown:  // Device oriented vertically, home button on the top
        exifOrientation = .left
    case UIDeviceOrientation.landscapeLeft:       // Device oriented horizontally, home button on the right
        exifOrientation = .upMirrored
    case UIDeviceOrientation.landscapeRight:      // Device oriented horizontally, home button on the left
        exifOrientation = .down
    case UIDeviceOrientation.portrait:            // Device oriented vertically, home button on the bottom
        exifOrientation = .up
    default:
        exifOrientation = .up
    }
    return exifOrientation
}
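As far as I can tell, the return value feeds into Vision in captureOutput. Paraphrased from the sample, so minor details may differ (requests holds the VNCoreMLRequest objects created during model setup):

import AVFoundation
import Vision

final class DetectionSketch: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    // The Vision requests created during model setup.
    var requests = [VNRequest]()

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // The exif orientation tells Vision how to interpret the pixel buffer
        // before the CoreML model sees it.
        let exifOrientation = exifOrientationFromDeviceOrientation()
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                            orientation: exifOrientation,
                                            options: [:])
        do {
            try handler.perform(requests)
        } catch {
            print(error)
        }
    }
}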
If I change
case UIDeviceOrientation.landscapeLeft:
    exifOrientation = .upMirrored
to
case UIDeviceOrientation.landscapeLeft:
    exifOrientation = .left
then the bounding boxes are positioned correctly and they follow camera movement as expected. But object recognition accuracy drops significantly.
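For completeness: the boxes are placed with the sample's normalized-rect conversion, so my suspicion is that a mismatch between the rotated buffer and the reported exif orientation is what throws them off. Below is a minimal sketch of that conversion, modeled on the sample's drawVisionRequestResults; whether bufferSize still holds portrait dimensions after the rotation is exactly what I am unsure about.

import Vision
import CoreGraphics

// Sketch of the box conversion, modeled on the sample's drawing code.
func boundingBoxInPixels(for observation: VNRecognizedObjectObservation,
                         bufferSize: CGSize) -> CGRect {
    // Vision reports normalized coordinates (0...1, origin at bottom-left);
    // this maps them into the pixel space of the capture buffer.
    return VNImageRectForNormalizedRect(observation.boundingBox,
                                        Int(bufferSize.width),
                                        Int(bufferSize.height))
}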