I am having trouble firing the flash in an iOS app that uses the camera to take pictures. Here is the function I call when I want the flash to fire:
func litFlash() {
    // Bail out if the device has no flash, or flash mode .On is unsupported.
    if !captureDevice.hasFlash || !captureDevice.isFlashModeSupported(.On) { return }
    do {
        try captureDevice.lockForConfiguration()
    } catch let error as NSError {
        print("captureDevice.lockForConfiguration FAILED")
        print(error.code)
        return // don't touch the configuration if we failed to lock it
    }
    captureDevice.torchMode = .On
    //captureDevice.flashMode = .On // Strangely enough, this produces a crash.
    captureDevice.unlockForConfiguration()
    //captureSession.commitConfiguration()
}
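For what it's worth, my understanding is that torchMode drives the continuous torch light, while flashMode is the setting that AVCaptureStillImageOutput synchronises with the still capture, which would explain the light coming on early. Below is a minimal sketch of the flashMode variant (the helper name setFlashMode is mine; as noted in the comment above, this path currently crashes for me):

func setFlashMode(mode: AVCaptureFlashMode) {
    // Hypothetical helper, same guard as litFlash() but against the requested mode.
    guard captureDevice.hasFlash && captureDevice.isFlashModeSupported(mode) else { return }
    do {
        try captureDevice.lockForConfiguration()
        captureDevice.flashMode = mode // flash is fired in sync with the still capture
        captureDevice.unlockForConfiguration()
    } catch let error as NSError {
        print("lockForConfiguration failed: \(error.code)")
    }
}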
This is where I call the function:
if fireFlashFlag { litFlash() }
if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
    stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
        (imageDataSampleBuffer, error) -> Void in
        ………
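For context, a typical completion handler for this API looks something like the sketch below; this is illustrative, not my elided code:

stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
    (imageDataSampleBuffer, error) -> Void in
    if let buffer = imageDataSampleBuffer where error == nil {
        // Convert the still-image sample buffer to JPEG data, then to a UIImage.
        let jpegData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
        let image = UIImage(data: jpegData)
        // ... hand `image` to the UI or save it ...
    }
}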
It works about 20% of the time; the rest of the time I do get a flash, but it is not synchronised with the picture: the flash goes off a fraction of a second too early.
Hoping someone can point me in the right direction. Thanks in advance for any relevant information.