
I am following this tutorial and tried to convert the code from Swift 2.0 to 3.0. But when I launch the application, the app doesn't work! I mean, nothing happens! Here is my code:

ViewController:

class ViewController: UIViewController, BarcodeDelegate {

    override func prepare(for segue: UIStoryboardSegue, sender: Any?) {

        let barcodeViewController: BarcodeViewController = segue.destination as! BarcodeViewController
        barcodeViewController.delegate = self

    }



    func barcodeReaded(barcode: String) {
        codeTextView.text = barcode
        print(barcode)
    }

}

BarcodeVC:

import AVFoundation


protocol BarcodeDelegate {

    func barcodeReaded(barcode: String)
}

class BarcodeViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {

    var delegate: BarcodeDelegate?
    var captureSession: AVCaptureSession!
    var code: String?


    override func viewDidLoad() {
        super.viewDidLoad()

        // Do any additional setup after loading the view.
        print("works")

        self.captureSession = AVCaptureSession();
        let videoCaptureDevice: AVCaptureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

        do {

            let videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice)

            if self.captureSession.canAddInput(videoInput) {
                self.captureSession.addInput(videoInput)
            } else {
                print("Could not add video input")
            }

            let metadataOutput = AVCaptureMetadataOutput()
            if self.captureSession.canAddOutput(metadataOutput) {
                self.captureSession.addOutput(metadataOutput)

                metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
                metadataOutput.metadataObjectTypes = [AVMetadataObjectTypeQRCode,AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypePDF417Code]
            } else {
                print("Could not add metadata output")
            }

            let previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
            previewLayer?.frame = self.view.layer.bounds
            self.view.layer.addSublayer(previewLayer!)
            self.captureSession.startRunning()
        } catch let error as NSError {
            print("Error while creating video input device: \(error.localizedDescription)")
        }



    }



    // I THINK THIS METHOD IS NOT CALLED!
    private func captureOutput(captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [AnyObject]!, fromConnection connection: AVCaptureConnection!) {

        // This is the delegate's method that is called when a code is read
        for metadata in metadataObjects {
            let readableObject = metadata as! AVMetadataMachineReadableCodeObject
            let code = readableObject.stringValue

            // If the code is not empty, the code is ready and we call our delegate to pass the code.
            if  code!.isEmpty {
                print("is empty")

            }else {

                self.captureSession.stopRunning()
                self.dismiss(animated: true, completion: nil)
                self.delegate?.barcodeReaded(barcode: code!)


            }
        }

    }
}

Here is the output:

2016-09-17 18:10:26.000919 BarcodeScaning[2610:674253] [MC] System group container for systemgroup.com.apple.configurationprofiles path is /private/var/containers/Shared/SystemGroup/systemgroup.com.apple.configurationprofiles
2016-09-17 18:10:26.007782 BarcodeScaning[2610:674253] [MC] Reading from public effective user settings.

Tamás Sengel
iOS.Lover

5 Answers


The first step is to declare access to any user-private data types; this is a new requirement in iOS 10. You do it by adding a usage key to your app's Info.plist together with a purpose string.

If you are using one of the following frameworks and fail to declare the usage, your app will crash when it first makes the access:

Contacts, Calendar, Reminders, Photos, Bluetooth Sharing, Microphone, Camera, Location, Health, HomeKit, Media Library, Motion, CallKit, Speech Recognition, SiriKit, TV Provider.

To avoid the crash, you need to add the suggested key to your Info.plist:

(screenshot: the camera usage key added in the Info.plist editor)
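For reference, the same key can be added directly in the Info.plist source. For camera access (as in this question) the key is `NSCameraUsageDescription`; the description string below is just an example:

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to scan barcodes.</string>
```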

And then the system shows the purpose string when asking the user to allow access:

(screenshot: the system permission alert showing the purpose string)

For more information about it, you can read this article:

I have made a few modifications to your BarcodeViewController to make it work properly, as you can see below:

BarcodeViewController

import UIKit
import AVFoundation

protocol BarcodeDelegate {
   func barcodeReaded(barcode: String)
}

class BarcodeViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {

   var delegate: BarcodeDelegate?

   var videoCaptureDevice: AVCaptureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
   var previewLayer: AVCaptureVideoPreviewLayer?

   var captureSession = AVCaptureSession()
   var code: String?

   override func viewDidLoad() {
      super.viewDidLoad()

      self.view.backgroundColor = UIColor.clear
      self.setupCamera()
   }

   private func setupCamera() {

      guard let input = try? AVCaptureDeviceInput(device: videoCaptureDevice) else {
          print("Could not create video input")
          return
      }

      if self.captureSession.canAddInput(input) {
          self.captureSession.addInput(input)
      }

      self.previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)

      if let videoPreviewLayer = self.previewLayer {
          videoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
          videoPreviewLayer.frame = self.view.bounds
          view.layer.addSublayer(videoPreviewLayer)
      }

      let metadataOutput = AVCaptureMetadataOutput()
      if self.captureSession.canAddOutput(metadataOutput) {
          self.captureSession.addOutput(metadataOutput)

          metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
          metadataOutput.metadataObjectTypes = [AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]
      } else {
          print("Could not add metadata output")
      }
   }

   override func viewWillAppear(_ animated: Bool) {
       super.viewWillAppear(animated)

       if !captureSession.isRunning {
          captureSession.startRunning()
       }
   }

   override func viewWillDisappear(_ animated: Bool) {
      super.viewWillDisappear(animated)

      if captureSession.isRunning {
         captureSession.stopRunning()
      }
   }

   func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [Any]!, from connection: AVCaptureConnection!) {
       // This is the delegate's method that is called when a code is read
       for metadata in metadataObjects {
           let readableObject = metadata as! AVMetadataMachineReadableCodeObject
           let code = readableObject.stringValue


           self.dismiss(animated: true, completion: nil)
           self.delegate?.barcodeReaded(barcode: code!)
           print(code!)
       }
   }
}

One of the important points was to declare the variables at class scope and to start and stop the captureSession inside the viewWillAppear(_:) and viewWillDisappear(_:) methods. In your previous code, I think the delegate method was never called at all, so it never got inside the method to process the barcode.

I hope this helps you.

Bhaumik
Victor Sigler
  • I will award the bounty in 1H (due to limitations ) – iOS.Lover Sep 21 '16 at 07:32
  • @Mc.Lover No problem, :) – Victor Sigler Sep 21 '16 at 14:45
  • Do you know how to specify the region to scan (for example only a finite rectangle? – Lê Khánh Vinh Nov 24 '16 at 19:13
  • Hi @VictorSigler. May I know where did you find the documentation for the implementation of barcode Scanner? I just want to understand each line of your code, thanks a lot. – Pepeng Hapon Mar 03 '17 at 06:36
  • Perfect, but it was not filling the screen, so I added `videoPreviewLayer.frame = self.view.bounds` again in `viewDidLayoutSubviews()` to fill the view, because I have the view controller set up from a nib; it is working perfectly now. Thank you – MBH Apr 18 '17 at 10:18
  • This works perfectly. But when I get the result and the QR code is still in the camera bound, its scans the same QR Code continuously in background. I need a one time scan. How it is possible? – Akhil Nair Apr 28 '17 at 06:35
  • hiii @VictorSigler, i used ur code it is working fine and shows scanned qrcode but only when user allows permission for using camera but if user not allows permission for using camera then app crashes could u help me with this and update ur code – Dilip Tiwari Nov 06 '17 at 12:09
  • @LêKhánhVinh look on my answer here: https://stackoverflow.com/a/49408133/3840884 There you can find how to set specify region to scan – lukszar Mar 21 '18 at 13:54
  • Adding a late comment...but @Victor Sigler how can we add a frame to the camera for the barcode. As of now, just a plain camera is shown...so the user cannot know where to focus the barcode.. –  May 11 '18 at 14:50
  • @VictorSigler Can you help me? I also want to get barcode image as UIimage with code how can I do this? – Nikunj Kumbhani Feb 23 '19 at 05:28

Here is Victor Sigler's answer updated to Swift 4: without force unwrapping, with a weak protocol, with the expensive code executed on a background thread, and with other refinements.

Notice that the AVCaptureMetadataOutputObjectsDelegate method changed from

captureOutput(_ captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [Any]!, from connection: AVCaptureConnection!)

to

metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection)

import UIKit
import AVFoundation

protocol BarcodeDelegate: class {
    func barcodeRead(barcode: String)
}

class ScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
    weak var delegate: BarcodeDelegate?

    var output = AVCaptureMetadataOutput()
    var previewLayer: AVCaptureVideoPreviewLayer!

    var captureSession = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        setupCamera()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        DispatchQueue.global(qos: .userInitiated).async {
            if !self.captureSession.isRunning {
                self.captureSession.startRunning()
            }
        }
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        DispatchQueue.global(qos: .userInitiated).async {
            if self.captureSession.isRunning {
                self.captureSession.stopRunning()
            }
        }
    }

    fileprivate func setupCamera() {
        guard let device = AVCaptureDevice.default(for: .video),
            let input = try? AVCaptureDeviceInput(device: device) else {
            return
        }

        DispatchQueue.global(qos: .userInitiated).async {
            if self.captureSession.canAddInput(input) {
                self.captureSession.addInput(input)
            }

            let metadataOutput = AVCaptureMetadataOutput()

            if self.captureSession.canAddOutput(metadataOutput) {
                self.captureSession.addOutput(metadataOutput)

                metadataOutput.setMetadataObjectsDelegate(self, queue: .global(qos: .userInitiated))

                if Set([.qr, .ean13]).isSubset(of: metadataOutput.availableMetadataObjectTypes) {
                    metadataOutput.metadataObjectTypes = [.qr, .ean13]
                }
            } else {
                print("Could not add metadata output")
            }

            self.previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
            self.previewLayer.videoGravity = .resizeAspectFill

            DispatchQueue.main.async {
                self.previewLayer.frame = self.view.bounds
                self.view.layer.addSublayer(self.previewLayer)
            }
        }
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
        // This is the delegate's method that is called when a code is read
        for metadata in metadataObjects {
            if let readableObject = metadata as? AVMetadataMachineReadableCodeObject,
                let code = readableObject.stringValue {
                dismiss(animated: true)
                delegate?.barcodeRead(barcode: code)
                print(code)
            }
        }
    }
}
Tamás Sengel

Barcode Scanner in Swift 4 for all code types

Below I would like to share a few ideas about barcode scanning in iOS:

  • separate the barcode scanner logic from the view logic,
  • add an entry to the .plist file,
  • set exposurePointOfInterest and focusPointOfInterest,
  • set rectOfInterest with a properly converted CGRect,
  • set focusMode and exposureMode,
  • lock the captureDevice with lockForConfiguration() while changing camera capture settings.

Add an entry to the .plist file
In your Info.plist file, add the following entry to allow your application to access the iPhone's camera:

<key>NSCameraUsageDescription</key>
<string>Allow access to camera</string>

Set exposurePointOfInterest and focusPointOfInterest
Setting exposurePointOfInterest and focusPointOfInterest gives better scanning quality and lets the camera focus faster on the central point of the screen.
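These two properties can only be changed while the device is locked for configuration (see the last point in the list above). A minimal sketch, assuming `device` is the `AVCaptureDevice` you are configuring:

```swift
import UIKit
import AVFoundation

// Sketch: point focus and exposure at the center of the frame.
// Call this between lockForConfiguration() and unlockForConfiguration().
func centerPointsOfInterest(on device: AVCaptureDevice) {
    // Normalized device coordinates: (0, 0) is top-left, (1, 1) is bottom-right.
    let center = CGPoint(x: 0.5, y: 0.5)
    if device.isFocusPointOfInterestSupported {
        device.focusPointOfInterest = center
    }
    if device.isExposurePointOfInterestSupported {
        device.exposurePointOfInterest = center
    }
}
```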

Set rectOfInterest
This property lets the camera scan just a part of the screen. This way codes can be scanned faster, focused only on codes presented in the center of the screen, which is useful when a few other codes are visible in the background.
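As a sketch of the conversion (assuming a `previewLayer` and `metadataOutput` set up as in the answers above; the rectangle is just an example region in view coordinates):

```swift
import UIKit
import AVFoundation

// Restrict scanning to a rectangle in the middle of the view.
// The rect must be converted from layer (view) coordinates into the
// metadata output's normalized coordinate space before assigning it.
func restrictScanning(to viewRect: CGRect,
                      previewLayer: AVCaptureVideoPreviewLayer,
                      metadataOutput: AVCaptureMetadataOutput) {
    metadataOutput.rectOfInterest =
        previewLayer.metadataOutputRectConverted(fromLayerRect: viewRect)
}
```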

Set focusMode and exposureMode
These properties should be set as follows:

device.focusMode = .continuousAutoFocus
device.exposureMode = .continuousAutoExposure

This allows the camera to continuously focus and keep the exposure well adjusted to the code being scanned.
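Putting this together with the lockForConfiguration() point, a minimal sketch could look like:

```swift
import AVFoundation

// Changing capture settings requires an exclusive lock on the device;
// without lockForConfiguration() the assignments below raise an exception.
func configureContinuousFocus(on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        if device.isFocusModeSupported(.continuousAutoFocus) {
            device.focusMode = .continuousAutoFocus
        }
        if device.isExposureModeSupported(.continuousAutoExposure) {
            device.exposureMode = .continuousAutoExposure
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```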

Demo

Here you can find a ready project implementing this idea: https://github.com/lukszar/QuickScanner

lukszar

You need to add NSCameraUsageDescription to your Info.plist file in order to get it to work!

Simply add a row in Info.plist, enter NSCameraUsageDescription as the key, and add a string that informs the user why your app needs camera access.
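In the Info.plist source, the entry looks like this (the description string is just an example):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to scan barcodes.</string>
```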

This should do the trick!

JTing
  • I added ! Still not working , camera opens but no qr or bar codes detecting – iOS.Lover Sep 19 '16 at 13:26
  • Yep, got the same problem now! First I just converted the full project provided to Swift 3 syntax and added NSCameraUsageDescription - Worked like a charm. Now after following the tutorial I'm stuck at the same problem... Not sure yet what the deal is, but I'll keep looking. – JTing Sep 19 '16 at 15:25
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [Any]!, from connection: AVCaptureConnection!) {
  print("caught QR code")
  for metadata in metadataObjects {
     let readableObject = metadata as! AVMetadataMachineReadableCodeObject
     let code = readableObject.stringValue
     if code!.isEmpty {
        print("is empty")
     } else {
        self.captureSession.stopRunning()
        self.dismiss(animated: true, completion: nil)
        self.delegate?.gotQRCode(code: code!)
     }
  }
}

The method's signature changed a bit in Swift 3; the version above uses the correct form.

Meet Doshi
Mikhail Baynov