
I'm trying to implement a QR code scanner with the new iOS 7 features, but my code never calls the main AVCaptureMetadataOutputObjectsDelegate method.

I've used the AVFoundation camera before, and with my current implementation the preview layer runs without a problem. Even switching my output back to AVCaptureVideoDataOutput confirms that my session setup is valid.

I'm using this NSHipster post as a guideline and here's my code so far:

Interface:

@import AVFoundation;

@interface QRCodeViewController () <AVCaptureMetadataOutputObjectsDelegate>

@property (strong, nonatomic) AVCaptureDevice* device;
@property (strong, nonatomic) AVCaptureDeviceInput* input;
@property (strong, nonatomic) AVCaptureMetadataOutput* output;
@property (strong, nonatomic) AVCaptureSession* session;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer* preview;

@end

Setup:

- (void)setupCamera
{
    // Device
    self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Input
    self.input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];

    // Output
    self.output = [[AVCaptureMetadataOutput alloc] init];
    [self.output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];

    // Session
    self.session = [[AVCaptureSession alloc] init];
    [self.session addInput:self.input];
    [self.session addOutput:self.output];

    // Preview
    self.preview = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    self.preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    self.preview.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
    [self.view.layer insertSublayer:self.preview atIndex:0];

    // Start
    [self.session startRunning];
}

Delegate Method:

// DELEGATE METHOD NOT CALLED
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"Metadata");
}

Any help is greatly appreciated!

Ricardo RendonCepeda

2 Answers


I am trying to figure this out myself. The documentation seems to say that if you don't set the type(s) you want in metadataObjectTypes, you won't get any calls. But my iPad Mini's back camera returns an empty array for availableMetadataObjectTypes. Let us know what you figure out.

Edit:

I just figured out that if you add the AVCaptureMetadataOutput object to the session first, availableMetadataObjectTypes gets filled in and you can then add the barcode detector type to it. After that, the captureOutput: delegate method gets called. Like this:

AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
[output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
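// The output must be added to the session *before* setting metadataObjectTypes;
// until then availableMetadataObjectTypes is empty.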
[session addOutput:output];
output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode];
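
For reference, here's a minimal sketch (not part of the original answer) of what the delegate method could do once the QR code type is registered. It assumes the session is kept in a property, as in the question's code; the stringValue property of AVMetadataMachineReadableCodeObject holds the decoded payload:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    for (AVMetadataObject *metadata in metadataObjects) {
        if ([metadata.type isEqualToString:AVMetadataObjectTypeQRCode]) {
            AVMetadataMachineReadableCodeObject *code = (AVMetadataMachineReadableCodeObject *)metadata;
            NSLog(@"QR code payload: %@", code.stringValue);

            // Stop the session so the delegate isn't fired repeatedly for the same code
            // (assumes the session is held in self.session, as in the question).
            [self.session stopRunning];
            break;
        }
    }
}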
Ned Zepplin
  • Bravo sir! Extra kudos for being such a quick self-learner. I anticipate many developers will find your answer very useful once they get more involved with the new iOS 7 features. – Ricardo RendonCepeda Sep 25 '13 at 04:03
  • It made sense once I thought about it - the metadata obviously can't be known until the metadata object gets associated with the device. – Ned Zepplin Sep 25 '13 at 15:53
  • 4
    I put my little sample project on github if anyone is interested: https://github.com/kpmiller/ios7-barcode – Ned Zepplin Sep 25 '13 at 18:51
  • @NedZepplin Have you successfully recognized any barcodes with it? I tried to recognize QR and PDF417 on my iPad 3 with no luck (I used appropriate metadata). – Shmidt Sep 28 '13 at 19:30
  • @NedZepplin: How can I just capture one still image of the QRCode and do my processing? – Pria Sep 30 '13 at 11:04
  • I have not recognized QR codes. My application does UPC and UPC-E, and I can confirm that this does work. As far as "one still image" my application does video, and when I get the first recognized barcode from it, I close down my capture session and pop the view controller. I do video because it seems like autofocusing on the barcode area takes a second or two. If you just took one still I don't know if it would be focused. – Ned Zepplin Oct 02 '13 at 22:59
  • I just want to add that you need to make sure you add the types after adding the output to the captureSession. It's also advisable to check whether the device supports the type you want by checking the availableMetadataObjectTypes array of the AVCaptureMetadataOutput object (see the sketch after these comments). – Gavin Williams Nov 27 '13 at 01:22
  • Can a bardcode be scanned on a iPad Mini's front camera or is it limited to the back camera only. – OneGuyInDc Feb 09 '14 at 02:14
  • Let me understand one thing: is availableMetadataObjectTypes always tied to the device? That would mean the iOS simulator will always return an empty array, right? – Lucia Belardinelli Apr 17 '15 at 09:47
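
Building on the comment above about checking device support, here's a hedged sketch (not from the original answer) of how that check might look; it only requests QR codes when availableMetadataObjectTypes reports them after the output has been added to the session:

AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
[output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[session addOutput:output];

// Only request QR codes if this camera actually reports support for them.
if ([output.availableMetadataObjectTypes containsObject:AVMetadataObjectTypeQRCode]) {
    output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode];
} else {
    NSLog(@"QR code detection is not available on this device");
}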

iOS 10 caused the same problem for me. I'm currently on the developer beta released at WWDC 2016. When I ran the app on a phone with iOS 9, the captureOutput:didOutputMetadataObjects: method was called again.

Stan James