
Hey, I am trying to run a live camera feed on my device. I want to capture a photo every 3 seconds, but every time it does, it plays a shutter sound, which is bad UX.

Hence I want to run a live camera stream from the front camera and grab a frame at a set interval (~3 seconds).

How can I extract a frame from the live camera feed and store it in a UIImage variable?

Thanks and Cheers!

Aakash Dave

2 Answers


I understand your problem fully. A few days back I was facing the same issue, so I developed a complete solution: it shows a live camera preview, lays it out properly on the view, grabs camera frames continuously, and converts each frame into a UIImage efficiently and without memory leaks, so you can use the images however you need. The solution is written for Swift 4.2 and was developed with Xcode 10.0.

GitHub repo with the full solution: https://github.com/anand2nigam/CameraFrameExtractor

Please test on a physical iPhone or iPad, because camera capture does not work in the Simulator. Let me know how the app works for you, and contact me if you need any help. Hope it solves your problem. Happy learning.
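
In case it helps to see the idea in code, here is a minimal sketch of the same approach (this is not the repo's actual code; the class and property names below are illustrative): an AVCaptureSession on the front camera whose video-data-output delegate converts each CMSampleBuffer into a UIImage.

```swift
import AVFoundation
import UIKit

// Minimal sketch of a front-camera frame extractor (illustrative names,
// not the code from the linked repo).
class FrameExtractor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    private let session = AVCaptureSession()
    private let sampleQueue = DispatchQueue(label: "camera.frame.queue")
    private let ciContext = CIContext()

    // Called on the main queue with every frame converted to a UIImage.
    var frameHandler: ((UIImage) -> Void)?

    override init() {
        super.init()
        configureSession()
    }

    private func configureSession() {
        session.sessionPreset = .medium

        // Front camera as input.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .front),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // The video data output hands CMSampleBuffers to the delegate below.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: sampleQueue)
        if session.canAddOutput(output) { session.addOutput(output) }
    }

    func start() { session.startRunning() }
    func stop()  { session.stopRunning() }

    // MARK: AVCaptureVideoDataOutputSampleBufferDelegate
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // CVPixelBuffer -> CIImage -> CGImage -> UIImage, so nothing retains
        // the capture buffer longer than necessary.
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
        // Note: you may need to pass an orientation here (e.g. .leftMirrored
        // for the front camera in portrait) if the image appears rotated.
        let image = UIImage(cgImage: cgImage)

        DispatchQueue.main.async { [weak self] in
            self?.frameHandler?(image)
        }
    }
}
```

Usage (again with hypothetical names): create a FrameExtractor, assign frameHandler, call start(), add NSCameraUsageDescription to Info.plist, and test on a real device.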

Anand Nigam
  • Your code doesn't actually include a way to get the frame – Fraser Oct 31 '18 at 22:41
  • Ok. Can you @Fraser be more specific on this? I am taking the frames from the camera and then converting it into UIImage so that it can be used for other things. I'll be very thankful if you'll explain the problem and help me improve it too. – Anand Nigam Nov 02 '18 at 11:34
  • My apologies, I'd converted your example to inherit from NSObject so I could use it with an existing ViewController, but there was a bug at my end. It's all working great - thanks @anand2nigam – Fraser Nov 05 '18 at 01:47
  • Unfortunately I can't modify my vote. If you make a minor edit to your answer I'll be able to – Fraser Nov 05 '18 at 01:51
  • I am glad @Fraser that you found the solution. It's ok, but please do spread the answer so that other people can make use of it and won't have to hunt for the solution. – Anand Nigam Nov 08 '18 at 08:39
  • Hey @Fraser, do upvote the answer or mark it as accepted since it helped you, so that it can help others in the same way. Also, I am new here, so the gesture would be appreciated. Thanks. – Anand Nigam Feb 01 '19 at 15:48
  • Hi, I can't change my vote until you edit your answer – Fraser Feb 04 '19 at 03:02
  • @Fraser I have edited it. I think you can change your vote now. Thanks. – Anand Nigam Feb 04 '19 at 08:52
  • @Fraser, Thank you so much for being so generous, but haven't received any votes on the answer, does it take time to display? – Anand Nigam Feb 06 '19 at 06:05
  • Haha, you were on -1. My vote brings you back to zero :) – Fraser Feb 07 '19 at 03:52
  • @AnandNigam When I use the uiImage created in captureOutput() and do myImageView.image = uiImage, it shows the image rotated 90 counter clockwise, how should I fix this? – RufusV Mar 01 '20 at 18:33

When you're writing your own custom camera, you'll set up an AVCaptureSession with an AVCaptureVideoDataOutput; its AVCaptureVideoDataOutputSampleBufferDelegate vends CMSampleBuffers for you to process (AVAssetWriter only comes into play if you also want to write those buffers out to a video file).

Using this approach, you can easily get a CIImage and process that with filters etc.
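
For example (a hypothetical filter choice, not something this answer prescribes), once you have a CIImage you can run it through a Core Image filter roughly like this:

```swift
import CoreImage
import UIKit

// Stand-in CIImage; in practice this would come from
// CIImage(cvPixelBuffer:) inside the sample buffer delegate.
let ciImage = CIImage(color: CIColor(red: 0.2, green: 0.5, blue: 0.8))
    .cropped(to: CGRect(x: 0, y: 0, width: 200, height: 200))

// Apply a sepia filter; any CIFilter is driven the same way.
if let filter = CIFilter(name: "CISepiaTone") {
    filter.setValue(ciImage, forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)
    let filtered = filter.outputImage   // still a CIImage, ready to render or convert
    print(filtered as Any)
}
```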

You basically want a timer that fires every three seconds and lets the delegate callback know to capture and process the next frame; do what you want with that frame and just keep discarding the rest (or write them out to video instead, if that's what you need).

This question, Recording videos with real-time filters in Swift, contains the sample code you're looking for. Instead of writing the buffer out, capture it as an image.
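
To make the timer idea concrete, here is a minimal sketch (the names are mine, not from the linked question): a video-data-output delegate that only converts a frame to a UIImage when a 3-second timer has raised a flag, and discards every other frame.

```swift
import AVFoundation
import UIKit

// Sketch: converts one frame roughly every three seconds into a UIImage
// and discards the rest. Names are illustrative.
class TimedFrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    private let ciContext = CIContext()
    private var shouldCaptureNextFrame = false
    private var timer: Timer?

    // Receives one UIImage per interval, delivered on the main queue.
    var onSnapshot: ((UIImage) -> Void)?

    func startTimer(interval: TimeInterval = 3.0) {
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            // In production, synchronise this flag with the capture queue
            // (e.g. a serial queue or atomic) instead of setting it directly.
            self?.shouldCaptureNextFrame = true
        }
    }

    func stopTimer() {
        timer?.invalidate()
        timer = nil
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Discard frames until the timer has asked for one.
        guard shouldCaptureNextFrame,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        shouldCaptureNextFrame = false

        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
        let image = UIImage(cgImage: cgImage)

        DispatchQueue.main.async { [weak self] in
            self?.onSnapshot?(image)
        }
    }
}
```

You would install it with setSampleBufferDelegate(_:queue:) on the AVCaptureVideoDataOutput of your capture session and call startTimer() once the session is running.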

Good luck!

Tim Bull