10

I'm trying to figure out how to implement, with AVFoundation, the ability to repeatedly pause and resume video capture in a single session, but have each new segment (the segments captured after each pause) added to the same video file. Currently, every time I press "stop" and then "record" again, it just saves a new video file to my iPhone's album and starts capturing to a separate, new file. I need to be able to press the "record/stop" button over and over... only capture video & audio while record is active... then, when the "done" button is pressed, have a single AV file with all the segments together. And all of this needs to happen in the same capture session / preview session.

The only way I can think of to attempt this is, when the "done" button is pressed, taking each individual output file and combining them into a single file... but I'm pretty sure the processing time needed to paste a bunch of separate clips together won't be acceptable. Plus, it just seems like a really messy & unnecessary way to go about this, with way too much code.
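
For reference, this is roughly what I mean by combining the clips afterwards: a bare sketch using AVMutableComposition, with placeholder file URLs, and not what I actually want to end up doing.

    #import <AVFoundation/AVFoundation.h>

    // Stitch a list of recorded clip URLs into one movie file.
    - (void)mergeClips:(NSArray *)clipURLs toURL:(NSURL *)outputURL
    {
        AVMutableComposition *composition = [AVMutableComposition composition];
        CMTime insertTime = kCMTimeZero;

        // Append each recorded clip end-to-end in the composition.
        for (NSURL *url in clipURLs) {
            AVAsset *asset = [AVAsset assetWithURL:url];
            [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                 ofAsset:asset
                                  atTime:insertTime
                                   error:nil];
            insertTime = CMTimeAdd(insertTime, asset.duration);
        }

        // Export the combined composition as a single movie file.
        AVAssetExportSession *exporter =
            [[AVAssetExportSession alloc] initWithAsset:composition
                                             presetName:AVAssetExportPresetHighestQuality];
        exporter.outputURL = outputURL;
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            // Check exporter.status / exporter.error here.
        }];
    }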

Is there any simple way to just pause video capture within a single session and then resume capture to the same file? Or any other ideas?

If it's not too much trouble, sample code would help me out a ton... I'm still learning & teaching myself, so I'm not great at following the lingo & terminology in explanations. Thanks

Edit: this is the project I am starting with to learn AVFoundation, so this is the code I am looking to alter to achieve the above functionality: http://developer.apple.com/library/ios/#samplecode/AVCam/Introduction/Intro.html

Daniel McCarthy

2 Answers

11

Instead of using the movie file output from the capture session, you could use AVCaptureVideoDataOutput and in your delegate, pass the samples to an instance of AVAssetWriterInput. Then you could decouple the previewing from the recording. If your delegate does not forward the buffers to the asset writer, there's no recording for that portion.
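
A rough sketch of that idea (assuming ARC; the captureSession, recording, and videoInput properties here are hypothetical names you'd define yourself, and the AVAssetWriter setup is omitted):

    // Add a video data output whose delegate sees every frame the camera produces.
    - (void)setupVideoDataOutput
    {
        AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        dispatch_queue_t queue = dispatch_queue_create("capture.queue", DISPATCH_QUEUE_SERIAL);
        [videoOutput setSampleBufferDelegate:self queue:queue];
        if ([self.captureSession canAddOutput:videoOutput]) {
            [self.captureSession addOutput:videoOutput];
        }
    }

    // Called for every frame while the session runs: the preview keeps going,
    // but buffers are only forwarded to the asset writer while recording is YES.
    - (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        if (!self.recording) {
            return; // "paused": drop the buffer, nothing is written
        }
        if (self.videoInput.isReadyForMoreMediaData) {
            [self.videoInput appendSampleBuffer:sampleBuffer];
        }
    }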

When you want to resume recording to the same file (for the second and any subsequent segment), you need to adjust the timestamps so that they continue sequentially from the point at which you stopped, and you need to make the same adjustment to both audio and video so they stay in sync. So it's a little fiddly, but certainly workable.
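
One way to do that adjustment (a sketch; how you measure the accumulated pause offset is up to you) is to rewrite the timing info on a copy of each buffer before appending it, subtracting the total time spent paused:

    // Returns a copy of `sample` with `offset` subtracted from its timestamps.
    // The caller is responsible for releasing the returned buffer.
    - (CMSampleBufferRef)adjustTime:(CMSampleBufferRef)sample by:(CMTime)offset
    {
        CMItemCount count;
        CMSampleBufferGetSampleTimingInfoArray(sample, 0, NULL, &count);
        CMSampleTimingInfo *info = malloc(sizeof(CMSampleTimingInfo) * count);
        CMSampleBufferGetSampleTimingInfoArray(sample, count, info, &count);
        for (CMItemCount i = 0; i < count; i++) {
            info[i].decodeTimeStamp = CMTimeSubtract(info[i].decodeTimeStamp, offset);
            info[i].presentationTimeStamp = CMTimeSubtract(info[i].presentationTimeStamp, offset);
        }
        CMSampleBufferRef adjusted;
        CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sample, count, info, &adjusted);
        free(info);
        return adjusted;
    }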

Edit: there's a sample iPhone app that demonstrates this.

Vukašin Manojlović
Geraint Davies
  • Thanks, Geraint, for the response. I'm still new to Objective-C and trying to fumble my way around understanding what you said. I certainly understand the logic behind your answer, and it seems like it would probably work. The problem for me is figuring out how to actually implement it. I've been digging around for examples/sample code for bits and pieces of what you suggested, but am having a hard time. I'm going to keep hunting, & if I can get this to work, I'll immediately accept your answer. In the meantime, if you have any sample code or links to get me started, it would help me a lot! Thx – Daniel McCarthy Feb 25 '13 at 17:18
  • I have some code. I'll tidy it up into a publishable sample. Ping me if you don't see anything within a few days. – Geraint Davies Feb 26 '13 at 17:34
  • @Geraint Davies - I am looking for exactly the same thing. Can you please post your code here on GitHub? I too cannot find any writeup or example on this. Thanks – Sam B Feb 26 '13 at 19:59
  • @GeraintDavies thanks a bunch, I can't wait to see it. I figured out an alternate means to the same end, i.e. through AVMutableComposition: recording separate files, then tearing the frames apart & combining everything together at the end... but the processing time to compile & create the composition clip at run time is an absolute nightmare... Thanks again, & looking forward to your update :) – Daniel McCarthy Feb 26 '13 at 21:50
  • I've uploaded the sample and edited the text to include a link. Feel free to get in touch if you've any comments or issues. – Geraint Davies Feb 27 '13 at 17:57
  • @GeraintDavies You are the man!! I checked it out, and it throws some strange and random errors... but that is MORE than enough to get me started on where I need to be. If I figure out where the errors are coming from and a fix, I'll post back here as a comment. Thank you again! – Daniel McCarthy Mar 09 '13 at 03:36
  • @GeraintDavies can you give me the sample code for this, please? – Gajendra Rawat Dec 24 '13 at 07:06
1

In general, I use startRunning and stopRunning when working with AVCaptureSession, which I assume you will be using.

See the documentation for AVCaptureSession.
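
For example (a bare sketch, assuming captureSession is your configured AVCaptureSession):

    [self.captureSession stopRunning];   // halts capture, and the preview with it
    // ... later ...
    [self.captureSession startRunning];  // resumes both preview and capture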

Khaled Barazi
  • Won't stopRunning stop the entire session and end the live preview of what the camera will be capturing when actually recording? – Daniel McCarthy Feb 16 '13 at 07:15
  • Correct, because when you instantiate an AVCaptureVideoPreviewLayer, you do that with a specific session. Were you looking to keep the preview layer going and be able to start/stop capture, similar to the video recorder? – Khaled Barazi Feb 16 '13 at 11:54
  • Yes, I need to keep the preview layer going. The user needs to basically be able to point, capture for a few seconds, stop... look at something else, point, capture, stop... look at something else, point, shoot, stop... etc., all in the same session. Then, when done, have a single video file that just plays each capture as one flowing file. I also edited my question a bit to hopefully be a little more clear & make a bit more sense – Daniel McCarthy Feb 16 '13 at 12:30
  • Daniel, can you please post the sample code when you are done? I am looking to do exactly the same thing and for the life of me cannot find any sample code anywhere – Sam B Feb 25 '13 at 19:56
  • @SamB Did you get the solution? – Anurag Sharma Feb 28 '17 at 10:03