
Hello, I'm working on an app that records video + audio. The video source is the camera, and the audio comes from a stream. My problem happens when the streaming connection is closed for some reason. In that case I switch the audio source to the built-in mic, but then the audio is not synchronised at all. I would like to leave a gap in the audio and then set the timestamps in real time according to the current video timestamp. It seems AVAssetWriter is appending the built-in mic frames consecutively and ignoring their timestamps.

Do you know why AVAssetWriter is ignoring the timestamps?

EDIT:

This is the code that gets the latest video timestamp:

- (void)renderVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVPixelBufferRef renderedPixelBuffer = NULL;
    // Remember the most recent video PTS so audio can be aligned to it later.
    CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    self.lastVideoTimestamp = timestamp;
    // ...

and this is the code I use to synchronise audio coming from the built-in mic when the stream is disconnected:

// Retime first, then release the original; releasing before calling
// adjustTime:by: (as I originally had it) is a use-after-free.
CMSampleBufferRef adjusted = [self adjustTime:sampleBuffer by:self.lastVideoTimestamp];
CFRelease(sampleBuffer);
sampleBuffer = adjusted;

// Adjust-CMSampleBuffer function

- (CMSampleBufferRef)adjustTime:(CMSampleBufferRef)sample by:(CMTime)offset
{
    // First call with a NULL array just reports how many timing entries exist.
    CMItemCount count;
    CMSampleBufferGetSampleTimingInfoArray(sample, 0, NULL, &count);
    CMSampleTimingInfo *pInfo = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sample, count, pInfo, &count);
    for (CMItemCount i = 0; i < count; i++)
    {
        // Invalidate decode timestamps so the writer derives them,
        // and shift only the presentation timestamps.
        pInfo[i].decodeTimeStamp = kCMTimeInvalid;
        pInfo[i].presentationTimeStamp = CMTimeSubtract(pInfo[i].presentationTimeStamp, offset);
    }
    CMSampleBufferRef sout;
    CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sample, count, pInfo, &sout);
    free(pInfo);
    return sout;
}

This is what I want to achieve:

Video
--------------------------------------------------------------------
Stream                 disconnect stream           Built in mic
-----------------------------------                       -----------------

I would like to get this. As you can see, there is a gap with no audio, because the stream was disconnected and some of its audio may never have arrived.

What it is currently doing:

Video
--------------------------------------------------------------------
Stream                 disconnect stream           Built in mic
--------------------------------------------------------------------
  • Can you show some code? At least the bit that shows "setting the timestamp in realtime according to the current video timestamp"? Video & audio have different timing mechanisms. Maybe that's where your problem lies. – Rhythmic Fistman Feb 22 '16 at 02:22
  • @RhythmicFistman Thanks for reply, I have share some of the code that I use to synchronise both frames (audio + video) – Pablo Martinez Feb 22 '16 at 05:28
  • Are you calling `[AVAssetWriter startSessionAtSourceTime:]` and if so, with what? – Rhythmic Fistman Feb 22 '16 at 06:05
  • @RhythmicFistman Yes of course, and that is working. I'm going to explain you an example: I start recording video from an ipad, and audio from the stream. I synchronise audio + video, but the problem is if you disconnect the stream in the middle of recording, so I have to switch audio coming from the stream to built in mic. Then the problem is the audio coming from built in mic is not setting the timestamp correctly. It is not sync – Pablo Martinez Feb 22 '16 at 06:12
  • @RhythmicFistman I have edited the post to show you what I'm trying to do. Thanks! – Pablo Martinez Feb 22 '16 at 06:25
  • @PabloMartinez: Have you find an answer/solution to this? Is the AVAssetWriterInput for audio ignoring the timestamps? I'm having the same issue... – Mihai Jun 09 '16 at 10:07
  • 1
    @MihaiGhete I have ended adding empty cmsamplebuffers to add that space. Do you want me to share this solution? – Pablo Martinez Jun 09 '16 at 10:15
  • @PabloMartinez: My problem is more complex and adding empty CMSampleBuffers won't help. I'm using a kAudioUnitSubType_VoiceProcessingIO unit. Everything works fine until I start decoding data for the output. Audio and video goes out of sync after that (because I'm saving the video locally). But thank you for the suggestion! – Mihai Jun 09 '16 at 10:35
  • @MihaiGhete I'm wondering how did you endup? :) – Pablo Martinez Jun 29 '18 at 15:56

0 Answers