
As the title says, I am having some trouble with AVAssetWriter and memory.

Some notes about my environment/requirements:

  • I am NOT using ARC, but if simply enabling it would get this working I am all for it; my attempts so far have made no difference, though. The environment this will run in also requires memory to be minimised and released as soon as possible.
  • Objective-C is a requirement.
  • Memory usage must be as low as possible; the ~300 MB it takes up now is unstable when testing on my device (iPhone X).

The code

This is the code used when taking the screenshots below https://gist.github.com/jontelang/8f01b895321d761cbb8cda9d7a5be3bd
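
In outline, the setup in that gist follows the standard AVAssetWriter pattern. Roughly (this is a condensed sketch, not the gist verbatim; identifiers and settings are illustrative, and outputURL is assumed):

NSError *error = nil;
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outputURL
                                                      fileType:AVFileTypeQuickTimeMovie
                                                         error:&error];

// Illustrative output settings; the real ones are in the gist.
NSDictionary *videoSettings = @{
    AVVideoCodecKey  : AVVideoCodecTypeH264,
    AVVideoWidthKey  : @(1080),
    AVVideoHeightKey : @(1920),
};
AVAssetWriterInput *assetWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];

// The adaptor supplies pixel buffers (and optionally a pool) for the input.
AVAssetWriterInputPixelBufferAdaptor *writerAdaptor =
    [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:assetWriterInput
                                   sourcePixelBufferAttributes:nil];

[assetWriter addInput:assetWriterInput];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];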

The problem / items kept around in memory

Most of the things that take up a lot of memory throughout the processing appear to be allocated at the very beginning.

[Instruments screenshot: the bulk of the allocations happen at the start of processing and persist]

So at this point it doesn't seem like the issue is in my code. The parts I directly control (loading the images, creating the pixel buffer, releasing it) do not appear to be the problem: for example, if I select the majority of the time after the region above in Instruments, the memory is stable and none of it is kept around.

[Instruments screenshot: a later selection showing stable memory usage, with only ~5 MB persisting]

The only reason 5 MB persists there is that it is deallocated just after the selected period ends.

Now what?

I actually started writing this question focused on whether my code was releasing things correctly or not, but that now seems fine. So what are my options?

  • Is there something I can configure within the current code to make the memory requirements smaller? (See the sketch after this list for the kind of settings I mean.)
  • Is there simply something wrong with my setup of the writer/input?
  • Do I need to use a totally different way of making a video to be able to make this work?
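
To illustrate the first point above, the only obvious knobs I can see are the output/compression settings handed to the writer input, along these lines (values purely illustrative; I have not verified that any of them reduce the footprint):

NSDictionary *videoSettings = @{
    AVVideoCodecKey  : AVVideoCodecTypeH264,
    AVVideoWidthKey  : @(1080),
    AVVideoHeightKey : @(1920),
    // Compression properties that might influence the encoder's internal buffering.
    AVVideoCompressionPropertiesKey : @{
        AVVideoAverageBitRateKey      : @(2000000),
        AVVideoMaxKeyFrameIntervalKey : @(60),
    },
};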

A note on using CVPixelBufferPool

In the documentation for CVPixelBufferCreate, Apple states:

If you need to create and release a number of pixel buffers, you should instead use a pixel buffer pool (see CVPixelBufferPool) for efficient reuse of pixel buffer memory.

I have tried this as well, but I saw no change in memory usage. Changing the attributes for the pool didn't seem to have any effect either, so there is a small possibility that I am not using it 100% properly, although from comparing with code online it seems like I am, at least. And the output file works.

The code for that is here: https://gist.github.com/jontelang/41a702d831afd9f9ceeb0f9f5365de03

And here is a version where I set up the pool in a slightly different way: https://gist.github.com/jontelang/c0351337bd496a6c7e0c94293adf881f.
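
For reference, the core of the pool variant looks roughly like this (condensed; the full code is in the gists above):

NSDictionary *pixelBufferAttributes = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32ARGB),
    (id)kCVPixelBufferWidthKey           : @(1080),
    (id)kCVPixelBufferHeightKey          : @(1920),
};
CVPixelBufferPoolRef pool = NULL;
CVPixelBufferPoolCreate(kCFAllocatorDefault,
                        NULL, // pool-level attributes
                        (__bridge CFDictionaryRef)pixelBufferAttributes,
                        &pool);

// Per frame: take a buffer from the pool instead of calling CVPixelBufferCreate.
CVPixelBufferRef buffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &buffer);
// ... render the frame into the buffer and append it via the adaptor ...
CVPixelBufferRelease(buffer); // hands the buffer back to the pool for reuse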

Update 1

So I dug a bit deeper into a trace to figure out when/where the majority of the allocations come from. Here is an annotated image of that:

[Annotated Instruments screenshot: the large allocations appear within moments of processing starting]

The takeaways are:

  1. The space is not allocated "with" the AVAssetWriter itself
  2. The ~500 MB that is held until the end is allocated within 500 ms of processing starting
  3. It seems to be allocated internally by AVAssetWriter

I have the .trace file uploaded here: https://www.dropbox.com/sh/f3tf0gw8gamu924/AAACrAbleYzbyeoCbC9FQLR6a?dl=0

jontelang

4 Answers

  1. When creating the dispatch queue, make sure it is created with an autorelease pool: replace DISPATCH_QUEUE_SERIAL with DISPATCH_QUEUE_SERIAL_WITH_AUTORELEASE_POOL.

  2. Wrap each iteration of the for loop in an autorelease pool as well, like this:

[assetWriterInput requestMediaDataWhenReadyOnQueue:recordingQueue usingBlock:^{
    for (int i = 1; i < 200; ++i) {
        @autoreleasepool {
            // Back off until the input can accept another buffer.
            while (![assetWriterInput isReadyForMoreMediaData]) {
                [NSThread sleepForTimeInterval:0.01];
            }
            NSString *path = [NSString stringWithFormat:@"/Users/jontelang/Desktop/SnapperVideoDump/frames/frame_%i.jpg", i];
            UIImage *image = [UIImage imageWithContentsOfFile:path];
            CGImageRef ref = [image CGImage];
            // Reuse buffers from the adaptor's pool rather than allocating new ones.
            CVPixelBufferRef buffer = [self pixelBufferFromCGImage:ref pool:writerAdaptor.pixelBufferPool];
            CMTime presentTime = CMTimeAdd(CMTimeMake(i, 60), CMTimeMake(1, 60));
            [writerAdaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
            CVPixelBufferRelease(buffer);
        }
    }
    [assetWriterInput markAsFinished];
    [assetWriter finishWritingWithCompletionHandler:^{}];
}];
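
For point 1, the queue itself is created like this (the queue label is arbitrary):

dispatch_queue_t recordingQueue =
    dispatch_queue_create("recordingQueue", DISPATCH_QUEUE_SERIAL_WITH_AUTORELEASE_POOL);

This attribute is equivalent to dispatch_queue_attr_make_with_autorelease_frequency(DISPATCH_QUEUE_SERIAL, DISPATCH_AUTORELEASE_FREQUENCY_WORK_ITEM), i.e. the queue drains an autorelease pool after every work item instead of at libdispatch's discretion.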
Eugene Dudnyk
  • Thank you for the reply, unfortunately I did not see any different results after making these changes. – jontelang Apr 04 '21 at 15:47
  • For good measure I also tried adjusting the autorelease frequency of the queue with DISPATCH_AUTORELEASE_FREQUENCY_WORK_ITEM. No change. – jontelang Apr 04 '21 at 15:55
  • In that case, it would be best to check the memory graph in Xcode and see who holds the references to those objects – Eugene Dudnyk Apr 04 '21 at 15:56
  • I am not sure the issue is within the for loop, as in the second screenshot the memory allocated within it seems to be released in a timely fashion. I will check the memory graph though, I had forgotten about it. Thanks. – jontelang Apr 04 '21 at 15:57
  • Throwing some ideas out here: looking at the memory going from zero to 300 megabytes, you might need to release some of those allocations in between? Or create a video for a specific chunk of work with one asset writer, write another video, and then merge them together. I had some issues with creating an asset writer with mutable tracks, so I thought about this some time ago. The asset writer needs to be a single instance, so it has to be linear. – Meep Apr 04 '21 at 16:19
  • @Meep, the problem is that the 300 MB is allocated basically at the setup of the AVAsset objects and within the first couple of frames, and I can't release any of that. I've found that most of it is held by VTCompressionSession (I think it was). So I either have to go even lower level if I can't find a config to pass into AVAssetWriter, or, as you suggest, batch the creation into multiple videos and hope that works too. Seems non-optimal though. – jontelang Apr 05 '21 at 05:47
  • @jontelang I just ran the app I made; the memory is around 72.5 megabytes total. With a breakpoint just after the init of the asset writer it is around 37.8 megabytes total. It is written in Swift, though. I'm starting to think this might be something that can't be fixed in the asset writer. Is the 300 MB allocated by the AVAssetWriter itself, or is it the total memory of the app? It's odd to see AVAssetWriter holding 300 MB. There is another option: use AVMutableComposition to stitch the tracks together, but that is a workaround. – Meep Apr 05 '21 at 11:49
  • @Meep is that 72 MB during/throughout the whole process? How many images are you using to create the video? Is it AVAssetWriter too? I updated the question with an annotated image of where the data is being allocated. Will look into AVMutableComposition too, thanks. – jontelang Apr 05 '21 at 15:42

No, I see it peaking at around 240 MB in my app. It's my first time using this allocations instrument; interesting.

[Instruments allocation screenshots]

I'm using AVAssetWriter to write a video file by streaming CMSampleBuffer objects, which arrive in real time from the camera's capture output via AVCaptureVideoDataOutputSampleBufferDelegate.
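
In outline, that path looks like this (an Objective-C sketch of the idea; my app is in Swift and the names here are illustrative):

// Frames arrive from the camera and are appended directly to the writer input.
// self.assetWriterInput is assumed to be configured with expectsMediaDataInRealTime = YES.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    if (self.assetWriterInput.isReadyForMoreMediaData) {
        [self.assetWriterInput appendSampleBuffer:sampleBuffer];
    }
}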

Meep

While I have not yet found the actual cause, the memory problem I described in this question was solved by simply running it on an actual device instead of the simulator.

jontelang

@Eugene_Dudnyk's answer is spot on: the autorelease pool INSIDE the for/while loop is the key. Here is how I got it working in Swift. Also, please use AVAssetWriterInputPixelBufferAdaptor for the pixel buffer pool:

videoInput.requestMediaDataWhenReady(on: videoInputQueue) { [weak self] in
    while videoInput.isReadyForMoreMediaData {
        autoreleasepool {
            guard let sample = assetReaderVideoOutput.copyNextSampleBuffer(),
                  let buffer = CMSampleBufferGetImageBuffer(sample) else {
                // No more samples (or a read error): finish up the writer.
                print("Error while processing video frames")
                videoInput.markAsFinished()
                DispatchQueue.main.async {
                    videoFinished = true
                    closeWriter()
                }
                return
            }

            // Process the image and render it back into the buffer (in-place operation,
            // where ciProcessedImage is your processed new image)
            self?.getCIContext().render(ciProcessedImage, to: buffer)
            let timeStamp = CMSampleBufferGetPresentationTimeStamp(sample)
            self?.adapter?.append(buffer, withPresentationTime: timeStamp)
        }
    }
}

My memory usage stopped rising.

Juan Boero