
I've written my own custom camera. I get a pixel buffer from the `AVCaptureVideoDataOutput` delegate, create an `MTLTexture` from it, and display it using Metal. This gives me ~20% CPU usage. Apple's public sample project AVCamFilter, which is built on the same principle, has the same CPU usage.
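A minimal sketch of that pipeline, assuming the data output is configured to deliver `kCVPixelFormatType_32BGRA`; the class name and the `render` helper are illustrative, not taken from AVCamFilter:

```swift
import AVFoundation
import CoreVideo
import Metal

// Hypothetical helper class, not taken from AVCamFilter.
final class CameraTextureProvider: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    private let device: MTLDevice
    private var textureCache: CVMetalTextureCache?

    init(device: MTLDevice) {
        self.device = device
        super.init()
        // A CVMetalTextureCache recycles textures instead of allocating one per frame.
        CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let cache = textureCache,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        var cvTexture: CVMetalTexture?
        // Wraps the IOSurface-backed pixel buffer as a Metal texture; no pixel copy here.
        // .bgra8Unorm assumes the output's videoSettings request 32BGRA frames.
        CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                  cache,
                                                  pixelBuffer,
                                                  nil,
                                                  .bgra8Unorm,
                                                  CVPixelBufferGetWidth(pixelBuffer),
                                                  CVPixelBufferGetHeight(pixelBuffer),
                                                  0,
                                                  &cvTexture)
        guard let cvTexture = cvTexture,
              let texture = CVMetalTextureGetTexture(cvTexture) else { return }

        render(texture)
    }

    private func render(_ texture: MTLTexture) {
        // Drawing the texture as a full-screen quad via MTKView /
        // MTLRenderCommandEncoder is omitted for brevity.
    }
}
```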

But `AVCaptureVideoPreviewLayer` and other AVFoundation classes like `AVCaptureMovieFileOutput` show 0% CPU usage. How?

And how can I achieve this result?

  • Well, pretty obviously, if you just show the built-in capture video preview layer, you are not also getting a pixel buffer on every frame and passing it through some sort of processing (see the sketch after these comments). – matt Feb 08 '21 at 22:39
  • @matt Yes, but how does Apple achieve this result? – Igor Sorokin Feb 09 '21 at 07:48
  • But without any custom processing, there should be no difference between what `AVCaptureVideoPreviewLayer` does and rendering the buffer using Metal, right? – Frank Rupprecht Feb 09 '21 at 07:50
  • @FrankSchlegel I don't know how `AVCaptureVideoPreviewLayer` works under the hood. But with Metal I create an `MTLTexture` and calculate vertices, and as a result I see ~10% CPU usage. – Igor Sorokin Feb 09 '21 at 12:36
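For contrast with the Metal path above, here is all that showing the built-in preview takes; a minimal sketch, assuming `session` is an already-configured, running `AVCaptureSession`:

```swift
import AVFoundation
import UIKit

// Attach the built-in preview to a view.
// Assumes `session` is a configured and running AVCaptureSession.
func attachPreview(of session: AVCaptureSession, to view: UIView) {
    let previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.videoGravity = .resizeAspectFill
    previewLayer.frame = view.bounds
    view.layer.addSublayer(previewLayer)
    // No sample buffers are delivered to app code: the capture pipeline
    // feeds the layer directly, consistent with matt's point that nothing
    // is handed to the app on every frame.
}
```

A likely explanation, consistent with the comments above: the per-frame work doesn't disappear, it happens in the system's media pipeline and on the GPU rather than in the app's process, so it is never attributed to the app's CPU time.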

0 Answers