I want to record the full macOS screen and do different things with the captured images, like:

- calculate average colours for different regions (a sketch of what I mean follows this list)
- re-render parts of the screen blurred in an NSPanel or NSWindow
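To make the first point concrete, here is an untested sketch of the average-colour computation I have in mind, assuming frames arrive as CVPixelBuffers; `averageColor(of:in:)` is just an illustrative helper built on Core Image's CIAreaAverage filter, which reduces a region to a single pixel:

```swift
import CoreImage

let ciContext = CIContext(options: [.workingColorSpace: NSNull()])

// Illustrative helper: average colour of `rect` (in pixels) within a captured frame.
func averageColor(of pixelBuffer: CVPixelBuffer, in rect: CGRect) -> CIColor? {
    let image = CIImage(cvPixelBuffer: pixelBuffer)
    guard let filter = CIFilter(name: "CIAreaAverage") else { return nil }
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(CIVector(cgRect: rect), forKey: kCIInputExtentKey)
    guard let output = filter.outputImage else { return nil }

    // Read back the single averaged pixel
    var pixel = [UInt8](repeating: 0, count: 4)
    ciContext.render(output,
                     toBitmap: &pixel,
                     rowBytes: 4,
                     bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                     format: .RGBA8,
                     colorSpace: nil)
    return CIColor(red: CGFloat(pixel[0]) / 255,
                   green: CGFloat(pixel[1]) / 255,
                   blue: CGFloat(pixel[2]) / 255)
}
```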
Right now I am using AVCaptureScreenInput, like this:
```swift
import AVFoundation

let input = AVCaptureScreenInput(displayID: CGMainDisplayID())!
avCaptureSession.addInput(input)

let output = AVCaptureVideoDataOutput()
avCaptureSession.addOutput(output)
// Note: frames are delivered to the delegate on the main queue
output.setSampleBufferDelegate(self, queue: .main)

self.avCaptureSession.startRunning()
```
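For completeness, the frames then arrive via the standard AVCaptureVideoDataOutputSampleBufferDelegate callback, roughly like this (simplified; `Recorder` stands in for whatever class owns the session):

```swift
extension Recorder: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each frame arrives as a CVPixelBuffer wrapped in a CMSampleBuffer
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // ... per-frame work on `pixelBuffer`: average colours, blurred re-render, etc.
        _ = pixelBuffer // placeholder for the actual processing
    }
}
```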
The problem with this approach is that the frame rate of the delivered images seems to be pretty low; I would like to achieve 30 fps or more.
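As far as I can tell from the documentation, AVCaptureScreenInput.minFrameDuration is the property that controls the requested rate (it is the reciprocal of the maximum frame rate), so I assume something like this, added to the snippet above, should request 30 fps:

```swift
// minFrameDuration is the reciprocal of the maximum frame rate,
// so 1/30 s should request up to 30 fps
input.minFrameDuration = CMTime(value: 1, timescale: 30)
```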
What is a good approach to achieve these goals on macOS? On the one hand the "fast" fetching of the image data, on the other hand the real-time processing and re-rendering of the captured image.
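One alternative I have come across but have not fully evaluated is CGDisplayStream, which delivers frames as IOSurfaces directly from the window server on a queue of my choosing (ScreenCaptureKit supersedes both APIs on macOS 12.3 and later). A minimal sketch of how I assume it is wired up; the queue label and handler body are placeholders:

```swift
import CoreGraphics
import CoreVideo
import Foundation

let captureQueue = DispatchQueue(label: "screen-capture", qos: .userInteractive)
let displayID = CGMainDisplayID()

// Frames are delivered as IOSurfaces on `captureQueue`, straight from the window server
let stream = CGDisplayStream(
    dispatchQueueDisplay: displayID,
    outputWidth: CGDisplayPixelsWide(displayID),
    outputHeight: CGDisplayPixelsHigh(displayID),
    pixelFormat: Int32(kCVPixelFormatType_32BGRA),
    properties: nil,
    queue: captureQueue
) { status, _, surface, _ in
    guard status == .frameComplete, let surface = surface else { return }
    // process `surface` (an IOSurface) here, e.g. wrap it in a CIImage
    _ = surface
}

if stream?.start() != .success {
    print("Could not start the display stream")
}
```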
Issue #1
The above solution with the capture session leads to unacceptable lag: when I move a window, the re-rendered output trails it by more than 100 ms. I would like that to happen in real time.
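For the blurred-panel goal specifically, I am also wondering whether capturing is needed at all: as far as I know, NSVisualEffectView with .behindWindow blending blurs whatever lies behind the window inside the window server itself, so it tracks moving windows in real time. A sketch, where `makeBlurPanel` is just an illustrative helper:

```swift
import AppKit

// Illustrative helper: a borderless panel that blurs whatever is behind it,
// with the blur performed by the window server rather than by my own capture loop.
func makeBlurPanel(frame: NSRect) -> NSPanel {
    let panel = NSPanel(contentRect: frame,
                        styleMask: [.borderless, .nonactivatingPanel],
                        backing: .buffered,
                        defer: false)
    let effect = NSVisualEffectView(frame: NSRect(origin: .zero, size: frame.size))
    effect.autoresizingMask = [.width, .height]
    effect.blendingMode = .behindWindow // blur the content behind the panel
    effect.material = .hudWindow
    effect.state = .active
    panel.contentView = effect
    return panel
}
```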