
I want to record the full macOS screen, and do different things with the images, like

  • calculate average colours for different regions (see the sketch after this list)
  • re-render parts of the screen blurred in an NSPanel or NSWindow
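
For the average-colour part, this is roughly what I have in mind, using Core Image's CIAreaAverage reduction filter (the helper name and signature here are just illustrative):

import CoreImage

// Illustrative helper: average colour of `region` within a captured frame.
// CIAreaAverage reduces the region to a single pixel, which we read back.
func averageColor(of image: CIImage, in region: CGRect,
                  context: CIContext) -> (r: UInt8, g: UInt8, b: UInt8, a: UInt8)? {
    guard let filter = CIFilter(name: "CIAreaAverage", parameters: [
        kCIInputImageKey: image,
        kCIInputExtentKey: CIVector(cgRect: region)
    ]), let output = filter.outputImage else { return nil }

    var pixel = [UInt8](repeating: 0, count: 4)
    context.render(output, toBitmap: &pixel, rowBytes: 4,
                   bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                   format: .RGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())
    return (pixel[0], pixel[1], pixel[2], pixel[3])
}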

Right now I am using AVCaptureScreenInput like this:

import AVFoundation

// `avCaptureSession` is a stored property; the session must be kept alive.
let input = AVCaptureScreenInput(displayID: CGMainDisplayID()) // initializer is not failable, no force-unwrap needed
avCaptureSession.addInput(input)
let output = AVCaptureVideoDataOutput()
avCaptureSession.addOutput(output)
// `self` conforms to AVCaptureVideoDataOutputSampleBufferDelegate.
output.setSampleBufferDelegate(self, queue: .main)
avCaptureSession.startRunning()

The problem with this approach is that the frame rate of the delivered images seems to be pretty low; I would like to achieve 30fps or more.
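
A minimal sketch of how a higher rate can at least be requested, via the input's minFrameDuration and a dedicated delegate queue (assuming the same `input` and `output` as above; the queue label is illustrative):

// Ask AVCaptureScreenInput for 30fps explicitly; the default may be lower.
input.minFrameDuration = CMTime(value: 1, timescale: 30)

// Deliver sample buffers off the main queue so main-thread work (window
// moves, rendering) does not stall capture, and drop late frames instead
// of letting them queue up.
output.alwaysDiscardsLateVideoFrames = true
let sampleQueue = DispatchQueue(label: "screen.capture.samples")
output.setSampleBufferDelegate(self, queue: sampleQueue)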

What is a good approach to achieve these goals on macOS? On the one hand, fast fetching of the screen image; on the other, real-time processing and re-rendering of the captured image.

Issue #1

The above solution with the capture session leads to unacceptable lag: when moving a window, the re-rendered output is delayed by more than 100 ms. I would like this to happen in real time.

Martin Mlostek

1 Answer


AVCaptureScreenInput captures the screen post window-composition, so your blurred overlay would show up in subsequent frames and you'd be capturing and re-transforming your own effect, which is probably not what you want.

The closest thing to what you probably want is the CGWindowListCreateImage API, which allows you to exclude windows, including your own. However, it's not an ideal "realtime" interface: it has to be polled rather than delivering frames via callback, and the CGImage output is not timestamped and cannot be toll-free bridged to a texture.
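
A minimal sketch, assuming the blurred re-render lives in an NSPanel whose window number you pass in so that only the content below it is captured (the helper name is illustrative):

import Cocoa

// Illustrative helper: capture everything on screen *below* the given panel,
// so the blurred overlay is excluded from its own input.
func captureBelow(_ panel: NSWindow) -> CGImage? {
    let bounds = CGDisplayBounds(CGMainDisplayID()) // full main display, global coordinates
    return CGWindowListCreateImage(bounds,
                                   .optionOnScreenBelowWindow,
                                   CGWindowID(panel.windowNumber),
                                   [.bestResolution])
}

Since there is no push-style callback, you would poll this at your desired rate, e.g. from a CVDisplayLink or a timer, and timestamp the frames yourself.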

Rhythmic Fistman