
I am using a CADisplayLink to live-filter an image and show the result in an MTKView. All filters work fine until I try the blur filter: while it is active, the MTKView sometimes starts strobing, glitching, or just showing a black screen on some frames instead of the actual result image.

I have three interesting observations:

1) There is no such problem when I display the result image in a UIImageView, so the filter itself is not the cause of the problem.

2) If I switch back from blur to any other filter, the same problem starts happening in those filters too, but ONLY if I used the blur filter first.

3) The glitching slowly fades away the more I use the app. It even occurs less and less the more times I launch the app.

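For context, the display link is set up in a view controller and passes a time value into the filter each frame (see the comments below). A simplified sketch of that setup, with `applyCurrentFilter(to:time:)` standing in for the real filter code:

import UIKit
import CoreImage
import QuartzCore

// Simplified sketch of the view controller driving the MTKView.
// `applyCurrentFilter(to:time:)` is a placeholder for the real filter code.
class FilterViewController: UIViewController
{
    @IBOutlet weak var metalImageView: MetalImageView!

    var sourceImage: CIImage?
    var displayLink: CADisplayLink?
    var startTime: CFTimeInterval = 0

    override func viewDidLoad()
    {
        super.viewDidLoad()

        startTime = CACurrentMediaTime()
        displayLink = CADisplayLink(target: self, selector: #selector(step))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc func step(_ link: CADisplayLink)
    {
        guard let sourceImage = sourceImage else { return }

        // Pass the elapsed time into the filter so it can animate,
        // then hand the result to the MTKView.
        let time = Float(CACurrentMediaTime() - startTime)
        metalImageView.image = applyCurrentFilter(to: sourceImage, time: time)
    }

    // Placeholder; the real implementation applies whichever filter is selected.
    func applyCurrentFilter(to image: CIImage, time: Float) -> CIImage
    {
        return image
    }
}
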
Code for the MTKView:

import GLKit
import UIKit
import MetalKit
import QuartzCore

class MetalImageView: MTKView
{
    let colorSpace = CGColorSpaceCreateDeviceRGB()

    lazy var commandQueue: MTLCommandQueue =
    {
        [unowned self] in

        return self.device!.makeCommandQueue()!
    }()

    lazy var ciContext: CIContext =
    {
        [unowned self] in

        return CIContext(mtlDevice: self.device!)
    }()

    override init(frame frameRect: CGRect, device: MTLDevice?)
    {
        super.init(frame: frameRect,
            device: device ?? MTLCreateSystemDefaultDevice())

        if super.device == nil
        {
            fatalError("Device doesn't support Metal")
        }

        framebufferOnly = false
    }

    required init(coder: NSCoder)
    {
        fatalError("init(coder:) has not been implemented")
    }

    // from tutorial
    private func setup() {
        framebufferOnly = false
        isPaused = false
        enableSetNeedsDisplay = false
    }

    /// The image to display
    var image: CIImage?
    {
        didSet
        {

        }
    }

    override func draw()
    {
        // Bail out if there is nothing to draw or no drawable is available yet.
        guard let image = image,
            let drawable = currentDrawable,
            let commandBuffer = commandQueue.makeCommandBuffer() else
        {
            return
        }

        let targetTexture = drawable.texture
        let bounds = CGRect(origin: .zero, size: drawableSize)

        // Move the image to the origin and scale it to fit the drawable.
        let originX = image.extent.origin.x
        let originY = image.extent.origin.y

        let scaleX = drawableSize.width / image.extent.width
        let scaleY = drawableSize.height / image.extent.height
        let scale = min(scaleX, scaleY)

        let scaledImage = image
            .transformed(by: CGAffineTransform(translationX: -originX, y: -originY))
            .transformed(by: CGAffineTransform(scaleX: scale, y: scale))

        // Let Core Image encode its work into the command buffer,
        // then present the drawable and commit.
        ciContext.render(scaledImage,
            to: targetTexture,
            commandBuffer: commandBuffer,
            bounds: bounds,
            colorSpace: colorSpace)

        commandBuffer.present(drawable)
        commandBuffer.commit()

        super.draw()
    }
}


extension CGRect
{
    func aspectFitInRect(target: CGRect) -> CGRect
    {
        let scale: CGFloat =
        {
            let scale = target.width / self.width

            return self.height * scale <= target.height ?
                scale :
                target.height / self.height
        }()

        let width = self.width * scale
        let height = self.height * scale
        let x = target.midX - width / 2
        let y = target.midY - height / 2

        return CGRect(x: x,
            y: y,
            width: width,
            height: height)
    }
}

The code for the blur filter, a Core Image kernel written in Metal:

#include <metal_stdlib>
#include <CoreImage/CoreImage.h>

using namespace metal;

extern "C" { namespace coreimage {

    // Zone blur: averages several samples taken along the line between the
    // current pixel and the focus point. Note that the `time` parameter is
    // currently unused.
    float4 zoneBlur(sampler src, float time, float4 touch) {

        float focusPower = 2.0;
        int focusDetail = 10;

        float2 uv = src.coord();
        float2 fingerPos;
        float2 size = src.size();

        // Default the focus point to the centre until there is a touch.
        if (touch.x == 0 || touch.y == 0) {
            fingerPos = float2(0.5, 0.5);
        } else {
            fingerPos = touch.xy / size.xy;
        }

        float2 focus = uv - fingerPos;

        float4 outColor;
        outColor = float4(0, 0, 0, 1);

        // Accumulate samples progressively closer to the focus point.
        for (int i = 0; i < focusDetail; i++) {
            float power = 1.0 - focusPower * (1.0 / size.x) * float(i);
            outColor.rgb += src.sample(focus * power + fingerPos).rgb;
        }

        outColor.rgb *= 1.0 / float(focusDetail);
        return outColor;
    }

}}

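For context, loading and applying such a kernel typically looks roughly like this (a sketch with assumed names; the metallib name, helper functions and argument handling are assumptions, since that code isn't shown in the question):

import CoreImage

// Sketch of how the zoneBlur kernel might be loaded and applied.
// "default.metallib" and the helper names are assumptions.
func makeZoneBlurKernel() -> CIKernel?
{
    guard let url = Bundle.main.url(forResource: "default", withExtension: "metallib"),
        let data = try? Data(contentsOf: url) else
    {
        return nil
    }

    return try? CIKernel(functionName: "zoneBlur", fromMetalLibraryData: data)
}

func applyZoneBlur(to image: CIImage, time: Float, touch: CGPoint, kernel: CIKernel) -> CIImage?
{
    let touchVector = CIVector(x: touch.x, y: touch.y, z: 0, w: 0)

    // A general kernel needs a region-of-interest callback; the blur only
    // samples around the focus point, but the whole extent is used here
    // for simplicity.
    return kernel.apply(extent: image.extent,
        roiCallback: { _, _ in image.extent },
        arguments: [image, time, touchVector])
}
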
I wonder what could be causing such odd behaviour?

  • It sounds like something is off with the timing of your display refreshes and/or the synchronization between updating your image and drawing. What are you using the `CADisplayLink` for? – Frank Rupprecht Jul 20 '19 at 09:19
  • @FrankSchlegel With the `CADisplayLink` I pass a time variable each frame so the filter has an animation. I am setting up the `CADisplayLink` in a view controller – SmartTree Jul 21 '19 at 05:19
  • Can you move the application of the filter into the `draw` method and just set the `time` (which is not used in the kernel, btw) property of the filter in the display link callback? I guess due to the concurrency between `draw` and display link, the image you are trying to draw is sometimes no longer valid since you change the pipeline in the display link callback. – Frank Rupprecht Jul 21 '19 at 07:30
  • It is likely that your blur implementation is using up too much GPU time and causing glitches when the job cannot complete before the next refresh. Writing an effective blur is very difficult, so my suggestion would be to refactor your approach to make use of a built in Metal supplied blur in MPS. Use Metal directly, avoid using CoreImage in your render pipeline. – MoDJ Aug 01 '19 at 16:18
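
A sketch of the restructuring Frank Rupprecht suggests above: apply the filter inside draw() and only update its inputs from the display link callback (the `filter`, `inputImage` and `"inputTime"` names are assumptions, not part of the code above):

import CoreImage
import MetalKit

// Sketch: the display link callback only updates `time`, and the filter
// itself is applied right before rendering.
class FilteredMetalImageView: MetalImageView
{
    var inputImage: CIImage?
    var filter: CIFilter?
    var time: Float = 0   // updated from the CADisplayLink callback

    override func draw()
    {
        // Building the image here keeps it consistent with the drawable that
        // is about to be presented, instead of racing with the display link
        // callback.
        if let filter = filter, let inputImage = inputImage
        {
            filter.setValue(inputImage, forKey: kCIInputImageKey)
            filter.setValue(time, forKey: "inputTime")   // assumed custom key
            image = filter.outputImage
        }

        super.draw()
    }
}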

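And a sketch of the MPS route MoDJ mentions, replacing the hand-written blur with MPSImageGaussianBlur encoded directly into the command buffer (the device, command buffer and textures are assumed to come from the existing Metal setup):

import Metal
import MetalPerformanceShaders

// Sketch of blurring with MPSImageGaussianBlur instead of a hand-written kernel.
func encodeGaussianBlur(device: MTLDevice,
                        commandBuffer: MTLCommandBuffer,
                        sourceTexture: MTLTexture,
                        destinationTexture: MTLTexture,
                        sigma: Float)
{
    let blur = MPSImageGaussianBlur(device: device, sigma: sigma)

    // MPS encodes its own optimized compute passes into the command buffer.
    blur.encode(commandBuffer: commandBuffer,
                sourceTexture: sourceTexture,
                destinationTexture: destinationTexture)
}
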
0 Answers