
Thank you for taking the time to read my thread. Recently I've been developing a simple iOS free-draw app and have run into an issue that's currently above my skill level to solve. I've been scouring the internet for days trying to find a solution, but have had no luck so far. Fortunately, I have thought of a remedy for my application's lag issue; however, I still need help, as I really do not know how to implement it.

A brief description of how this part of the program operates:

As the user moves a finger across the screen (in a UIView subclass named view_Draw), touchesBegan() and touchesMoved() interpret start and end points as x/y coordinates and store them in an array (lines). The view is then forced to update via setNeedsDisplay().

import UIKit

// Line, lines and brushSize were not shown in my original post;
// they are declared here so the snippet is self-contained
struct Line {
    var start: CGPoint
    var end: CGPoint
    var color: UIColor
}

class view_Draw: UIView {

    // view_Draw class variables
    var lines: [Line] = []
    var lastPoint: CGPoint!
    var drawColor: UIColor = UIColor.redColor()
    var brushSize: CGFloat = 5.0

    required init(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        self.backgroundColor = UIColor.whiteColor()
    }

    override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
        lastPoint = touches.anyObject()?.locationInView(self)
    }

    override func touchesMoved(touches: NSSet, withEvent event: UIEvent) {
        let newPoint = touches.anyObject()?.locationInView(self)
        lines.append(Line(start: self.lastPoint, end: newPoint!, color: self.drawColor))
        self.lastPoint = newPoint
        setNeedsDisplay()
    }

    override func drawRect(rect: CGRect) {
        let context = UIGraphicsGetCurrentContext()

        // Set stroke style
        CGContextSetLineCap(context, kCGLineCapRound)

        // Set brush parameters
        CGContextSetStrokeColorWithColor(context, drawColor.CGColor)
        CGContextSetAlpha(context, 0.95)
        CGContextSetLineWidth(context, brushSize)

        // Every segment ever drawn is re-stroked on every update
        for line in lines {
            CGContextBeginPath(context)
            CGContextMoveToPoint(context, line.start.x, line.start.y)
            CGContextAddLineToPoint(context, line.end.x, line.end.y)
            CGContextStrokePath(context)
        }
    }
}

A brief description of the issue:

As the user continues to draw on the screen, I notice in Instruments that CPU usage on thread 1 reaches around 95%–100%. This causes elements of my program (timers, drawing responsiveness) to begin lagging.

Actions taken to remedy issue:

I've experimented with disabling setNeedsDisplay() and discovered that filling the lines array accounts for only about 10% of the overall CPU demand. From what I understand, this is because nothing in drawRect is being applied to the coordinates in the lines array.

Disabling CGContextStrokePath() while leaving setNeedsDisplay() enabled increases CPU demand to 49%. I've interpreted this to mean that the coordinates in the lines array are now being processed by drawRect, but are not actually being drawn onto the view.

This means that forcing setNeedsDisplay() to update with CGContextStrokePath() enabled hogs roughly 85%–90% of the available processing power of thread 1.

I've also experimented with adding a timer to control how often setNeedsDisplay() forces an update, but the results are less than acceptable: drawing feels choppy with this in place.
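For reference, the throttle I tried was conceptually along these lines (a sketch, not my exact code; the names and the 1/30 s interval are illustrative):

```swift
// Sketch of a time-based throttle: only allow a redraw if enough time
// has passed since the last one. The interval value is illustrative.
struct RedrawThrottle {
    let interval: Double
    var lastFire: Double = -Double.infinity

    mutating func shouldRedraw(now: Double) -> Bool {
        if now - lastFire >= interval {
            lastFire = now
            return true
        }
        return false
    }
}

// In touchesMoved one would gate setNeedsDisplay() behind
// throttle.shouldRedraw(now: CACurrentMediaTime()) -- the skipped
// intermediate updates are why the drawing feels choppy.
```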

Proposed remedy:

I think the principal issue is that setNeedsDisplay() is constantly redrawing the entire lines array (everything the user has drawn) while touchesMoved() is firing.
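One partial mitigation I've come across (not implemented in my code above; the names here are my own) is to invalidate only the rectangle covering the newest segment via setNeedsDisplayInRect(_:), so drawRect receives a small dirty rect instead of the whole view. Computing that rectangle is just min/max over the two endpoints, padded by the brush radius:

```swift
// Sketch: the smallest rectangle covering one new line segment,
// padded by half the brush size so the stroke edges are included.
// Plain structs are used here instead of CGPoint/CGRect for clarity.
struct Point { var x: Double; var y: Double }
struct Rect { var x: Double; var y: Double; var width: Double; var height: Double }

func dirtyRect(from start: Point, to end: Point, brushSize: Double) -> Rect {
    let pad = brushSize / 2.0
    let minX = min(start.x, end.x) - pad
    let minY = min(start.y, end.y) - pad
    let maxX = max(start.x, end.x) + pad
    let maxY = max(start.y, end.y) + pad
    return Rect(x: minX, y: minY, width: maxX - minX, height: maxY - minY)
}

// In touchesMoved, instead of setNeedsDisplay(), one would call
// setNeedsDisplayInRect(...) with this rectangle converted to a CGRect.
```

For this to actually help, drawRect would also need to skip segments that fall outside the rect it is handed; on its own it only shrinks the area the system composites.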

I have looked into using GCD to take some of the load off thread 1; however, after reading up on it, it seems this would not be a 'safe' approach. From what I've understood, GCD and/or dispatch_async shouldn't be used for code that directly interacts with UI elements.

I've seen on various forums that people have tackled similar issues by rendering the existing path into a bitmap and only drawing the newly generated path with setNeedsDisplay().

I'm hoping that with this approach, setNeedsDisplay() will not have to redraw the entire array on every update, since the previously drawn lines will have been flattened into a static image. However, I have run out of ideas on how to even start implementing this.

As you can probably tell, I started learning Swift only a few weeks ago. I am doing my best to learn and approach problems in an effective manner. If you have any suggestions on how I should proceed with this, it would be greatly appreciated. Again, thank you for your help.


Answer based upon Aky's Smooth Freehand Drawing on iOS tutorial

By implementing the following code, a "buffer" of sorts is created that lessens the load on thread 1. In my initial tests, the load topped out at 46% while drawing, as opposed to my original program's load, which would top out at 95%–100%.

LinearInterpView.swift

import UIKit

class LinearInterpView: UIView {

    var path = UIBezierPath()                                               //(3)

    required init(coder aDecoder: NSCoder) {                                //(1)
        super.init(coder: aDecoder)
        self.multipleTouchEnabled = false                                   //(2)
        self.backgroundColor = UIColor.whiteColor()
        path.lineWidth = 40.0
    }

    override func drawRect(rect: CGRect) {                                  //(5)
        UIColor.blackColor().setStroke()
        path.stroke()
    }

    override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
        let touch = touches.anyObject() as UITouch
        let p = touch.locationInView(self)
        path.moveToPoint(p)
    }

    override func touchesMoved(touches: NSSet, withEvent event: UIEvent) {
        let touch = touches.anyObject() as UITouch
        let p = touch.locationInView(self)
        path.addLineToPoint(p)                                              //(4)
        setNeedsDisplay()
    }

    override func touchesEnded(touches: NSSet, withEvent event: UIEvent) {
        touchesMoved(touches, withEvent: event)
    }

    override func touchesCancelled(touches: NSSet!, withEvent event: UIEvent!) {
        touchesEnded(touches, withEvent: event)
    }
}

CachedLIView.swift

import UIKit

class CachedLIView: UIView {

    var path = UIBezierPath()
    var incrementalImage = UIImage()                                        //(1)
    var firstRun = true

    required init(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        self.multipleTouchEnabled = false
        self.backgroundColor = UIColor.whiteColor()
        path.lineWidth = 40.0
    }

    override func drawRect(rect: CGRect) {
        incrementalImage.drawInRect(rect)                                   //(3)
        path.stroke()
    }

    override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
        let touch = touches.anyObject() as UITouch
        let p = touch.locationInView(self)
        path.moveToPoint(p)
    }

    override func touchesMoved(touches: NSSet, withEvent event: UIEvent) {
        let touch = touches.anyObject() as UITouch
        let p = touch.locationInView(self)
        path.addLineToPoint(p)
        self.setNeedsDisplay()
    }

    override func touchesEnded(touches: NSSet, withEvent event: UIEvent) {  //(2)
        let touch = touches.anyObject() as UITouch
        let p = touch.locationInView(self)
        path.addLineToPoint(p)
        self.drawBitmap()                                                   //(3)
        self.setNeedsDisplay()
        path.removeAllPoints()                                              //(4)
    }

    override func touchesCancelled(touches: NSSet!, withEvent event: UIEvent!) {
        touchesEnded(touches, withEvent: event)
    }

    // Flattens everything drawn so far, plus the current path, into
    // incrementalImage so drawRect only strokes the in-progress path
    func drawBitmap() {                                                     //(3)
        UIGraphicsBeginImageContextWithOptions(self.bounds.size, true, 0.0)
        UIColor.blackColor().setStroke()

        if firstRun {
            // Paint an opaque white background the first time through
            let rectPath = UIBezierPath(rect: self.bounds)
            UIColor.whiteColor().setFill()
            rectPath.fill()
            firstRun = false
        }

        incrementalImage.drawAtPoint(CGPointZero)
        path.stroke()
        incrementalImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
    }
}

Thank you again for your help, I hope this can be of use to others as well.

  • Hi, I was wondering if you can post the translated swift project to github? –  May 29 '16 at 20:49

1 Answer


I wrote a couple of tutorials a few years ago about drawing in iOS using Objective-C. You might be able to translate this code into Swift without too many problems.

The first article talks about how you can render the on-screen graphics context into an image every time the user lifts a finger from the screen, so you can generate a fresh path from that point onwards and just draw it on top of that image.

Smooth Freehand Drawing on iOS

The second one has a GCD-based implementation that lets you do this rendering in the background.

Advanced Freehand Drawing Techniques
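The core of that GCD approach, sketched in current Swift syntax (illustrative names, not the tutorial's exact code): do the expensive bitmap rendering on a background queue, then hand the finished image back on a designated queue for display, so touch handling never blocks.

```swift
import Dispatch

// Sketch: heavy rendering runs off the main queue; the finished
// result is delivered on callbackQueue (the main queue in a real app,
// since UIKit must only be touched from the main thread).
// `render` stands in for drawing the accumulated path into a bitmap.
func renderOffMainQueue<T>(callbackQueue: DispatchQueue,
                           render: @escaping () -> T,
                           completion: @escaping (T) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        let image = render()        // expensive work, off the UI thread
        callbackQueue.async {
            completion(image)       // cheap: store image, setNeedsDisplay()
        }
    }
}
```

The key point is that only the final hand-off touches UI state, which is what makes the pattern safe despite the general rule about GCD and UI elements.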

Aky
  • Thank you Aky, I will work towards converting this code into a Swift format and reply with my results later. This is a very promising lead and a very well written tutorial. – kazzicopter Feb 18 '15 at 14:41
  • I just did a search and it seems that someone has done a Swift implementation based on the code from the first tutorial: https://github.com/adamskyle/KADSwiftDrawing/tree/master/SwiftDrawing Don't know how well it works, but there you go. Good luck! – Aky Feb 18 '15 at 16:48
  • Aky, I've just completed converting your tutorial to a Swift format. I thought that you may like a copy so that you don't have to write it all over again. I'll post it on the original question as I do not have enough space to post it within the comment box. Thank you again, it works great and is a critical step in the right direction for my app. – kazzicopter Feb 18 '15 at 17:48