I'm building an app that provides an editable canvas, similar to Photoshop or Illustrator (currently using SpriteKit). As you might imagine, I'm encountering performance issues when displaying a large number of nodes (1000+).
Currently, I've got an SKScene with SKShapeNodes on it. They don't have any images yet and are simply filled with a color. Their shape paths (CGPaths) vary from circles to Bézier paths. Each shape currently has an SKPhysicsBody with the same path as the rendered shape, which is used to detect taps.
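For reference, each node is created roughly like this (a simplified sketch; the real code builds a different CGPath per shape, and the colors are placeholders):

```swift
import SpriteKit

// Simplified version of how each shape node is built today.
func makeShapeNode(path: CGPath, fill: SKColor) -> SKShapeNode {
    let node = SKShapeNode(path: path)
    node.fillColor = fill
    node.strokeColor = .clear

    // Physics body with the same path as the rendered shape,
    // used only so taps on the node can be detected.
    node.physicsBody = SKPhysicsBody(polygonFrom: path)
    node.physicsBody?.isDynamic = false
    return node
}
```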
The performance issues can be described by:
- slowness when adding 1000 nodes (circles); each node uses about 0.1 MB of memory
- slowness when moving 1000 nodes (circles)
- slowness when generating a texture from 1000 nodes (circles)
Disabling the physics bodies doesn't substantially improve performance, but it does reduce CPU load (it drops from a constant ~60% to about 1%).
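For context, the texture-generation step mentioned above flattens all of the shape nodes into a single texture. Simplified (the container node name is a placeholder):

```swift
// Sketch of the flattening step: render all shape nodes under one
// container node into a single SKTexture using the SKView.
func flattenCanvas(in view: SKView, canvasNode: SKNode) -> SKSpriteNode? {
    guard let texture = view.texture(from: canvasNode) else { return nil }
    return SKSpriteNode(texture: texture)
}
```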
Most users will not be working with 1000 nodes, but I'd like to implement the optimal solution.
What I'd like is to have two layers (a rough sketch of what I'm imagining follows the list):
- A render layer on which I'd like to be able to render CGPaths with strokes and fills (preferably choosing the stroke end cap style, among other little things)
- An interaction layer on which I'd like to be able to detect taps inside CGPaths and stroke CGPaths with a color to indicate highlighting
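Here's a rough sketch of what I'm imagining for the interaction layer: keep the CGPaths around and hit-test them directly instead of going through SKPhysicsBody. The CanvasItem type and the highlight color are hypothetical placeholders, and it assumes the paths are stored in scene coordinates:

```swift
import SpriteKit

// Hypothetical model: one entry per shape on the canvas.
struct CanvasItem {
    let path: CGPath        // path in scene coordinates
    let node: SKShapeNode   // the node currently rendering this shape
}

final class CanvasScene: SKScene {
    var items: [CanvasItem] = []

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let location = touches.first?.location(in: self) else { return }

        // Test the tap point against each stored CGPath directly,
        // rather than relying on a physics body per node.
        for item in items where item.path.contains(location) {
            // Highlight by stroking the tapped path with a color.
            item.node.strokeColor = .yellow
            item.node.lineWidth = 2
            break
        }
    }
}
```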
How can I accomplish this (or something similar) to improve the speed at which I can render 1000 circles?