
UPDATE

I got around Core Graphics' limitations by drawing everything with OpenGL. There are still some glitches, but so far it's working much, much faster.

Some interesting points:

  • GLKView: an iOS-specific view that takes care of setting up the OpenGL ES context and the rendering loop for you (see the first sketch after this list). If you're not on iOS, I'm afraid you're on your own.
  • Shader precision: in the current version of OpenGL ES (2.0), fragment shaders are only guaranteed mediump precision, roughly a 16-bit float. That was too low for my purposes, so I emulated 32-bit arithmetic with pairs of 16-bit variables (see the second sketch after this list).
  • GL_LINES: OpenGL ES can natively draw simple lines. Not very well (no joins, no caps, see the purple/grey line at the top of the screenshot below), but to improve on that you'll have to write a custom shader, convert each line into a triangle strip, and pray that it works! (Supposedly that's how browsers do it when they tell you that Canvas2D is GPU-accelerated.)

[Screenshot: example rendering]

  • Draw as little as possible. It sounds obvious, but you can frequently avoid rendering things that are, for instance, outside the viewport.
  • OpenGL ES has no support for filled polygons, so you have to tessellate them yourself. Consider using iPhone-GLU: it's a port of the Mesa code and it's pretty good, although it's a little hard to use (no standard Objective-C interface).
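
For reference, here is a minimal sketch of the GLKView/GL_LINES setup mentioned in the first and third points above. This is not my actual code: the class name, vertex data, colors and projection are placeholders, and it assumes the controller's view is a GLKView (e.g. configured in the storyboard). It uses GLKBaseEffect, so no custom shader is needed for flat lines.

#import <GLKit/GLKit.h>
#import <OpenGLES/ES2/gl.h>

// Minimal sketch, not my actual renderer: a GLKViewController subclass that
// draws one line segment with GL_LINES through GLKBaseEffect.
@interface LineRendererViewController : GLKViewController
@end

@implementation LineRendererViewController {
    GLKBaseEffect *_effect;
    GLfloat _vertices[4];   // one segment: x0, y0, x1, y1
}

- (void)viewDidLoad {
    [super viewDidLoad];

    // GLKView owns the framebuffer and the CAEAGLLayer for us.
    GLKView *view = (GLKView *)self.view;
    view.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    [EAGLContext setCurrentContext:view.context];

    // GLKBaseEffect provides a stock shader; draw in blue on a white background.
    _effect = [[GLKBaseEffect alloc] init];
    _effect.useConstantColor = GL_TRUE;
    _effect.constantColor = GLKVector4Make(0, 0, 1, 1);
    _effect.transform.projectionMatrix =
        GLKMatrix4MakeOrtho(0, view.bounds.size.width,
                            view.bounds.size.height, 0, -1, 1);

    // Placeholder geometry.
    _vertices[0] = 10;  _vertices[1] = 10;   // segment start
    _vertices[2] = 300; _vertices[3] = 200;  // segment end
}

// GLKViewController calls this on every frame of its rendering loop.
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect {
    glClearColor(1, 1, 1, 1);
    glClear(GL_COLOR_BUFFER_BIT);

    [_effect prepareToDraw];

    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glVertexAttribPointer(GLKVertexAttribPosition, 2, GL_FLOAT, GL_FALSE, 0, _vertices);

    // GL_LINES: fast, but no joins and no caps -- exactly the limitation above
    // (and widths above 1px aren't guaranteed on every GPU).
    glLineWidth(2.0);
    glDrawArrays(GL_LINES, 0, 2);

    glDisableVertexAttribArray(GLKVertexAttribPosition);
}

@end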
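
The 32-bit emulation itself boils down to storing each number as a pair of floats (value = hi + lo) and carrying the rounding error along, the classic "double-single" trick. Below is a rough sketch of what the addition routine can look like; it is not my actual shader: the name ds_add is made up, the GLSL is embedded in an NSString so it can be handed to glShaderSource, and it only behaves correctly if the driver doesn't reorder the arithmetic.

static NSString *const kDSAddShaderSource =
    @"// value = a.x + a.y, where a.x holds the coarse part and a.y the residual\n"
    @"vec2 ds_add(vec2 a, vec2 b) {\n"
    @"    float s = a.x + b.x;                    // coarse sum of the high parts\n"
    @"    float v = s - a.x;\n"
    @"    float e = (a.x - (s - v)) + (b.x - v);  // rounding error of s\n"
    @"    float t = e + a.y + b.y;                // fold in the low-order parts\n"
    @"    vec2 r;\n"
    @"    r.x = s + t;                            // renormalize the pair\n"
    @"    r.y = t - (r.x - s);\n"
    @"    return r;\n"
    @"}\n";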

Original Question

I'm trying to draw lots of CGPaths (typically more than 1000) in the drawRect: method of my scroll view, which is refreshed when the user pans with a finger. I have the same application in JavaScript for the browser, and I'm trying to port it to a native iOS app.

The iOS test code is (with 100 line operations, path being a pre-made CGMutablePathRef):

- (void) drawRect:(CGRect)rect {
    // Start the timer
    BSInitClass(@"Renderer");
    BSStartTimedOp(@"Rendering");

    // Get the context
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);
    CGContextSetFillColorWithColor(context, [[UIColor redColor] CGColor]);
    CGContextSetStrokeColorWithColor(context, [[UIColor blueColor] CGColor]);
    CGContextTranslateCTM(context, 800, 800);

    // Draw the points
    CGContextAddPath(context, path);
    CGContextStrokePath(context);

    // Display the elapsed time
    BSEndTimedOp(@"Rendering");
}

In JavaScript, for reference, the code is (with 10000 line operations):

window.onload = function() {
  var canvas = document.getElementById("test");
  var ctx = canvas.getContext("2d");

  // Prepare the points before drawing
  var data = [];
  for (var i = 0; i < 100; i++) data.push({x: Math.random() * canvas.width, y: Math.random() * canvas.height});

  // Draw those points 100 times over (10000 lineTo calls), and write the elapsed time
  var __start = new Date().getTime();
  for (var i = 0; i < 100; i++) {
    for (var j = 0; j < data.length; j++) {
      var d = data[j];
      if (j == 0) ctx.moveTo(d.x, d.y);
      else ctx.lineTo(d.x, d.y);
    }
  }
  ctx.stroke();
  document.write("Finished in " + (new Date().getTime() - __start) + "ms");
};

Now, I'm much more proficient at optimizing JavaScript than I am at iOS, but after some profiling, it seems that CGPath's overhead is incredibly bad compared to JavaScript. Both snippets run at about the same speed on a real iOS device, even though the JavaScript code performs 100x as many line operations as the Quartz 2D code!

EDIT: Here is the top of the time profile in Instruments:

Running Time           Self      Symbol Name
6487.0ms    77.8%      6487.0    aa_render
 449.0ms     5.3%       449.0    aa_intersection_event
 112.0ms     1.3%       112.0    CGSColorMaskCopyARGB8888
  73.0ms     0.8%        73.0    objc::DenseMap<objc_object*, unsigned long, true, objc::DenseMapInfo<objc_object*>, objc::DenseMapInfo<unsigned long> >::LookupBucketFor(objc_object* const&, std::pair<objc_object*, unsigned long>*&) const
  69.0ms     0.8%        69.0    CGSFillDRAM8by1
  66.0ms     0.7%        66.0    ml_set_interrupts_enabled
  46.0ms     0.5%        46.0    objc_msgSend
  42.0ms     0.5%        42.0    floor
  29.0ms     0.3%        29.0    aa_ael_insert

It is my understanding that this should be much faster on iOS, simply because the code is native... So, do you know:

  • ...what I am doing wrong here?
  • ...and if there's another, better solution to draw that many lines in real-time?

Thanks a lot!

F.X.
  • What is instruments telling you? – cweinberger Jul 12 '12 at 20:30
  • Just added the output to the post. About 80% of the time is spent in `aa_render`... – F.X. Jul 12 '12 at 20:35
  • Try drawing it in a non-scroll view to see how slow it is. UIScrollView sometimes makes drawing very slow. If that's the case, you might be able to do your drawing in a plain UIView and then place that UIView into the scroll view. Also, make sure the view is opaque if possible. – EricS Jul 12 '12 at 20:48
  • You say it's the drawRect method of your scroll view, and it's refreshed when the user pans. Does that mean you're calling setNeedsDisplay each time the touch moves? – rob mayoff Jul 12 '12 at 20:49
  • @rob: Yes, the implementation is at a very early stage. I know that's not very good performance-wise, but it's the simplest way to update my view, and it should at least perform as well as, or better than, JavaScript, not that much slower. – F.X. Jul 12 '12 at 20:56
  • You're drawing the canvas once in JavaScript. You're trying to do it up to 60 times per second in Objective-C. The JavaScript canvas 2D context is almost certainly built on top of CGBitmapContext in Safari on Mac and iOS. – rob mayoff Jul 12 '12 at 21:27
  • @rob: I know, and this is why I'm baffled. When I was talking about the timings, I was talking about one frame. One frame with drawing 100 times the same path containing 100 points takes ~1.3s to render in ObjC and ~20ms in JavaScript. Perhaps the anti-aliasing or some features that are only present in ObjC? – F.X. Jul 12 '12 at 21:33
  • Rather than getting into iOS drawing, I'd say put your JavaScript in a UIWebView and build your application like that – 2cupsOfTech Jul 31 '12 at 08:38
  • Oh, yeah, forgot to update the question. I switched to a direct OpenGL rendering solution. It took me one week to put it up and running, but now it works, even if there are some glitches. The point of porting it to iOS was having something _fast_, so a UIWebView is not an option... – F.X. Jul 31 '12 at 17:27
  • Care to share the GL code you created as a replacement? – Jody Hagins Aug 15 '12 at 19:33
  • @JodyHagins: It's a pretty big code now, but it's standard OpenGL ES as I'm using the `GL_LINE` primitive (for now). Are you interested in something in particular? – F.X. Aug 16 '12 at 09:15
  • OpenGL is about the only area where I have done literally nothing. I have written an app that does lots of drawing and was just interested in seeing what you came up with. – Jody Hagins Aug 16 '12 at 12:41
  • @JodyHagins: I'll update my question with some references to the main issues I encountered, that will probably interest you more than the actual code. You can find the basic stuff in any OpenGL-related website, but I don't think I can disclose the full code for the most advanced stuff. – F.X. Aug 16 '12 at 23:38

2 Answers


As you described in your question, using OpenGL is the right solution. In theory you can emulate any kind of 2D drawing with OpenGL, but you need to implement all the shape algorithms yourself. For example, you need to extend the corners of lines yourself: there's no real concept of lines in OpenGL. Its line drawing is more of a utility feature, used almost exclusively for debugging. You should treat everything as a set of triangles.
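
To make that concrete, here is a rough sketch (the helper is made up for illustration, using GLKit's vector functions) of expanding one segment into the two triangles of a thick quad:

#import <GLKit/GLKit.h>   // GLKVector2 math helpers

// Expands the segment (a, b) into a quad of the given thickness, written as
// two triangles (6 vertices) into `out`. Joins and caps still have to be
// handled separately where segments meet.
static void LineSegmentToTriangles(GLKVector2 a, GLKVector2 b,
                                   float thickness, GLKVector2 out[6]) {
    // Unit vector perpendicular to the segment, scaled to half the thickness.
    GLKVector2 dir    = GLKVector2Normalize(GLKVector2Subtract(b, a));
    GLKVector2 normal = GLKVector2MultiplyScalar(GLKVector2Make(-dir.y, dir.x),
                                                 thickness * 0.5f);

    GLKVector2 a0 = GLKVector2Add(a, normal), a1 = GLKVector2Subtract(a, normal);
    GLKVector2 b0 = GLKVector2Add(b, normal), b1 = GLKVector2Subtract(b, normal);

    // Two triangles covering the quad a0-b0-b1-a1.
    out[0] = a0; out[1] = a1; out[2] = b0;
    out[3] = a1; out[4] = b1; out[5] = b0;
}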

I believe 16-bit floats are enough for most drawings. If you're using coordinates with large values, consider dividing the space into multiple sectors to keep the numbers small (a rough illustration follows below). A float's precision degrades as its magnitude becomes very large or very small.
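
As a rough illustration of the sector idea (the names below are made up), you can subtract a per-sector origin in double precision on the CPU so that only small offsets ever reach the GPU as floats:

// Keeps GPU-side coordinates small by expressing each point relative to the
// origin of the sector it belongs to. The subtraction runs in double
// precision on the CPU; only the small offsets are stored as floats.
typedef struct { float x, y; } SectorLocalPoint;

static SectorLocalPoint MakeSectorLocalPoint(double worldX, double worldY,
                                             double sectorOriginX, double sectorOriginY) {
    SectorLocalPoint p = { (float)(worldX - sectorOriginX),
                           (float)(worldY - sectorOriginY) };
    return p;
}

The sector origin itself can then be applied on the GPU, for example through the modelview matrix.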

Update

I think you will run into this issue soon if you try to display UIKit content over the OpenGL view. Unfortunately, I haven't found a solution for it yet.

eonil
  • Accepting for the sake of posterity, OpenGL is the way to go in the future, but I don't know when that'll be for us, as priorities kinda shifted away from iOS for the time being :) – F.X. Oct 01 '12 at 19:13

You killed CGPath performance by using CGContextAddPath.

Apple explicitly says this will run slowly; if you want it to run fast, you are required to attach your CGPath objects to CAShapeLayer instances.

You're doing dynamic, runtime drawing, which blocks all of Apple's performance optimizations. Try switching to CALayer, and especially CAShapeLayer, and you should see performance improve by a large amount (a short sketch follows below).

(NB: there are other performance bugs in CG rendering that might affect this use case, such as obscure default settings in CG/Quartz/CA, but you need to get rid of the CGContextAddPath bottleneck first.)
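
A minimal sketch of that approach, reusing the `path` and colors from the question (the frame and the pan handling are placeholders, not a drop-in implementation):

#import <QuartzCore/QuartzCore.h>

// Build the path once, hand it to a CAShapeLayer, and let Core Animation
// rasterize and composite it instead of re-stroking in drawRect:.
CAShapeLayer *shapeLayer = [CAShapeLayer layer];
shapeLayer.path        = path;                        // the pre-made CGMutablePathRef
shapeLayer.strokeColor = [UIColor blueColor].CGColor;
shapeLayer.fillColor   = [UIColor redColor].CGColor;
shapeLayer.lineWidth   = 2.0;
shapeLayer.frame       = self.view.bounds;            // placeholder geometry

[self.view.layer addSublayer:shapeLayer];

// Panning then becomes a matter of moving the layer (or its superlayer)
// rather than calling setNeedsDisplay on every touch event.
shapeLayer.position = CGPointMake(shapeLayer.position.x + 10,
                                  shapeLayer.position.y);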

Adam