
I'm working on an iPad app that displays lightmapped scenes. Loading the 20 or so 1Kx1K textures that are involved is taking a while, and when I started timing the various operations I found it was taking slightly less than 1/2 second per texture.

It turns out that loading a texture image from the filesystem is pretty fast, and that the bottleneck is in copying the UIImage into a CGContext in order to pass the pixel data to glTexImage2D().

I've tried two different ways of making the copy:

    CGContextSetInterpolationQuality(textureCopyContext, kCGInterpolationNone);
    CGContextDrawImage( textureCopyContext, CGRectMake( 0, 0, width, height ), image);

and

    UIGraphicsPushContext(textureCopyContext) ;
    [uiImage drawInRect:CGRectMake(0, 0, width, height)] ;
    UIGraphicsPopContext() ;

and both take about 0.45 seconds. This strikes me as excessive, even for a relatively underpowered device.
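For context, here's roughly what the surrounding setup looks like (simplified; the exact CGBitmapContextCreate parameters and pixel format in my code may differ):

    // Sketch of the full path the two copies above sit in. The pixel format
    // (8 bits per channel, premultiplied RGBA) is an assumption.
    GLubyte *pixels = (GLubyte *)calloc(width * height * 4, sizeof(GLubyte));
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef textureCopyContext =
        CGBitmapContextCreate(pixels, width, height, 8, width * 4,
                              colorSpace, kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);

    // ... one of the two copy variants shown above runs here ...

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    CGContextRelease(textureCopyContext);
    free(pixels);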

I'm relatively new to iOS development, so I just want to ask whether the times I'm seeing are reasonable, or whether they can be improved.

Update: I'm aware of the PVRTC alternative, but for now I've got to stick with PNGs. However, there is an excellent summary of the pros and cons of PVRTC in this answer. The same answer also hints at why PNGs result in such long texture setup times -- "internal pixel reordering". Can anybody confirm this?

brainjam

2 Answers


Switching texture context has traditionally been expensive, dating back to desktop GPUs (it's a lot faster on modern hardware). You could try using a texture atlas; depending on how big your textures are, this is the most efficient approach. A texture atlas packs several textures into one. I believe the iPad can load 2048x2048 textures, so you could squash four of your 1024x1024 textures together, as sketched below.
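A minimal sketch of the idea, packing four 1024x1024 images into one 2048x2048 atlas with glTexSubImage2D (the offset tables and the pixels[i] buffers are just placeholders):

    // Allocate a 2048x2048 atlas, then copy each 1024x1024 image into a quarter.
    GLuint atlas;
    glGenTextures(1, &atlas);
    glBindTexture(GL_TEXTURE_2D, atlas);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 2048, 2048, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);      // storage only, no data yet

    const GLint xoff[4] = { 0, 1024, 0, 1024 };
    const GLint yoff[4] = { 0, 0, 1024, 1024 };
    for (int i = 0; i < 4; i++) {
        glTexSubImage2D(GL_TEXTURE_2D, 0, xoff[i], yoff[i], 1024, 1024,
                        GL_RGBA, GL_UNSIGNED_BYTE, pixels[i]);
    }
    // Texture coordinates then need remapping into each quarter of the atlas.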

The other alternative is PVRTC texture compression; you can reduce file size by about 25% depending on quality. The PowerVR chip keeps the texture compressed on the device, so it saves time AND bandwidth when copying. It can look lossy at lower settings, but for 3D textures it is the best option, whereas 2D sprites prefer the first approach.
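Roughly what the upload looks like with a pre-compressed PVRTC file (a sketch; the file name, the 4 bpp RGB format, and the assumption that the file holds raw PVRTC data with no header are illustrative):

    // Upload raw PVRTC data directly; the GPU keeps it compressed.
    NSString *path = [[NSBundle mainBundle] pathForResource:@"lightmap0"
                                                     ofType:@"pvr"];
    NSData *data = [NSData dataWithContentsOfFile:path];
    GLsizei dim = 1024;
    GLsizei byteCount = MAX(dim * dim / 2, 32);   // 4 bpp PVRTC size rule
    glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                           GL_COMPRESSED_RGB_PVRTC_4BPPV1_IMG,
                           dim, dim, 0, byteCount, [data bytes]);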

Let me know if you need more clarification. Don't forget that PNG files are compressed on disk and get expanded into a full-size pixel buffer when loaded, which is a lot bigger: a 1024x1024 RGBA image, for example, decompresses to 4 MB of raw pixels.

Mitchell Currie
  • What you say about switching texture context is true, but has nothing to do with my question, which is about texture setup. But +1 for mentioning PVRTC, although at the moment I can't take advantage of it (see my update). – brainjam Aug 10 '11 at 02:45
  • I have a solution for you, but can't discuss GLKit as it's under NDA; if you are an iOS dev member, check it out in the iOS 5 SDK. It's a one-line function to load a texture from a file into a context.... Got it. Why don't you run it through Instruments and examine whether you are doubling up on allocation between CoreGraphics and OpenGL, or try examining the call time. Using CoreGraphics contexts to load PNG images is expensive. If you're actually copying from the image into the textureCopyContext then there are two buffers, where you only want one. – Mitchell Currie Aug 10 '11 at 04:08
  • You're right, shouldn't be talking about stuff under NDA. As a dev member I can look it up. When this stuff is no longer under NDA, I suggest you update your answer. – brainjam Aug 10 '11 at 18:37
  • I've finally gotten around to trying GLKTextureLoader. It takes 96% of the time of the traditional method, which is hardly a blazing improvement. And the traditional method still allows other things such as re-scaling textures. The comparison was done with mipmapped textures. Nevertheless, if you are doing vanilla stuff, GLKTextureLoader requires way less code. – brainjam Apr 12 '12 at 22:11
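For reference, the GLKTextureLoader call mentioned in the comments above looks roughly like this on iOS 5 and later (the file name and the mipmap option are illustrative):

    // Load a PNG straight into a GL texture with GLKit (iOS 5+).
    NSError *error = nil;
    NSString *path = [[NSBundle mainBundle] pathForResource:@"lightmap0"
                                                     ofType:@"png"];
    NSDictionary *options =
        [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                    forKey:GLKTextureLoaderGenerateMipmaps];
    GLKTextureInfo *info = [GLKTextureLoader textureWithContentsOfFile:path
                                                               options:options
                                                                 error:&error];
    glBindTexture(info.target, info.name);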

I'd start by taking a look at how it's done in https://github.com/cocos2d/cocos2d-iphone/blob/develop/cocos2d/CCTexture2D.m. I hope there's something in there that helps. There they're doing glTexImage2D straight from the image data, no CGContext involved.
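For illustration, going "straight from the image data" could look something like this; it assumes the decoded pixel format and row stride already match what OpenGL expects, which isn't guaranteed for an arbitrary PNG:

    // Hand the decoded bytes to glTexImage2D without an intermediate CGContext copy.
    CGImageRef cgImage = uiImage.CGImage;
    CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                 (GLsizei)CGImageGetWidth(cgImage),
                 (GLsizei)CGImageGetHeight(cgImage),
                 0, GL_RGBA, GL_UNSIGNED_BYTE, CFDataGetBytePtr(pixelData));
    CFRelease(pixelData);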

Graham Perks
  • Thanks, but it looks like they are still using the same method to get from a UIImage* to a CGContext. See https://github.com/cocos2d/cocos2d-iphone/blob/develop/cocos2d/CCTexture2D.m#L328 – brainjam Aug 10 '11 at 02:29