
I am beginning a Spherical Harmonics shader project for an iOS app I am writing. I have begun by reading this excellent in-depth paper on the subject (PDF) - http://bit.ly/aQmax3.

The paper describes a scene pre-processing step that involves ray-casting. Can someone describe how ray-casting can be performed using GLSL on iOS?

Nicol Bolas
dugla

1 Answer


If you're referring to the ray casting that process uses to determine which surfaces are hit by ambient light (for the ambient occlusion shading), I've done that within my Molecules iOS application. I talk a little about the process in this blog post and in this paper I submitted for the Nanotech conference, but I can expand on that here.

In my case, I use depth testing to determine the areas on a surface hit by ambient light at various orientations. I take my model and render it at a series of orientations that correspond to someone looking at it from an evenly distributed set of points on a sphere surrounding the object. For each point on the object's surface, I determine whether or not that point is visible from that orientation by testing its transformed Z position against the depth calculated for the visible surface at that point.

I track this visibility over the many orientations of the object through the use of an ambient occlusion intensity texture. I use a custom shader to map between locations on this texture and positions on the surface of my object. If the position is visible at that orientation, I write out a grey value of 1 / (number of orientations); if it is hidden, I write out black. I then use an additive blend mode to accumulate these values in order to determine which surfaces were hit the most times by ambient light.

My ambient occlusion textures look something like the following:

Ambient occlusion texture

In this case, I was mapping a series of spheres as part of a molecular model, so each rectangle in that texture is the surface area of a sphere, flattened using a mapping function. When wrapped around the actual model, these values look like this:

Mapped ambient occlusion values

Also, for determining the depth values of my object at various orientations, I had to create my own depth writing function that calculates per-pixel depth values and stores those into a custom depth texture. I then read from this texture when determining whether ambient light hits a point in an orientation or not. If I remember correctly, you can't read from the depth buffer directly on iOS devices, so you may need to do something similar for your model, as well.

This doesn't cover the reflected diffuse lighting case, but it at least describes the means by which I did my ambient occlusion shading. The source code for this application is available at the above link, if you wish to dig into it and see how all this works in practice.

Brad Larson
  • Whoa. Very impressive. Thanks for this, Brad. I'll dig into the paper and I'm sure I'll have further questions. Cheers. – dugla Apr 27 '12 at 22:18
  • Brad, regarding the ambient occlusion pass. As I understand it, you use a square screen-aligned card corresponding to the screen-space bounding box of each spherical atom, correct? You map sphere normals to the square, then do a "ray-cast" through the hemisphere of directions above each sphere, correct? How long does this take for, say, the molecule shown above? It sounds like this would work for 3D polygonal meshes as well. – dugla Apr 27 '12 at 23:49
  • The calculations depend on the number of spheres and cylinders in the model, but the above model takes ~2 seconds to render the entire ambient occlusion pass on an iPhone 4 (much faster on 4S, iPad 2 and 3), which consists of 22 different orientations. A single orientation therefore takes approximately one tenth of a second to run this particular shader. I could see how this could be extended to other object types, but the particular shader I use is tuned to these sphere and cylinder impostors. – Brad Larson Apr 28 '12 at 00:14