
Let's say I'm using raymarching to render a field function. (This is on the CPU, not the GPU.) I have an algorithm like this crudely-written pseudocode:

pixelColour = black;
pixelTransmittance = 1.0;
t = initialStep;
while (t < max_view_distance) {
  point = rayStart + t*rayDirection;
  emission, absorption = sampleFieldAt(point);
  pixelColour, pixelTransmittance =
      integrate(pixelColour, pixelTransmittance, emission, absorption);
  t = t * stepFactor;
}
return pixelColour;

The logic is all really simple... but how does integrate() work?

Each sample actually represents a volume in my field, not a point, even though the sample is taken at a point; therefore the effect on the final pixel colour will vary according to the size of the volume.

I don't know how to do this. I've had a look around, but while I've found lots of code which does it (usually on Shadertoy), it all does it differently and I can't find any explanations of why. How does this work, and more importantly, what magic search terms will let me look it up on Google?

David Given
    ...after spending quite some time doing research on the web, I got really excited for a moment when I found what looked like a really promising article on the subject. Turned out it was this question. Bah. – David Given May 27 '14 at 21:29

1 Answer


It's the Beer-Lambert law, which governs extinction through homogeneous participating media. No wonder I was unable to find any keywords which worked.

There's a good writeup here, which tells me almost everything I need to know, although it does rather gloss over the calculation of the phase functions. But at least now I know what to read up on.

David Given