
Over the last few days I have been reading a lot of articles about post-processing (bloom etc.), and I managed to implement render-to-texture functionality, with the resulting texture running through a separate shader. Now I have some questions about the whole thing.

  1. Do I have to render both the scene and the texture put on a full-screen quad?
  2. How does bloom, or any other post-processing effect (DOF, blur), work with this render-to-texture functionality? Or is it something completely different?
  3. I don't really understand the concept of the back and front buffer, and how to make use of them for post-processing.
  4. I have read about volumetric light rendering where the scene is rendered about six times with different color settings. Isn't that quite inefficient, or did I just misunderstand?

Thanks to anyone who cares to explain these things to me ;)

puelo

2 Answers

Let me try to answer some of your questions:

  1. Yes, you have to render both: first render the scene to an offscreen texture, then render that texture on a full-screen quad to apply the post-process.
  2. DOF is typically implemented by rendering a "blurriness" factor into an offscreen buffer; a post-processing filter then uses this factor to blur certain pixels more than others (with some compensation for color leaking between sharp and blurred objects). So yes, the basic idea is the same: render to a buffer, process it, and then display it (with or without blending it on top of the original scene).
  3. The back buffer is what you render stuff to (what the user will see on the next frame). All offscreen rendering is done to other rendertargets that you will create and use.
  4. I don't quite understand what you mean. Please provide a link to what you read so I can try to understand and perhaps explain it.
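To make point 2 concrete, here is a minimal CPU-side sketch in Python of the depth-of-field idea (purely illustrative: in a real engine this logic lives in a pixel shader, the blur factor usually comes from the depth buffer, and the function names and 1-D "row of pixels" here are my own invention):

```python
# Each pixel carries a "blurriness" factor (0 = sharp, 1 = fully blurred);
# the post-process blends the sharp color with a blurred copy of the scene.

def box_blur(row, radius):
    """Naive box blur over a 1-D row of grayscale pixels."""
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def depth_of_field(row, blur_factors, radius=2):
    """Blend each sharp pixel with the blurred scene by its blur factor."""
    blurred = box_blur(row, radius)
    return [s * (1 - f) + b * f for s, b, f in zip(row, blurred, blur_factors)]

sharp = [0.0, 0.0, 1.0, 0.0, 0.0]
factors = [0.0, 0.0, 0.0, 1.0, 1.0]  # right side is "out of focus"
result = depth_of_field(sharp, factors)
```

Pixels with factor 0 keep their original color; pixels with factor 1 take the fully blurred color, which is exactly the "blur certain pixels more than others" behavior described above.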
Ani
  • I think I explained that wrong. They used 6 different render targets (for example, one for a completely unlit scene). Can you just use different render targets without actually rendering to a texture? Just process it immediately? EDIT: And I have tried to just render the texture alone and nothing changed... but I guess this is mainly because I'm not doing very much with it, isn't it? – puelo Jul 02 '12 at 20:53
  • If you mean "can you use different render targets without rendering to the back buffer", then yes. The idea is that DX always renders to the currently attached render target(s). If you change it and it's not the back buffer, so be it. It's up to you to render the final image to the back buffer before calling **Present**. In fact, if you have an application that only wants to crunch numbers, there is no need to even render an image to the back buffer or present anything. – Ani Jul 02 '12 at 20:56
  • Can I use the different render targets in a shader without actually rendering them to a texture, or do I have to do this every time I want to work with a shader pass? So, for example, I render the scene one or multiple times (maybe with different light settings) to different render targets, edit them in a shader or something, and then I just render the scene and all the edited parts to the back buffer? – puelo Jul 02 '12 at 21:05
  • You always have to have some render target attached, I believe. Otherwise what are you rendering TO? A shader is a program that works on one input and transforms it to an output, so you need both to make that work. – Ani Jul 02 '12 at 21:21
  • I think you misunderstood me. I was asking if you can work with just the render target in the shader without actually rendering this to a texture. Maybe I am just confusing some parts here. I appreciate your help. – puelo Jul 02 '12 at 21:29
  • No. "Running a shader" == "performing rendering" and that needs a rendertarget attached and some geometry to render. – Ani Jul 02 '12 at 21:41
  • "I was asking if you can work with just the rendertarget in the shader without actually rendering this to a texture. " I think the point of confusion is that the rendertarget IS a texture. If we are speaking with DirectX terms, a Texture is a block of special GPU memory, a RenderTarget is a "write" (as in "render to it from a shader") access to that Texture, and a ShaderResourceView is a "read" (as in "use it to sample from in a shader") access to a texture, and the backbuffer is the Texture that will be displayed on screen when you call Present. – Gerasimos R Apr 30 '13 at 20:01

Suppose that:

  1. you have the "luminance" for each rendered pixel in a single texture

  2. this texture holds floating-point values that can be greater than 1.0

Now:

You do a blur pass (possibly a separable one), considering only pixels with a value greater than 1.0, and write the blur result to another texture.

Finally:

In a final shader you do the presentation to screen. You sample from both the "luminance" (clamped to 1.0) and the "blurred excess luminance" and add them, obtaining the so-called bloom effect.
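The steps above can be sketched on the CPU like this (Python used purely for illustration; in practice each step is a shader pass over offscreen textures, and the function names here are hypothetical):

```python
# Bloom pipeline sketch: 1. extract the luminance that exceeds 1.0,
# 2. blur it, 3. add it back on top of the clamped base image.

def extract_excess(hdr):
    """Keep only the part of each HDR value above 1.0."""
    return [max(v - 1.0, 0.0) for v in hdr]

def box_blur(row, radius=1):
    """Naive box blur standing in for the (often separable) blur pass."""
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def bloom(hdr):
    """Clamp the base image to 1.0 and add the blurred excess luminance."""
    blurred_excess = box_blur(extract_excess(hdr))
    return [min(v, 1.0) + b for v, b in zip(hdr, blurred_excess)]

hdr_row = [0.2, 0.5, 3.0, 0.5, 0.2]  # one very bright pixel
result = bloom(hdr_row)
```

Note how the excess luminance of the bright pixel bleeds into its neighbors after the blur, which is the characteristic "glow" of bloom, while pixels far from any bright source are unchanged.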

Gigi
  • So I am rendering the luminance of the scene into a texture. I do a blur pass over this texture, where I consider only pixels with a value greater than 1.0, and save the result of the blur pass in another texture. At the end I add those 2 textures together and render them over the actual scene? – puelo Jul 02 '12 at 20:57
  • You can "add them together" using a custom shader (remember to clamp luminance values to 1.0 when reading them in the final shader) – Gigi Jul 02 '12 at 21:16