
I am building a particle system in WebGL with Three.js, and I want to do all of the particle computation in shaders. To achieve that, per-particle state such as position is stored in a texture, which is sampled by the vertex shader of each particle (POINT primitive).
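For context, a simplified sketch of this kind of point vertex shader (GLSL for WebGL 1, inside a Three.js ShaderMaterial, which injects projectionMatrix and modelViewMatrix; this is not my exact code, and names like `reference` are placeholders):

    // Each point carries the UV of its own texel in the position
    // texture ("reference" is a placeholder attribute name).
    attribute vec2 reference;

    uniform sampler2D positionTexture; // one texel per particle
    uniform float pointSize;

    void main() {
        // Read this particle's position from the texture instead of
        // from a per-vertex position attribute.
        vec3 pos = texture2D(positionTexture, reference).xyz;

        gl_PointSize = pointSize;
        gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
    }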

The position texture is in fact two render targets that are swapped each frame after being updated off screen. Each pixel of this texture represents one particle. To update a position, I read one of the render targets (texture2D), do some computation, and write to the other render target (fragment output).
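A simplified sketch of the ping-pong setup (illustrative names such as updateScene, updateCamera, and updateMaterial for the off-screen full-screen-quad pass; exact API details vary across Three.js versions):

    var opts = {
        minFilter: THREE.NearestFilter, // exact texel reads, no filtering
        magFilter: THREE.NearestFilter,
        format: THREE.RGBAFormat,
        type: THREE.FloatType // needs the OES_texture_float extension
    };
    var read  = new THREE.WebGLRenderTarget(size, size, opts);
    var write = new THREE.WebGLRenderTarget(size, size, opts);

    function step() {
        // Sample the "read" target in the update pass...
        updateMaterial.uniforms.positionTexture.value = read.texture;

        // ...and render the result into the "write" target
        // (older Three.js render-to-target signature; newer versions
        // use renderer.setRenderTarget(write) before render()).
        renderer.render(updateScene, updateCamera, write);

        // Swap, so the next frame reads what was just written.
        var tmp = read; read = write; write = tmp;

        // The on-screen particles sample the freshly updated positions.
        particleMaterial.uniforms.positionTexture.value = read.texture;
    }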

To perform the "do some computation" step, I need some per-particle attributes, such as velocity (and many others). Since this step runs in the fragment shader, I can't use vertex attribute buffers, so I have to store these properties in separate textures and sample each of them in the fragment shader.
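As an illustration, the update fragment shader ends up looking something like this (a sketch with a single extra velocity texture and plain Euler integration standing in for the "computation"; uniform names are placeholders):

    uniform sampler2D positionTexture; // the "read" half of the ping-pong pair
    uniform sampler2D velocityTexture; // per-particle velocity, same layout
    uniform vec2 resolution;           // size of the targets, in texels
    uniform float deltaTime;

    void main() {
        // The fragment's coordinate addresses the same particle
        // in every per-particle texture.
        vec2 uv = gl_FragCoord.xy / resolution;

        vec3 position = texture2D(positionTexture, uv).xyz;
        vec3 velocity = texture2D(velocityTexture, uv).xyz;

        // "Do some computation": simple Euler integration as an example.
        position += velocity * deltaTime;

        gl_FragColor = vec4(position, 1.0);
    }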

It works, but texture sampling is slow as far as I know, and I wonder if there is a better way to do this, like having one vertex per particle, each rendering a single fragment of the position texture. I know that OpenGL 4 has alternative ways to deal with this, like UBOs or SSBOs, but I'm not sure what is available in WebGL.
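For what it's worth, one way to read that "one vertex per particle" idea in WebGL 1 is a scatter pass: draw the particles as 1-pixel POINTs into the position target, place each point exactly over its own texel, and feed the constant per-particle data in as ordinary vertex attributes instead of textures. A rough, untested sketch of such an update vertex shader (all names are placeholders):

    // Constant per-particle data comes in as vertex attributes,
    // so only the (changing) position still needs a texture read.
    attribute vec2 texelCoord; // integer texel index of this particle
    attribute vec3 velocity;

    uniform sampler2D positionTexture;
    uniform vec2 resolution;
    uniform float deltaTime;

    varying vec3 newPosition; // the fragment shader just outputs this

    void main() {
        vec2 uv = (texelCoord + 0.5) / resolution;
        vec3 position = texture2D(positionTexture, uv).xyz;
        newPosition = position + velocity * deltaTime;

        // Place the point exactly over its own texel in clip space.
        gl_Position = vec4(uv * 2.0 - 1.0, 0.0, 1.0);
        gl_PointSize = 1.0;
    }

This still samples the position texture (in the vertex shader, which requires vertex texture fetch support), but the static per-particle attributes no longer cost a texture fetch each.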

  • I think using textures is the way to go. Using 2 RGBA textures should give you 8 parameters to work with. – beiller May 15 '15 at 15:37
  • By the time a fragment has been generated, the position has already been processed (clipped, projected, and mapped to device coordinates). Thus if the position is to be altered, it has to be done in the vertex shader. – wcochran May 15 '15 at 18:52
  • I'm not sure what you mean, wcochran. I already have the texture-based system working fine, and I haven't noticed the problem you describe yet. Can you explain how you would load, change, and save the current position of the particle in the vertex shader? – deck May 15 '15 at 19:08
  • You can't; you need a texture. I am a bit confused, though. The way I understand it, you render the positions to the texture, read it in the particle vertex shader, and then compute velocity? – pailhead May 18 '15 at 20:43
  • No, to render the position I need to read the velocity property, which is stored in a texture (this is the thing I am trying to improve with this question). – deck May 19 '15 at 00:29
