
I need to render an enormous amount of terrain using WebGL, so I'm going for a texture synthesis approach. I've been reading plenty of articles about texture synthesis, but I haven't found anything really GPU-ready. Some approaches use the GPU to render into an off-screen buffer to produce the synthesized texture, but I need it to be rendered in real time in the pixel shader.

The question is: Is there any way to write a fragment shader that receives the following:

  • Sample texture
  • UV coords

And, from those inputs, renders an arbitrarily sized synthesized texture? Or do I actually need to synthesize it on the CPU and pass it to the GPU as a ready-made texture?

  • What technology are you using? OpenGL (GLSL), DirectX (HLSL), Unity, or something else? – Peter O. May 06 '17 at 05:16
  • The general technique, though, involves generating two-dimensional noise (like Perlin noise), based on a UV texture coordinate, and using the "fractal Brownian motion" technique. – Peter O. May 06 '17 at 05:20
  • @Peter O. I'm using WebGL, which is basically OpenGL ES 2.0. Could you perhaps link me to some examples of the technique you've described? – user1617735 May 06 '17 at 05:35
  • 3
    Pretty much every single shader on http://shadertoy.com and http://glslsandbox.com are exactly what you're asking for. They generate an image using a shader who's only input is UV coord that goes from 0 to resolution and usually they divide that by resolution to get a value from 0 to 1. – gman May 06 '17 at 07:15
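A minimal sketch of the technique from the comments above — value noise summed as fractal Brownian motion, computed entirely in a WebGL 1 (GLSL ES 1.00) fragment shader from the UV coordinate alone. All function names and constants here (`hash`, `fbm`, the octave count, the `8.0` scale) are my own illustrative choices, not anything built in:

```glsl
precision mediump float;

uniform vec2 u_resolution;

// Hash a lattice point to a pseudo-random value in [0, 1).
// A widely used one-liner; its precision degrades for large inputs.
float hash(vec2 p) {
    return fract(sin(dot(p, vec2(127.1, 311.7))) * 43758.5453123);
}

// Smoothly interpolated 2D value noise.
float noise(vec2 p) {
    vec2 i = floor(p);
    vec2 f = fract(p);
    vec2 u = f * f * (3.0 - 2.0 * f); // smoothstep-style fade curve
    return mix(mix(hash(i),                  hash(i + vec2(1.0, 0.0)), u.x),
               mix(hash(i + vec2(0.0, 1.0)), hash(i + vec2(1.0, 1.0)), u.x),
               u.y);
}

// Fractal Brownian motion: sum octaves of noise, doubling the
// frequency and halving the amplitude each octave.
float fbm(vec2 p) {
    float value = 0.0;
    float amplitude = 0.5;
    for (int i = 0; i < 5; i++) { // 5 octaves, fixed for GLSL ES loops
        value += amplitude * noise(p);
        p *= 2.0;
        amplitude *= 0.5;
    }
    return value;
}

void main() {
    // 0..1 UV from the fragment coordinate, as gman describes.
    vec2 uv = gl_FragCoord.xy / u_resolution;
    float h = fbm(uv * 8.0); // 8.0 is an assumed feature scale
    gl_FragColor = vec4(vec3(h), 1.0);
}
```

Because everything is derived from the UV coordinate, the output scales to any resolution with no precomputed texture; a sample texture can still be mixed in (e.g. using `h` to blend terrain layers) on top of this.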

0 Answers