I need to render an enormous amount of terrain using WebGL, so I'm going for a texture synthesis approach. I've been reading tons of articles about texture synthesis, but I haven't found anything really GPU-ready. Some approaches use the GPU to render into an off-screen buffer to produce the synthesized texture, but I need it rendered in real time, directly in the pixel shader.
The question is: is there any way to write a fragment shader that receives the following inputs:
- Sample texture
- UV coords
and from those renders an arbitrarily sized synthesized texture (see the rough sketch after this question)? Or do I actually need to synthesize the texture on the CPU and upload it to the GPU as a ready-made texture?
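
Something like this minimal sketch is roughly what I'm picturing (GLSL ES 1.00 for WebGL 1; the uniform/varying names are just placeholders for my own setup). The "synthesis" here is only stochastic tiling: each tile of the terrain UV space samples the exemplar at a pseudo-randomly jittered position so the pattern doesn't repeat exactly. Is something along these lines feasible for proper texture synthesis, or is it hopeless without a CPU/off-screen pass?

```glsl
precision mediump float;

uniform sampler2D u_exemplar; // small sample texture (placeholder name)
varying vec2 v_uv;            // terrain UVs, may range far outside [0, 1]

// cheap hash giving a per-tile pseudo-random offset
vec2 hash2(vec2 p) {
    return fract(sin(vec2(dot(p, vec2(127.1, 311.7)),
                          dot(p, vec2(269.5, 183.3)))) * 43758.5453);
}

void main() {
    vec2 tile   = floor(v_uv);     // which tile we are in
    vec2 local  = fract(v_uv);     // position within the tile
    vec2 offset = hash2(tile);     // random offset per tile
    // sample the exemplar at a jittered position; fract() keeps it in range
    vec3 color = texture2D(u_exemplar, fract(local + offset)).rgb;
    gl_FragColor = vec4(color, 1.0);
}
```

I realize a naive version like this would show visible seams at tile borders and none of the structure-preserving behavior of real synthesis algorithms; it's just meant to illustrate the kind of "exemplar + UV in, synthesized color out" shader I'm asking about.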