I'm trying to animate large meshes representing tiles of water using the GPU.
So far I have managed to displace the vertices of a mesh "on the fly". By that I mean the mesh data itself is never modified (it stays flat); the vertex positions are only offset right before rendering, in a vertex shader.
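For reference, here is roughly what my current shader looks like, simplified down to a single sine wave (the shader name and property names are just placeholders, not my real setup):

```
Shader "Custom/WaveDisplace"
{
    Properties
    {
        _Amplitude ("Amplitude", Float) = 0.5
        _Frequency ("Frequency", Float) = 2.0
        _Speed     ("Speed", Float) = 1.0
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            float _Amplitude;
            float _Frequency;
            float _Speed;

            struct v2f
            {
                float4 pos : SV_POSITION;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                // Offset the vertex vertically with a moving sine wave.
                // The mesh asset itself is never touched; only the
                // position used for this frame's rendering changes.
                float wave = sin((v.vertex.x + _Time.y * _Speed) * _Frequency);
                v.vertex.y += wave * _Amplitude;
                o.pos = UnityObjectToClipPos(v.vertex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return fixed4(0.2, 0.4, 0.8, 1.0); // flat water color
            }
            ENDCG
        }
    }
}
```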
However, I also need to recompute the normals for the lighting to look realistic. Doing so requires access to the positions of vertices other than the current one; the 4 adjacent vertices would be good enough. As far as I can tell this can't be done in a single pass, since a vertex shader only has access to the vertex it is processing.
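For what it's worth, once I somehow have the displaced heights of the 4 neighbours, I believe the normal itself is just a central difference over the heightfield. Something like this, where `hL`/`hR`/`hB`/`hF` are the neighbour heights (however I end up obtaining them, which is exactly the part I'm stuck on) and `gridStep` is the spacing between adjacent vertices:

```
// Central-difference normal for a heightfield, given the displaced
// heights of the 4 adjacent vertices: left/right along x, back/front
// along z, each gridStep away from the current vertex.
float3 heightfieldNormal(float hL, float hR, float hB, float hF, float gridStep)
{
    return normalize(float3(hL - hR, 2.0 * gridStep, hB - hF));
}
```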
So my idea is to displace the vertices in a first pass to simulate the waves, store the updated positions somewhere, and read them back in a second pass to compute the normals as well.
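What I picture for the first pass is something like the shader below, which would render the displaced height of each grid point into a floating-point render texture that the second pass could then sample. This is a rough, untested sketch; `wave()` is a stand-in for my actual displacement function:

```
Shader "Custom/WaveHeightPass"
{
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            // Stand-in for my actual wave function.
            float wave(float2 uv, float t)
            {
                return sin((uv.x + t) * 6.2831853) * 0.5;
            }

            float4 frag (v2f_img i) : SV_Target
            {
                // Write the displaced height of this grid point into
                // the red channel, so the second pass can sample the
                // heights of neighbouring points.
                float h = wave(i.uv, _Time.y);
                return float4(h, 0, 0, 1);
            }
            ENDCG
        }
    }
}
```

On the C# side I assume I would run this with `Graphics.Blit` into a `RenderTexture`, but that's precisely the part I'm unsure about.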
Am I thinking about this correctly? I just started learning shaders yesterday, so I'm a noob.
I read something about rendering to a texture, which could let me share data between the first and second passes. Is that the right approach? If so, how do I set it up? I couldn't find any code samples for this.
I'm doing this in Unity 5.6.3, if it matters. Thanks!