I’ve recently started learning shaders, and I’m trying to write a shader that displaces a mesh’s vertices.
The shader modifies the vertices in the first pass, and that pass works as intended. The problem is that the subsequent passes, i.e. the Unity Standard shader passes, do not reuse the modified vertex positions!
As you can see in the picture, my first pass updates the vertices correctly, but the following passes ignore the updated values.
My question is: how can I make the following passes reuse the updated vertices? Do I have to modify every single pass of the Unity Standard shader to apply the same displacement, or is there some kind of magic command I haven’t learned yet?
After a lot of pain, I finally managed to create GPU-simulated water with proper normals.
The solution in a nutshell: don’t try to carry the computed vertices from one pass to another in order to use them for normal computation. Instead, compute fake “neighbor” vertices (assuming they can be recomputed from the available data, which is possible in my case) and use them to recompute the normal. This means roughly five times the work per vertex if you take four neighbor vertices and average a normal from two cross products.
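A minimal HLSL sketch of that idea, assuming a purely procedural displacement (here a hypothetical `waveHeight` sine function, with made-up `_Amplitude` and `eps` parameters) so that any vertex, including the fake neighbors, can be re-displaced from scratch:

```hlsl
// Hypothetical displacement: any point can evaluate it independently.
float waveHeight(float2 xz)
{
    return sin(xz.x + _Time.y) * cos(xz.y + _Time.y) * _Amplitude;
}

float3 displace(float3 v)
{
    v.y = waveHeight(v.xz);
    return v;
}

// Re-displace four fake neighbors around the vertex (hence ~5x the work)
// and average the two cross products to get the normal.
float3 computeNormal(float3 v, float eps)
{
    float3 center = displace(v);
    float3 right  = displace(v + float3( eps, 0, 0));
    float3 left   = displace(v + float3(-eps, 0, 0));
    float3 fwd    = displace(v + float3(0, 0,  eps));
    float3 back   = displace(v + float3(0, 0, -eps));

    float3 n1 = cross(fwd  - center, right - center);
    float3 n2 = cross(back - center, left  - center);
    return normalize(n1 + n2);
}
```

Both cross products point “up” on a flat patch, so averaging them smooths out asymmetries in the displacement around the vertex.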
There is more to do beyond that in order to actually use the computed normals. In a nutshell again: write a surface shader that also has a vertex function. The vert function receives an appdata_full structure as “inout” and assigns both the displaced vertex and the recomputed normal into it.
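A sketch of how that surface shader can be wired up, assuming hypothetical `displace` and `computeNormal` helpers that re-evaluate the displacement as described above (the `0.05` neighbor offset is an arbitrary example value):

```hlsl
// "vertex:vert" makes Unity call vert() before lighting each pass,
// so every pass sees the displaced geometry.
#pragma surface surf Standard vertex:vert addshadow

struct Input
{
    float2 uv_MainTex;
};

void vert(inout appdata_full v)
{
    // Assumed helpers: displace() applies the procedural wave,
    // computeNormal() rebuilds the normal from fake neighbors.
    float3 displaced = displace(v.vertex.xyz);
    v.vertex.xyz = displaced;
    v.normal = computeNormal(v.vertex.xyz, 0.05);
}

void surf(Input IN, inout SurfaceOutputStandard o)
{
    o.Albedo = float3(0.1, 0.3, 0.6); // placeholder water tint
    o.Smoothness = 0.9;
}
```

Because the surface shader generates all the Standard passes (including shadows, via `addshadow`) from this single vert/surf pair, there is no need to patch every pass by hand as I originally feared.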
See the following YouTube videos:
The performance gain of doing this on the GPU instead of the CPU through the Unity API is enormous. I can animate about 1 million triangles at 65 FPS on an AMD R9 280 with 3 GB of VRAM. I would probably get something like 0.001 FPS doing the same thing on the CPU.