I’m about to start pulling my hair out because of this one. I’m trying to create an animated waterfall for our low-poly project. I can achieve exactly the effect I want in Blender by using a Displace modifier, pointing it at a cell noise texture and translating the UV map. I’ve attached the .blend file so you can see what I mean. I especially like how it has this stop-motion feel to it.
Alas, I can’t export this to Unity. So here’s what I tried:
1. store the smooth normals of each vertex in the vertex colors before exporting the mesh (needed in step 5)
2. switch the shading back to flat, thereby splitting the vertex normals (required for that low-poly look)
3. export the mesh to .fbx and import it into Unity
4. add a greyscale noise texture to the material
5. in a vertex shader, translate each vertex along its smooth normal by an amount sampled from the texture (see the sketch below)
6. animate the texture offset
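The displacement in steps 5 and 6 looks more or less like this (simplified, property names changed; it sits inside a CGPROGRAM block with UnityCG.cginc included):

    // Simplified sketch of steps 5 and 6. Assumes the smooth normal was
    // baked into the vertex colors, remapped from [-1,1] to [0,1].
    sampler2D _NoiseTex;      // the greyscale noise from step 4
    float _Amplitude;         // displacement strength
    float2 _ScrollSpeed;      // texture offset animation, step 6

    struct v2f { float4 pos : SV_POSITION; };

    v2f vert (appdata_full v)
    {
        v2f o;
        float3 smoothNormal = v.color.rgb * 2.0 - 1.0;           // unpack baked normal
        float2 uv = v.texcoord.xy + _ScrollSpeed * _Time.y;      // animate the offset
        float height = tex2Dlod(_NoiseTex, float4(uv, 0, 0)).r;  // tex2Dlod: vertex stage
        v.vertex.xyz += smoothNormal * height * _Amplitude;
        o.pos = UnityObjectToClipPos(v.vertex);
        return o;
    }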
This gives me almost the look I want. Unfortunately, moving the vertices in a shader doesn’t update the face normals, so the lighting is much less striking than the results I get in Blender. The waves are hardly visible except where the water intersects another mesh.
I spent an entire day trying to get correct normals onto the displaced mesh, but it seems to be quite impossible. I can’t recalculate them in the shader, as I simply don’t have access to the necessary information (a vertex shader only processes a single vertex at a time, with no information about neighbouring faces or vertices). Calling RecalculateNormals every frame from a script does nothing, because the displaced vertices from the shader are never actually written back to the mesh. And moving the vertices in a script instead of a shader is prohibitively expensive because of all the array copying involved.
So my question is, can an effect similar to what I’m trying to do be achieved by some other means?
Update: I stumbled upon this Low Poly Water solution, which also transforms vertices in the vertex shader but still calculates normals correctly, all without the use of a geometry shader. So it must be possible to do this, I just don’t know how (and am kinda hesitant to spend those 15 bucks just to find out). If this thread is more appropriate in the shader subforum, someone please move it for me…
Why have smooth normals in your mesh if they will all point upwards?
Anyway, look into normal calculation from a heightfield texture. Essentially, you derive the normal from the slope of the heightfield… for flat shading it might be a bit more complicated… I dunno, maybe precompute the centre of the triangle, store it in a UV channel and sample the heightfield texture/function slope at that point?
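Off the top of my head, the heightfield part looks something like this: sample the height a texel to either side and turn the central differences into a tangent-space normal (texel = 1/textureSize, strength to taste):

    // Central-difference normal from a heightfield, usable in a vertex shader.
    float3 NormalFromHeightfield(sampler2D heightTex, float2 uv, float texel, float strength)
    {
        float hL = tex2Dlod(heightTex, float4(uv - float2(texel, 0), 0, 0)).r;
        float hR = tex2Dlod(heightTex, float4(uv + float2(texel, 0), 0, 0)).r;
        float hD = tex2Dlod(heightTex, float4(uv - float2(0, texel), 0, 0)).r;
        float hU = tex2Dlod(heightTex, float4(uv + float2(0, texel), 0, 0)).r;
        // slope in u and v becomes the xy of a tangent-space normal
        return normalize(float3((hL - hR) * strength, (hD - hU) * strength, 1.0));
    }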
Why would they all point upwards? My waterfall is partly vertical. For a flat water surface I agree, there would be no reason to store the normals in the vertex colours.
I don’t think storing the centre of the triangle is enough to calculate the new normal, since the other two vertices might also have been transformed. The only way to know by how much and in which direction is to keep their uv-coordinates on hand… and their respective smooth normals (assuming a non-flat mesh)… and their rest positions. Counting it out, that’s 16 extra floats per vertex, so I think I’m running out of space to pass this stuff to the shader right about now.
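For the record, it might just barely fit by abusing every spare channel: three Vector4 UV sets plus the (otherwise unused) tangent give exactly 16 floats. An untested bake sketch along those lines (FlatNormalBaker is just a name I made up; the mesh is assumed to be flat-shaded, i.e. vertices already split per triangle, with the smooth normals baked into the colors):

    // Untested sketch: stash each vertex's two triangle neighbours in spare
    // channels so the vertex shader can re-displace them later.
    // Packing: uv2 = (edge to neighbour A, A's uv.x), uv3 = (edge to B, B's uv.x),
    //          uv4 = (A's smooth normal, A's uv.y), tangent = (B's smooth normal, B's uv.y)
    using System.Collections.Generic;
    using UnityEngine;

    public static class FlatNormalBaker
    {
        static Vector3 UnpackNormal(Color c)
        {
            return new Vector3(c.r, c.g, c.b) * 2f - Vector3.one;
        }

        public static void Bake(Mesh mesh)
        {
            Vector3[] verts = mesh.vertices;
            Vector2[] uv = mesh.uv;
            Color[] cols = mesh.colors;   // smooth normals baked in Blender
            int[] tris = mesh.triangles;

            var uv2 = new List<Vector4>(new Vector4[verts.Length]);
            var uv3 = new List<Vector4>(new Vector4[verts.Length]);
            var uv4 = new List<Vector4>(new Vector4[verts.Length]);
            var tangents = new Vector4[verts.Length];

            // Flat shading splits the vertices, so each vertex belongs to
            // exactly one triangle and this loop writes it exactly once.
            for (int t = 0; t < tris.Length; t += 3)
            {
                for (int c = 0; c < 3; c++)
                {
                    int self = tris[t + c];
                    int a = tris[t + (c + 1) % 3];
                    int b = tris[t + (c + 2) % 3];
                    Vector3 ea = verts[a] - verts[self];
                    Vector3 eb = verts[b] - verts[self];
                    Vector3 na = UnpackNormal(cols[a]);
                    Vector3 nb = UnpackNormal(cols[b]);
                    uv2[self] = new Vector4(ea.x, ea.y, ea.z, uv[a].x);
                    uv3[self] = new Vector4(eb.x, eb.y, eb.z, uv[b].x);
                    uv4[self] = new Vector4(na.x, na.y, na.z, uv[a].y);
                    tangents[self] = new Vector4(nb.x, nb.y, nb.z, uv[b].y);
                }
            }

            mesh.SetUVs(1, uv2);   // channel index 1 = uv2, etc.
            mesh.SetUVs(2, uv3);
            mesh.SetUVs(3, uv4);
            mesh.tangents = tangents;
        }
    }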
I was wondering if there might be a better method to animate the mesh than displacement at the shader level, something that achieves roughly the same look. Anyway, I did give translating the vertices in a MonoBehaviour script a try, and it’s not actually as expensive as I’d thought (the gist of it is below). I haven’t checked the performance on mobile yet, but I’m hoping it’ll work (fingers crossed).
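Simplified, the script boils down to this (the real version caches a bit more; the noise texture has to be import-flagged as readable):

    using UnityEngine;

    [RequireComponent(typeof(MeshFilter))]
    public class ScriptedRipples : MonoBehaviour
    {
        public Texture2D noise;                       // greyscale noise, marked readable
        public float amplitude = 0.1f;
        public Vector2 scrollSpeed = new Vector2(0.5f, 0f);

        Mesh mesh;
        Vector3[] restVerts;      // original positions
        Vector3[] smoothNormals;  // unpacked from the baked vertex colors
        Vector2[] uv;
        Vector3[] displaced;      // reused every frame to avoid reallocating

        void Start()
        {
            mesh = GetComponent<MeshFilter>().mesh;   // .mesh instantiates a copy
            restVerts = mesh.vertices;
            uv = mesh.uv;
            Color[] cols = mesh.colors;
            smoothNormals = new Vector3[restVerts.Length];
            for (int i = 0; i < cols.Length; i++)
                smoothNormals[i] = new Vector3(cols[i].r, cols[i].g, cols[i].b) * 2f - Vector3.one;
            displaced = new Vector3[restVerts.Length];
        }

        void Update()
        {
            Vector2 offset = scrollSpeed * Time.time;
            for (int i = 0; i < displaced.Length; i++)
            {
                float h = noise.GetPixelBilinear(uv[i].x + offset.x, uv[i].y + offset.y).grayscale;
                displaced[i] = restVerts[i] + smoothNormals[i] * h * amplitude;
            }
            mesh.vertices = displaced;
            // the vertices are split (flat shading), so this yields proper
            // per-face normals and the lighting picks up the waves again
            mesh.RecalculateNormals();
        }
    }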
So at this point figuring out the “proper” way to do this is a matter of professional curiosity only. I’d love to know, but can’t really justify wasting too many resources on it at the moment.
It’s more that waterfalls and other water that isn’t flat to begin with (before ripples) aren’t really helped by a translated displacement texture representing ripples.
A triangle only has one centre, common to all 3 vertices. The normal would not be the same as one computed from the vertex coordinates (unless your texture is too detailed and not filtered by, say, using a lower mip level), but that doesn’t necessarily make it less correct as an approximation of the surface orientation used for lighting. And even if it did, would it matter if it was close enough to look right?
There are dozens of techniques; your approach to flat shading isn’t necessarily the only one you could be using, either.
You could look at the original Cardboard SDK for Unity - I believe that has a flat shaded water effect.
15 bucks seems like a tiny investment for something that has taken almost an entire day’s work. That’s like valuing your time at one dollar per hour.
If this is your going rate for contract work, you are going to be swamped with project offers!
It is for the effect I’m going for. Here’s a gif of my results (using a script to displace the vertices), I quite like it:
All 3 vertices of the triangle may be translated by the ripple effect. I have no way of knowing where the other 2 ended up without their original location and their uv-coordinates. The old center of the triangle does not help me compute the (correct) new normal.
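Purely for the record, the shader side of the “proper” solution would presumably have to displace all three corners and cross the edges, given the neighbour data from a bake like the one I sketched above. Untested:

    // Untested sketch, matching the bake packing above:
    // uv2/uv3 = edge to neighbour + their uv.x, uv4/tangent = their smooth normal + uv.y
    sampler2D _NoiseTex;
    float _Amplitude;
    float2 _ScrollSpeed;

    struct v2f
    {
        float4 pos : SV_POSITION;
        float3 worldNormal : TEXCOORD0;
    };

    float3 Displace(float3 pos, float3 smoothNormal, float2 uv)
    {
        float2 scrolled = uv + _ScrollSpeed * _Time.y;
        float h = tex2Dlod(_NoiseTex, float4(scrolled, 0, 0)).r;
        return pos + smoothNormal * h * _Amplitude;
    }

    v2f vert (appdata_full v)
    {
        v2f o;
        float3 p0 = Displace(v.vertex.xyz, v.color.rgb * 2.0 - 1.0, v.texcoord.xy);
        float3 p1 = Displace(v.vertex.xyz + v.texcoord1.xyz, v.texcoord3.xyz,
                             float2(v.texcoord1.w, v.texcoord3.w));
        float3 p2 = Displace(v.vertex.xyz + v.texcoord2.xyz, v.tangent.xyz,
                             float2(v.texcoord2.w, v.tangent.w));
        // flat normal of the displaced triangle (sign may need flipping
        // depending on the winding order of the baked neighbours)
        float3 n = normalize(cross(p1 - p0, p2 - p0));
        o.worldNormal = UnityObjectToWorldNormal(n);
        o.pos = UnityObjectToClipPos(float4(p0, 1.0));
        return o;
    }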
Thanks, I’ll take a look at that.
Well, now that I have a working solution, it is kinda much just to satisfy my curiosity. Also, I’m an intern, so I don’t get paid all that much and am expected to spend at least some of my time acquiring new skills. It has been a very interesting project so far, so I can’t complain.
Edit, because I couldn’t resist commenting on that math: you work 15-hour days? Wow.