I tried adding the x and z values of the normal vector in the opposite direction, but it feels like it should be some kind of cross product with the camera? Here is the attempt (it doesn't change much from the above result):
Make sure the normals on your mesh have been smoothed before importing it. If your model has hard edges in your modeling program, every hard edge is split for rendering on the GPU, resulting in unique vertices for each triangle. That allows the triangles to separate unless they are all moved in a uniform direction.
Mesh>Faces>Smooth.
But yes, you will then lose the low-poly flat-shaded triangle look.
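The "smoothing" here is just averaging: every vertex that shares a position gets the mean of all the normals at that position, so displaced verts move together. A minimal offline sketch of that welding step, assuming plain lists of positions and per-vertex normals (the function name is illustrative, not any importer's API):

```python
# Average ("smooth") normals across all vertices that share a position,
# so split hard-edge vertices all displace in the same direction.
from collections import defaultdict

def smooth_normals(positions, normals):
    """Return normals averaged over every vertex sharing a position."""
    groups = defaultdict(list)
    for i, p in enumerate(positions):
        groups[tuple(p)].append(i)
    out = [list(n) for n in normals]
    for indices in groups.values():
        # Sum the normals of every vertex at this position, then renormalize.
        sx = sum(normals[i][0] for i in indices)
        sy = sum(normals[i][1] for i in indices)
        sz = sum(normals[i][2] for i in indices)
        length = (sx * sx + sy * sy + sz * sz) ** 0.5 or 1.0
        avg = [sx / length, sy / length, sz / length]
        for i in indices:
            out[i] = list(avg)
    return out
```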
You could instead compute flat shading in the shader by using screen-space derivatives; bgolus has posted examples of this as well.
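What the derivative trick recovers is just the face normal: `cross(ddx(worldPos), ddy(worldPos))` in a fragment shader is the cross product of two in-triangle direction vectors. The same math, sketched on the CPU with explicit triangle edges (function name is illustrative):

```python
# Flat (face) normal of a triangle: the normalized cross product of two edge
# vectors. This is the quantity the screen-space derivative trick computes
# per pixel from the interpolated world position.
import math

def flat_normal(a, b, c):
    """Face normal of triangle (a, b, c) via the cross product of two edges."""
    e1 = tuple(b[i] - a[i] for i in range(3))
    e2 = tuple(c[i] - a[i] for i in range(3))
    n = (e1[1] * e2[2] - e1[2] * e2[1],
         e1[2] * e2[0] - e1[0] * e2[2],
         e1[0] * e2[1] - e1[1] * e2[0])
    length = math.sqrt(sum(x * x for x in n)) or 1.0
    return tuple(x / length for x in n)
```

Because this is evaluated per fragment, the mesh can keep fully smoothed (welded) vertex normals for displacement while still shading each triangle flat.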
Or, if you are not already using your mesh's vertex colors, you could store the smooth-shading normals in the vertex colors: Blender Addon: Normals to Vertex Colors – Philipp Seifried
And then in your shader graph, instead of the "Normal Vector" node, you would use the "Vertex Color" node.
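One caveat with this route: vertex colors are typically stored in [0, 1], so the usual convention (I assume the addon does something like this; check its docs) is to remap each normal component from [-1, 1] into [0, 1] at bake time and undo it in the shader with a Multiply by 2 and Subtract 1 after the Vertex Color node. A sketch of the two remaps:

```python
# Pack a unit normal into a [0, 1] vertex color and back. The shader-side
# decode is color * 2 - 1, i.e. a Multiply and a Subtract node in Shader
# Graph. Function names here are illustrative.

def normal_to_color(normal):
    """Remap normal components from [-1, 1] into the [0, 1] color range."""
    return tuple(c * 0.5 + 0.5 for c in normal)

def color_to_normal(color):
    """Inverse remap, as done in the shader: color * 2 - 1."""
    return tuple(c * 2.0 - 1.0 for c in color)
```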
Probably instead of baking normals, a simpler way is to displace all vertices in the same direction. If the mesh always sits on the floor, just displace along world up (0, 1, 0). For general use, displace toward the camera (the negated view direction). I believe you can get it from the view matrix in HLSL; I don't know where it is in Shader Graph.
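With a gluLookAt-style right-handed view matrix, the first three entries of the third row are the negated forward direction, i.e. a vector pointing back toward the camera, which is exactly the displacement direction suggested above. A CPU-side sketch under that convention (engine conventions differ, so verify against your matrix; `look_at_rows` and `displace_toward_camera` are made-up names, not an engine API):

```python
# Displace every vertex the same distance toward the camera, taking the
# direction from the rotation part of an OpenGL-style look-at view matrix.
import math

def _normalize(v):
    length = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / length for c in v)

def look_at_rows(eye, target, up=(0.0, 1.0, 0.0)):
    """Rotation rows (right, up, -forward) of a gluLookAt-style view matrix."""
    f = _normalize(tuple(t - e for t, e in zip(target, eye)))  # forward
    s = _normalize((f[1] * up[2] - f[2] * up[1],               # right = f x up
                    f[2] * up[0] - f[0] * up[2],
                    f[0] * up[1] - f[1] * up[0]))
    u = (s[1] * f[2] - s[2] * f[1],                            # up = s x f
         s[2] * f[0] - s[0] * f[2],
         s[0] * f[1] - s[1] * f[0])
    neg_forward = tuple(-c for c in f)  # third row: points toward the camera
    return s, u, neg_forward

def displace_toward_camera(vertices, eye, target, amount):
    """Push every vertex the same distance along the negated view direction."""
    _, _, toward_cam = look_at_rows(eye, target)
    return [tuple(v[i] + amount * toward_cam[i] for i in range(3))
            for v in vertices]
```

Because every vertex moves by the same vector, split hard-edge vertices stay together and no normal smoothing or baking is needed.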