I’m working on procedural planet rendering. My heightmaps are generated as a cubemap to eliminate pole distortion (among other things), and I generate a planet-space normal map from the heightmap as well; the final texture stores the normal in RGB and the height in A.
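In case it helps, here's roughly what that texel layout looks like in HLSL (`PackPlanetTexel` is just an illustrative name, and the `* 0.5 + 0.5` remap is an assumption about how the normal is encoded, not confirmed by my generator code):

```hlsl
// Sketch of the texel layout described above: planet-space normal in RGB,
// height in A. The [-1,1] -> [0,1] remap is assumed.
float4 PackPlanetTexel(float3 planetNormal, float height)
{
    return float4(planetNormal * 0.5 + 0.5, // normal remapped to [0,1] in RGB
                  height);                  // raw height in A
}
```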
In Shader Graph, I then use the surface normal to sample this cubemap, retrieve the normals, ensure they are normalized, and convert them to tangent space so I can plug the result into a PBR Master node.
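To make the node chain concrete, here's a rough Custom Function-style HLSL equivalent of what the graph does (the names, the unpack, and the TBN handling are all illustrative sketches of the nodes, not my literal graph; `tangentToWorld` stands in for the TBN basis the Transform node uses, with rows = tangent, bitangent, normal):

```hlsl
// Hedged sketch of the node chain: sample the cubemap along the surface
// normal, unpack/normalize, then convert the planet-space normal to
// tangent space (what the Transform node does, assuming an orthonormal TBN).
void PlanetTangentNormal_float(float3 normalWS, float3x3 tangentToWorld,
                               TextureCube planetMap, SamplerState ss,
                               out float3 normalTS)
{
    // Sample the cubemap using the (unit) surface normal as the direction.
    float3 n = planetMap.Sample(ss, normalize(normalWS)).rgb;

    // Unpack from [0,1] to [-1,1] and renormalize (assumed encoding).
    n = normalize(n * 2.0 - 1.0);

    // Project onto the TBN rows to go from world/planet space to tangent space.
    normalTS = mul(tangentToWorld, n);
}
```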
This works perfectly when my sphere is at unit scale. However, if I scale the object, the normals get distorted. This is on the latest Shader Graph at the time of writing (7.1.5), the latest Universal Render Pipeline (7.1.5), and Unity 2019.3.0b12.
Here’s what the unit sphere looks like:
And then if I make the sphere smaller (normals increase in intensity):
And if I make the sphere larger (normals decrease in intensity):
For reference, here’s the result of plugging the sampled normals directly into the Color input of an Unlit Master node. This doesn’t visually change regardless of the sphere’s scale, which is what I’d expect: