Has anyone got any experience doing triplanar normal mapping with object space projection rather than world space in Shader Graph? I’ve spent quite a bit of time trying to get this to work properly, but I’m finding the maths to be an absolute pig. The built-in node doesn’t seem easily adaptable to object space, so I’m trying to do it from scratch.
I’ve tried converting some of the techniques listed here into Shader Graph but haven’t had any real success adapting them to a node setup. A swizzle technique has got me the closest, but I know I haven’t done it correctly.
As well as making things hard on myself by using object space projection, I’m also making it more complex by flipping the U axis of the projection on the backface so that texture details aren’t mirrored. This makes the normal mapping a bit more fiddly!
The Shader Graph already has a Triplanar node with a normal map option, and it’s based roughly on my article. The code for the node even has a comment referencing the article. However, in normal map mode it only works properly when using world space normals and positions, which is also what my article is explicitly written for. Shader Graph makes things harder in part because the Triplanar node assumes the normals it calculates are always in world space, which isn’t true if the inputs are not the world space position and normal.
Doing object space triplanar with working normals is a slightly more difficult problem than world space triplanar. World space triplanar normal mapping works out relatively simply because the final normals and the triplanar UVs align nicely along axial directions, which is why swizzling works at all. For surface shaders and Shader Graph, the o.Normal and master node Normal connections both assume mesh tangent space normals, so the Triplanar node for Shader Graph transforms the triplanar normal from world space to tangent space. As I mentioned above, this is wrong if you’re not using world space inputs. For arbitrary object space triplanar normal mapping, the output of the triplanar blending is still in object space, so you have to transform the triplanar normals from object space to mesh tangent space, not from world space to mesh tangent space.
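For context, the world space case the swizzle relies on looks roughly like this in HLSL — a sketch of the “whiteout blend” variant from the article, written from memory; `_BumpMap` and the input struct names are illustrative:

```hlsl
// Blend weights from the world space surface normal
half3 blend = abs(i.worldNormal);
blend /= blend.x + blend.y + blend.z;

// One tangent space normal map sample per projection axis
half3 tnormalX = UnpackNormal(tex2D(_BumpMap, i.worldPos.zy));
half3 tnormalY = UnpackNormal(tex2D(_BumpMap, i.worldPos.xz));
half3 tnormalZ = UnpackNormal(tex2D(_BumpMap, i.worldPos.xy));

// Whiteout blend: fold the in-plane components of the surface normal
// into each sample so detail is preserved at glancing angles
tnormalX = half3(tnormalX.xy + i.worldNormal.zy, abs(tnormalX.z) * i.worldNormal.x);
tnormalY = half3(tnormalY.xy + i.worldNormal.xz, abs(tnormalY.z) * i.worldNormal.y);
tnormalZ = half3(tnormalZ.xy + i.worldNormal.xy, abs(tnormalZ.z) * i.worldNormal.z);

// Swizzle each sample back to world axes and blend
half3 worldNormal = normalize(
    tnormalX.zyx * blend.x +
    tnormalY.xzy * blend.y +
    tnormalZ.xyz * blend.z);
```

Note how the swizzles only work because each projection plane happens to line up with the world axes — that’s the alignment the object space version loses once the mesh is rotated.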
Unity doesn’t offer any transform matrices for going from object space to tangent space, but you can access the individual vectors that make up the matrix yourself. It does mean you have to recreate the entirety of the Triplanar node in the graph itself, though.
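In shader code terms, the transform the graph has to recreate looks something like this — a sketch, with a made-up function name; in Shader Graph the Matrix Construction node fills the role of the `half3x3` constructor, fed by the object space Tangent, Bitangent, and Normal vector nodes:

```hlsl
// Sketch: transform the blended object space triplanar normal into mesh
// tangent space. Inputs are the mesh's object space normal and tangent
// (with the bitangent flip sign stored in tangent.w, as Unity does).
half3 ObjectToTangentNormal(half3 triplanarNormalOS,
                            half3 meshNormalOS, half4 meshTangentOS)
{
    // Reconstruct the object space bitangent from normal, tangent, and sign
    half3 bitangentOS = cross(meshNormalOS, meshTangentOS.xyz) * meshTangentOS.w;

    // The rows of the object-to-tangent matrix are the tangent frame vectors
    half3x3 objectToTangent = half3x3(meshTangentOS.xyz, bitangentOS, meshNormalOS);

    return mul(objectToTangent, triplanarNormalOS);
}
```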
Here’s an object space triplanar subgraph with inputs and outputs for albedo and normals. It uses the same basic setup as the Triplanar node that ships with Unity’s Shader Graph, but replaces the incorrect world space to tangent space transform with an object space to tangent space transform.
This doesn’t have all the features I implemented in my article (neither does the built-in node), but it should serve as a good example. Maybe someday I’ll update the article, or write another one covering object space triplanar.
Wow, thanks so much bgolus, I literally couldn’t have asked for a better reply! I really appreciate you taking the time to explain that so clearly, and I’m looking forward to reconstructing your example subgraph shortly! There are some completely new techniques to me here, such as the Matrix Construction node, which there’s no way I would have worked out without assistance.
Hi bgolus, first, thank you for your very instructive article on normal mapping for triplanar shaders. I am an artist, very bad at maths and shader writing, but your article gave me a better understanding of normal maps.
I wish to make a triplanar shader with normal maps for rotating asteroids (which are procedurally generated meshes).
I assumed the example graph you gave here was what I needed and would work for my case; however, I re-did it in Amplify Shader Editor and I still get what looks like flipped normals on some faces when the object’s rotation is not 0,0,0.
Here is the graph and some screenshots, is there something I could adjust to correct the issue?
Also, as I said, I am very bad at shader code, and I failed to reproduce the height map triplanar blend you described in your article in my graph. Could you help me with it? That would be appreciated!
With this graph, the normal map reacts fine with my directional light on every side of the object, as long as the object is not rotated.
With the ASE triplanar node, there are always normal details facing the wrong direction on some sides, whatever the object’s rotation values.
Checking locally, using these settings, I didn’t see any problems with the normals on a default Unity sphere. I’m testing with a pretty old version of ASE though (1.5.3), so maybe it got broken at some point?
Just to double check, your asteroid mesh has vertex tangents, yes?
Also I still can’t see anything wrong with your node based version that would cause the issues you’re seeing, so the only thing I can think is the mesh’s tangents are bad.
My bad! Sorry. The problem was coming from my normal texture, which was exported in the wrong format from Substance.
It looks much better now! It works well with both the ASE triplanar sampler and your graph. I will continue with yours, as it gives more control. Thank you for this.
Just one more thing, concerning the blending between faces: I tried to make a height map triplanar blend in ASE based on your method, but I couldn’t make it work with the nodes; I am struggling with the maths. Could you help me with it?
You need a height map for each face, sampled with the triplanar UVs but not blended together, so you can’t use the existing triplanar node here. I would suggest packing the height into the alpha of the main albedo texture.
If you’re trying to replicate the version I have in my blog post, then take the height values directly from each texture and pipe them into the append node you currently have that triplanar node piped into. The other bug you have in that node graph is that you’re using the original textures’ values as the inputs to the max node chain. That should be the max of the components from the add.
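Put another way, the height based blend from the blog post looks something like this in HLSL — a sketch, where `blend` is the usual normalized `abs(normal)` weighting, `heightX`/`heightY`/`heightZ` are the per-projection height samples (unblended), and `_HeightmapBlending` is an illustrative property controlling the blend width:

```hlsl
// Add the unblended height samples to the scaled geometric blend weights;
// this is the "add" feeding the append node mentioned above
half3 heights = half3(heightX, heightY, heightZ) + (blend * 3.0);

// Only keep weights within _HeightmapBlending of the tallest component;
// this is what the max node chain should be computing
half heightStart = max(max(heights.x, heights.y), heights.z) - _HeightmapBlending;
half3 h = max(heights - heightStart, 0.0);

// Renormalize so the three weights sum to one
blend = h / (h.x + h.y + h.z);
```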
There are also a number of single component floats that should be Vector3 values to work properly. Some platforms won’t behave the same with floats (they’ll either outright error, or produce different results).
Thank you @bgolus for sharing your subgraph for object space triplanar. I’ve been trying to replicate it but I seem to have missed something. I’ve gone through the graph several times now but I can’t find the issue.
@cryptoforge Oh, and I’m not intentionally ignoring you. I can’t actually find that subgraph anymore. I’ve replaced my computer since I posted that and I don’t think it got copied over to my new machine.
It fixed most of the issue, but now I’m getting totally black faces on some meshes. It might be an issue with the mesh itself, though I don’t get it when using world space triplanar normals. What do you think?
Hmm. That’s a super common issue to see with triplanar mapping. I’ve seen this bug on stuff in the asset store, and it was a bug that popped up in some of the shaders I was writing for my article … it’s usually something small, but I can never remember exactly what causes it. Probably some solitary value plugged in wrong some place. Sorry. It happens when the input normal is exactly (0,1,0) (and it always seems to be broken for that specific normal every time I see it, too…).