Hi,
I’m currently working in 3ds Max at a very small scale. Importing my models into Unity isn’t a problem, and although they appear small, I can easily scale them up. The problem occurs when I want Unity to calculate the normals: I get black areas appearing on the mesh where the poly density is high (i.e. where the polygons are smallest).
Attaching an FBX for reference. It’s a simple plane featuring big and small polys to illustrate the problem. It works fine with the default import settings, but not when Normals is set to Calculate.
Any help with this would be much appreciated as scaling the objects in Max isn’t really an ideal solution.
Many thanks.
Well… I took a look at it in Maya, and the normals all seem fine (pointing in positive Z).
The scaling factor on import is easily corrected by selecting the imported model and setting the Scale Factor in its import settings to 1.
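If you’d rather not set that by hand on every re-import, a small editor script can do it for you. This is just a sketch (the file name check is hypothetical), and it has to live in an Editor folder:

```csharp
using UnityEditor;

// Sketch: force the import scale factor to 1 for a particular model.
// Must be placed in an "Editor" folder. The file name check is illustrative.
public class ScaleFixImporter : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        if (assetPath.Contains("MyTinyModel")) // hypothetical file name
        {
            ModelImporter importer = (ModelImporter)assetImporter;
            importer.globalScale = 1f;
        }
    }
}
```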
Leaving the “Normals” setting on Import, it works just fine for me, as the normals are already baked in by 3ds Max, so I don’t understand why you would want Unity messing around with that; after forcing it to Calculate, it screws them up. So why not just use the ones that are already baked? (The same goes for tangents, if they existed in the model.)
I took a look as well, but in Unity. I made a simple script to “inspect” individual triangles of that mesh. It turns out that Unity simply wasn’t able to calculate the normals because some edges of the triangles are shorter than 0.000008. Unity uses floats (32-bit single-precision), which only carry about seven significant decimal digits, so at that scale there’s almost no precision left to work with.
However, the real problem arises when you try to calculate the cross product of two such edges: the result is so close to zero that normalizing it just gives you (0,0,0). Unity simply uses the vector (0,1,0) in this case, or at least that’s what your calculated normals look like. Like @tswalk said, if you exported normals and they are correct, why do you want to recalculate them in Unity?
You’ve simply hit the technical limits of floating-point math.
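For reference, the inspection script I used boiled down to something like this (a rough sketch; the class name and logging are just illustrative):

```csharp
using UnityEngine;

// Minimal sketch of a triangle-inspection script. Attach it to the imported
// model and watch the console for tiny edges and zero-length cross products.
public class TriangleInspector : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().sharedMesh;
        Vector3[] verts = mesh.vertices;
        int[] tris = mesh.triangles;

        for (int i = 0; i < tris.Length; i += 3)
        {
            Vector3 a = verts[tris[i]];
            Vector3 b = verts[tris[i + 1]];
            Vector3 c = verts[tris[i + 2]];

            Vector3 e1 = b - a;
            Vector3 e2 = c - a;
            Vector3 n = Vector3.Cross(e1, e2);

            // Vector3.normalized returns (0,0,0) when the vector is too small,
            // which is exactly what happens with these tiny triangles.
            Debug.Log($"Triangle {i / 3}: edges {e1.magnitude} / {e2.magnitude}, " +
                      $"cross magnitude {n.magnitude}, normalized {n.normalized}");
        }
    }
}
```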
ps:
I probably shouldn’t say this, but Unity could actually fix this problem by normalizing the edge vectors first and only then taking the cross product. However, that would mean normalizing two or three times per normal instead of once, which for most “normal” meshes would be unnecessary overhead. If you want, you can calculate the normals manually that way, but I would suggest simply not using such a small mesh, as it could give you other problems as well with physics / MeshCollider / …
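The idea would look something like this (just a sketch; StableNormals is my own name, and I divide by the magnitude by hand because Vector3.Normalize treats very small vectors as zero):

```csharp
using UnityEngine;

// Sketch: face-averaged vertex normals, but with the edge vectors normalized
// *before* the cross product so tiny triangles don't collapse to (0,0,0).
public static class StableNormals
{
    public static void Recalculate(Mesh mesh)
    {
        Vector3[] verts = mesh.vertices;
        int[] tris = mesh.triangles;
        Vector3[] normals = new Vector3[verts.Length];

        for (int i = 0; i < tris.Length; i += 3)
        {
            int i0 = tris[i], i1 = tris[i + 1], i2 = tris[i + 2];

            // Normalizing first keeps the cross product well away from zero,
            // at the cost of the extra normalizations mentioned above.
            Vector3 e1 = SafeNormalize(verts[i1] - verts[i0]);
            Vector3 e2 = SafeNormalize(verts[i2] - verts[i0]);
            Vector3 n = SafeNormalize(Vector3.Cross(e1, e2));

            normals[i0] += n;
            normals[i1] += n;
            normals[i2] += n;
        }

        for (int i = 0; i < normals.Length; i++)
            normals[i] = SafeNormalize(normals[i]);

        mesh.normals = normals;
    }

    static Vector3 SafeNormalize(Vector3 v)
    {
        // Divide manually: Vector3.Normalize returns zero for vectors below
        // its internal epsilon, which is exactly the case we want to handle.
        float m = v.magnitude;
        return m > 0f ? v / m : Vector3.zero;
    }
}
```

Note this gives smooth (averaged) normals per vertex index; it doesn’t try to reproduce hard edges or smoothing groups.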
Many thanks for the quick responses to this question!
Just to clarify: I can’t really scale my models up in 3ds Max, as this would cause issues further back in the pipeline. If I could (and I wish it were that simple), this would be the ideal solution!
I also can’t use the normals calculated by 3ds Max, as I’m morphing (using blend shapes). I need Unity to calculate the normals in order for the models to display correctly when morphed. Imagine a box morphing into a sphere: using the imported normals would still show the hard edges of the box even when it’s displayed as a sphere, whereas if you tell Unity to calculate the normals you get a nice smooth sphere.
Thanks again.