My models are rendered black when I use a scaling transform to reduce the size of a model in my scene. Things start to go wrong when I use a scale of less than about 1/14677. I suspect the cause is numeric precision being lost somewhere in the render pipeline, but it surprised me that things break at the scales I’m using, as they don’t seem that extreme.
I’ve cooked up the simplest example I could, using a plain Unity cube and a script that dynamically scales up the vertices and scales down the transform, which let me pin down the exact scaling value at which the mesh starts appearing black. See the two screen grabs below, and the script I used below them.
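For reference, a minimal sketch of the kind of test script described above (the original script isn’t reproduced here, so the component and field names are my own assumptions):

```csharp
using UnityEngine;

// Multiply the cube's vertices by a factor and divide its transform scale
// by the same factor: the world-space size stays constant while the
// transform scale shrinks towards the problem threshold.
[RequireComponent(typeof(MeshFilter))]
public class ScaleTest : MonoBehaviour
{
    public float factor = 14677f;   // roughly where the mesh turns black

    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] vertices = mesh.vertices;
        for (int i = 0; i < vertices.Length; i++)
            vertices[i] *= factor;                   // scale the mesh data up
        mesh.vertices = vertices;
        mesh.RecalculateBounds();

        transform.localScale = Vector3.one / factor; // scale the transform down
    }
}
```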
My question is… Is there any way to avoid my scaled-down models being rendered black, other than applying the scaling directly to the mesh vertex positions?
But really, why would you need to scale it that low? It would probably be better to find an alternative way to achieve what you want without scaling down to such an extremely small number.
Is it that low? What number would you consider reasonable: 10, 100, 1000? One reason I’m using this kind of scale is that I import models at runtime, and they come in arbitrary units. If, for example, I loaded a file that used millimetre units, I would already have scaled it down by 1000 to get it into metres, which doesn’t leave much wiggle room to scale it down further. If I then want to resize it to 1/15, the combined transform scale is 1/1000 × 1/15 = 1/15000, which is past the ~1/14677 threshold, so it goes black unless I go through the kerfuffle of baking the millimetres-to-metres scaling directly into the vertex positions.
So I’m really just asking whether there is a way to achieve what I want without having to fiddle with the actual mesh data after I have loaded a model at runtime. No big deal if not.
If you consider it a bug, you can file a bug report. But I would think it’s easier to just avoid that kind of scaling. Perhaps you could scale the mesh data directly after loading it. It would increase loading time a little, but shouldn’t be too bad, I think.
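Something along these lines, as a minimal sketch (the MeshScaler name and unitScale value are my own inventions for illustration):

```csharp
using UnityEngine;

// Bake an import-unit scale into the mesh data after a runtime load,
// so the transform's localScale can stay near 1.
public static class MeshScaler
{
    public static void BakeScale(Mesh mesh, float unitScale)
    {
        Vector3[] vertices = mesh.vertices;
        for (int i = 0; i < vertices.Length; i++)
            vertices[i] *= unitScale;   // e.g. 0.001f for millimetres -> metres
        mesh.vertices = vertices;
        mesh.RecalculateBounds();       // old bounds no longer match the new size
        // Normals are direction vectors, so a uniform scale leaves them alone.
    }
}
```

With the units baked in like that, a further transform scale of 1/15 stays well above the problem threshold.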
Alternatively - the model could be scaled correctly in the DCC tool, and then no scaling would be needed on import.
I agree with eX - avoiding or solving a problem before it gets into the engine is the best approach.
Also agree that if this looks like a bug, a bug report should be filed.
OK, rescaling the vertex data directly it is. Thanks for your replies.
I’m still curious, though, about exactly why this effect occurs, because in theory at least the transformation of normals should be unaffected by a uniform scaling transform. What I mean is: shouldn’t the transformation of a normal from local space to world space only involve the rotations? If not, it would seem an extra normalisation of the world-space normal is required per vertex, per frame?
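For reference, the standard derivation (general graphics maths; how Unity’s pipeline handles this internally is an assumption on my part):

```latex
% Normals transform by the inverse transpose of the model matrix M, so that
% they stay perpendicular to transformed tangents t' = M t:
\[
  n'^{\top} t' = 0 \quad\Rightarrow\quad n' = (M^{-1})^{\top} n .
\]
% For a uniform scale, M = sR with R a pure rotation, so
\[
  (M^{-1})^{\top} = \tfrac{1}{s}\, R ,
\]
% i.e. the rotated normal comes out with magnitude 1/s and must be
% renormalised per vertex. With s = 1/15000 that magnitude is 15000;
% squaring it during normalisation (computing |n'|^2) is the sort of place
% where low-precision shader arithmetic could plausibly blow up.
```

So the rotation-only intuition is right up to a scale factor, but that scale factor is exactly where the extreme values sneak in.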