Invalid Matrix

I’m trying to make my GUI resolution independent (so it scales with the resolution). It seems everyone else has already found a solution to this, but I ran into a problem with that solution.
I think most of you know this piece of code:

GUI.matrix = Matrix4x4.TRS (Vector3(0, 0, 0), Quaternion.identity, Vector3(Screen.height / nativeVerticalResolution, Screen.height / nativeVerticalResolution, 1));

var scaledResolutionWidth = nativeVerticalResolution / Screen.height * Screen.width;
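
For context, here is roughly how I’m using it inside OnGUI (the button is just a placeholder to show the layout idea):

function OnGUI () {
	GUI.matrix = Matrix4x4.TRS (Vector3(0, 0, 0), Quaternion.identity, Vector3(Screen.height / nativeVerticalResolution, Screen.height / nativeVerticalResolution, 1));
	var scaledResolutionWidth = nativeVerticalResolution / Screen.height * Screen.width;

	// lay everything out as if the screen were scaledResolutionWidth x 768
	GUI.Button(Rect(scaledResolutionWidth - 110, 10, 100, 30), "Menu");
}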

The problem is that if I use it in my OnGUI function (of course with a value instead of “nativeVerticalResolution”), Unity gives the following error:

"Ignoring invalid matrix assigned to GUI.matrix - the matrix needs to be invertible. Did you scale by 0 on Z-axis?"

Does anyone know a solution?

This error message means that your Vector3(Screen.height / nativeVerticalResolution, Screen.height / nativeVerticalResolution, 1) evaluates to something the GUI matrix can’t use. Check the values of Screen.height and nativeVerticalResolution; if one of them (or both) is 0, NaN or Infinity, that’s your problem.

Well, I checked it and nothing is 0, null or infinity. My game’s graphics are designed for a 1024 x 768 resolution, so nativeVerticalResolution in my case is 768. The problem occurs when I resize the game window to a height smaller than 768.

OK, I’ve been trying to fix this all night and all day with no success. No idea what’s wrong; I printed the values and I don’t see anything wrong. Here’s my code:

GUI.matrix = Matrix4x4.TRS (Vector3(0, 0, 0), Quaternion.identity, Vector3(Screen.height / nativeVerticalResolution, Screen.height / nativeVerticalResolution, 1));

var scaledResolutionWidth = nativeVerticalResolution / Screen.height * Screen.width;

print(nativeVerticalResolution);
print(Screen.height);

This doesn’t print anything unusual. nativeVerticalResolution is defined at the beginning of the script like this:

var nativeVerticalResolution : int = 768;

I’m on Unity 3, if that matters.

Ah sorry, I should have noticed right away. Dividing two ints gives an integer result, meaning that Screen.height / nativeVerticalResolution evaluates to zero whenever Screen.height < nativeVerticalResolution. For every integer division, cast your ints to floats before dividing, like this:

(Screen.height as float) / (nativeVerticalResolution as float)

Thanks for the fast reply! That helped! I’ve never seen that “as” operator before in any scripting language (lack of experience), and it didn’t work for me here, so I just used parseFloat() instead.
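
For anyone hitting the same thing, here’s a minimal sketch of what I ended up with (nativeVerticalResolution stays an int, parseFloat forces the float division, and the scaledResolutionWidth line needs the same treatment, so I derive it from the already-float scale):

var nativeVerticalResolution : int = 768;

function OnGUI () {
	// parseFloat forces float division, so the ratio is no longer truncated to 0
	var scale : float = parseFloat(Screen.height) / nativeVerticalResolution;
	GUI.matrix = Matrix4x4.TRS (Vector3(0, 0, 0), Quaternion.identity, Vector3(scale, scale, 1));

	// equivalent to nativeVerticalResolution / Screen.height * Screen.width, but stays float
	var scaledResolutionWidth : float = Screen.width / scale;

	// ... draw the GUI using the 1024 x 768 layout ...
}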

I was using ScaleAroundPivot, and in a number of cases this error message appeared in the console:

"Ignoring invalid matrix assinged to GUI.matrix - the matrix needs to be invertible. Did you scale by 0 on Z-axis?"

However, since I was only using GUIUtility.ScaleAroundPivot, I wasn’t scaling on the Z-axis at all.

The reason for the error was that at a particular moment I had a zero value for the X or Y scale (I had some tweening in place that started from 0).

To reproduce it, try this:

var scale = new Vector3(0f, 0f); // zero scale on X and Y
GUIUtility.ScaleAroundPivot(scale, Vector2.zero); // triggers the error; the pivot point doesn't matter here

So, for my tweens, I implemented an internal check that fixes this problem dynamically:

if (scale.x == 0)
	scale.x = MinScaleX;
if (scale.y == 0)
	scale.y = MinScaleY;

Where MinScaleX and MinScaleY are static variables with a default of 0.0001f (the exact numbers don’t matter, they just need to be greater than 0).
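
Put together, the check sits right before the ScaleAroundPivot call, roughly like this (UnityScript sketch; tweenedScale is just an illustrative name for whatever field your tween writes into):

static var MinScaleX : float = 0.0001;
static var MinScaleY : float = 0.0001;
var tweenedScale : Vector2 = Vector2(1, 1);

function OnGUI () {
	var scale : Vector2 = tweenedScale;
	if (scale.x == 0)
		scale.x = MinScaleX;	// never let either axis hit exactly 0,
	if (scale.y == 0)
		scale.y = MinScaleY;	// otherwise the GUI matrix is not invertible
	GUIUtility.ScaleAroundPivot(scale, Vector2(Screen.width / 2.0, Screen.height / 2.0));
	// ... draw the scaled GUI here ...
}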

Hope this helps! 🙂

I got the exact same issue; sometimes it happened and sometimes it didn’t. The reason was also a tweened value, driven by iTween.valueTo(), even though it started at 1.0F. Apparently the updated value from the iTween method (which is delivered via SendMessage) didn’t arrive quickly enough, and for the first 1-2 frames OnGUI caught a 0 value.
It works with the minScale workaround, though.
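
For reference, the receiving side of my workaround looks roughly like this (UnityScript sketch; guiScale and UpdateGuiScale are just illustrative names for the field and the callback the tween updates through SendMessage, and Mathf.Max is the same idea as the MinScale check above):

var guiScale : float = 0.0;	// still 0 until the tween delivers its first update

function UpdateGuiScale (value : float) {
	guiScale = value;	// called by the tween via SendMessage
}

function OnGUI () {
	var scale : float = Mathf.Max(guiScale, 0.0001);	// clamp away from 0 so the matrix stays invertible
	GUIUtility.ScaleAroundPivot(Vector2(scale, scale), Vector2(Screen.width / 2.0, Screen.height / 2.0));
	// ... draw GUI ...
}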