I don’t understand why Unity’s coordinate system is bottom-left based, while the Rect structure is top-left based. Very weird. As far as I understand, the whole GUI system is also top-left based.
If you use those things in one function, things get messy fast. I just want to hear opinions on how you guys deal with this.
I’d guess most programmers first saw C# in the free Microsoft Visual Studio, or maybe Microsoft’s free C#/XNA for games (sort of). The “Forms” environment sets up a standard, easy-to-use, drag-and-drop GUI menu system (the kind of thing I imagine the EasyGUI add-on does for you – I have no involvement with EasyGUI).
Microsoft C# uses a top-left, units-are-pixels coordinate system for everything. It seems pretty odd, but back in the early days of web pages, the layout engine always worked from the upper-left, down and across. The bottom was just wherever it landed. I believe JavaScript also used top-left coords for the same reason (it was originally created to add some code to web pages). Even things like “50% across” were suspect, since you didn’t know the width until you had written the entire page and knew the longest line.
The Unity coord system gets even worse. GUILabels use viewport coords: 0–1/0–1 screen percentages, measured from the bottom-left. Camera.WorldToScreenPoint gives screen coords, which are in pixels, but with the origin at the bottom-left, so you have to flip Y before using the result in a Rect.
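For example, that Y-flip inside OnGUI might look something like this (just a sketch – `cam` and `worldPos` are placeholder names for your camera and the world-space position you want to label):

```csharp
// Camera.WorldToScreenPoint returns pixels with the origin at the
// BOTTOM-left; GUI/Rect coordinates put the origin at the TOP-left.
Vector3 screenPos = cam.WorldToScreenPoint(worldPos);
float guiY = Screen.height - screenPos.y;   // flip Y into Rect space
GUI.Label(new Rect(screenPos.x, guiY, 100f, 20f), "Hello");
```

The flip is just `Screen.height - y`; the X axis already agrees between the two systems.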
The thing is, we’re already converting Z-up (3DS-Max) to Y-up, feet to meters, 2D grid centers into world coords, ViewPort to Screen, degrees to radians. I believe it’s pretty normal to assume there will be a conversion, and to be in the habit of always thinking about which coordinate system you’re in and which you need.
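To illustrate, most of those conversions mentioned above are one-liners once you name them (a sketch – `feet`, `degrees`, `p`, `vpPt`, and `cam` are placeholder variables; only `Mathf.Deg2Rad` and `ViewportToScreenPoint` are actual Unity APIs):

```csharp
const float FeetToMeters = 0.3048f;
float meters  = feet * FeetToMeters;                 // feet -> meters
float radians = degrees * Mathf.Deg2Rad;             // degrees -> radians (Unity constant)
Vector3 yUp   = new Vector3(p.x, p.z, p.y);          // Z-up (Max) -> Y-up, one common mapping
Vector3 px    = cam.ViewportToScreenPoint(vpPt);     // 0-1 viewport -> pixel coords
```

Keeping conversions like these in small named helpers makes it obvious which coordinate system each value is in.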