Two BIG GUI questions...

I’m trying to design a GUI (buttons and read-outs) that will work with almost any screen resolution and aspect ratio the user may have, but it’s getting pretty complicated. What really messes things up is when I try to have lots of buttons (like a control panel) and then design the layout for both “wide screen” and “normal” 1.33 aspect ratio screens. So here are my 2 BIG questions:

  1. Is there some clever way of handling differing screen aspect ratios without ending up with distorted GUI elements? I’ve written some code to maintain an element’s aspect ratio no matter what the screen resolution is, but the problem is that when a user has a widescreen display I end up with “vacant” screen space. :?

  2. Is there an easier way to handle -multiple- screen buttons other than using individual GUITextures which each have to be precisely positioned and scaled? I wish there were some way to take a single image, slice and dice it (a la the way we do it for web graphics) and then assign “hot spots” which link to something. Is there an equivalent way in Unity?

Any BIG answers to my BIG questions? :lol: (Heck, I’d settle for little answers, too!)

  1. doing a pixel-correct GUI kind of solves this (assuming pixels on the screen are actually square, of course). So if you want a bunch of stuff at the “lower left corner”, set the position to zero, and then position each element only with pixel insets (a sketch follows below this list).

  2. you could implement it yourself somehow, I guess. Have a single big element, and then detect the hit region yourself.
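
A minimal sketch of point 1, assuming the legacy GUITexture component; the script name, inset values, and sizes below are just placeholders, not anything from the posts above:

```csharp
using UnityEngine;

// Pin a GUITexture to the lower-left corner and size it purely in pixels,
// so it stays undistorted at any resolution or aspect ratio.
public class CornerPinnedElement : MonoBehaviour
{
    public int insetX = 20;    // pixels from the left edge
    public int insetY = 20;    // pixels from the bottom edge
    public int width = 128;
    public int height = 64;

    void Start()
    {
        GUITexture tex = GetComponent<GUITexture>();
        // position zero = lower-left corner of the screen in viewport space
        transform.position = Vector3.zero;
        // zero scale, so the pixel inset alone defines the on-screen rectangle
        transform.localScale = Vector3.zero;
        tex.pixelInset = new Rect(insetX, insetY, width, height);
    }
}
```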

Thanks for the reply Aras.

  1. I’m not sure I’m following your first comment. I’m actually dealing with two issues: one is preventing distortion from one screen aspect ratio to another. The other is what happens when I design using “square” pixels: there ends up being a lot of “empty” space on a widescreen display. I never really considered this issue before, but now that I have an iMac (1440 x 900 pixels) and have noticed that about 3/4 of the notebooks I’ve seen on the market also have widescreens, I’m pretty much forced to deal with it.

I found this blog post which discusses the issue and how they handle it in web design. Basically they test for the screen resolution at startup and then have a different screen layout for each resolution.

  1. Hmmm… I might be able to come up with something like that, but once you add the “roll over” and “mouse click” elements, it might get kinda complicated.

I was actually thinking that this might possibly be a good editor script: select a texture, enter the coordinates for the grid and then have it create the pieces and coordinates for you…
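
Something along the lines of Aras’s suggestion 2, sketched against the legacy GUITexture/GUIElement API; the grid size and the log message are just placeholders:

```csharp
using UnityEngine;

// One big GUITexture acting as a control panel; clicks are mapped to a
// grid of "hot spots" instead of using one GUITexture per button.
public class GridHotSpots : MonoBehaviour
{
    public int columns = 4;
    public int rows = 2;

    void Update()
    {
        if (!Input.GetMouseButtonDown(0))
            return;

        GUITexture panel = GetComponent<GUITexture>();
        if (!panel.HitTest(Input.mousePosition))
            return;

        // GetScreenRect gives the panel's on-screen rectangle in pixels,
        // with the origin at the lower left, like Input.mousePosition
        Rect r = panel.GetScreenRect();
        int col = (int)((Input.mousePosition.x - r.x) / (r.width / columns));
        int row = (int)((Input.mousePosition.y - r.y) / (r.height / rows));
        Debug.Log("Hot spot clicked: column " + col + ", row " + row);
    }
}
```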

The first is not an issue on non-stretched resolutions. A 10x10 pixel element is still square on both a 1024x768 4:3 monitor and a 1440x900 16:10 monitor.

Of course, bad things happen when you run 1024x768 on a 16:10 monitor, but then everything is stretched, not just the GUI. Currently we don’t have a solution for this, but our plan is just to black-border these cases, much like in the game view.

The empty-space problem is entirely a GUI design question, not a technical one: “what do I do with this extra space”. Maybe anchor some part of GUI to left side, another part to right side, something to the middle, and on widescreen the gap between parts just increases. Maybe do something else.
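
A rough sketch of that anchoring idea with legacy GUITextures (the element names, margins, and sizes are placeholders); only the gap between the two parts changes as the screen gets wider:

```csharp
using UnityEngine;

// Pin one element to the left edge and one to the right edge; on a wider
// screen only the empty space between them grows.
public class EdgeAnchors : MonoBehaviour
{
    public GUITexture leftElement;
    public GUITexture rightElement;
    public int margin = 10;
    public int width = 100;
    public int height = 50;

    void Start()
    {
        // viewport x = 0 is the left edge; the inset pushes the element inwards
        leftElement.transform.position = new Vector3(0f, 0f, 0f);
        leftElement.transform.localScale = Vector3.zero;
        leftElement.pixelInset = new Rect(margin, margin, width, height);

        // viewport x = 1 is the right edge; a negative inset pulls it back on screen
        rightElement.transform.position = new Vector3(1f, 0f, 0f);
        rightElement.transform.localScale = Vector3.zero;
        rightElement.pixelInset = new Rect(-margin - width, margin, width, height);
    }
}
```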

Thanks Aras,

You’re absolutely right. I probably should have posted this in a different section of the forum, since there probably isn’t anything you could add to Unity that would fix this issue.

To be honest, I’m surprised that more people aren’t concerned with this issue. The world of computer users is slowly transitioning from what was the standard 4:3 monitor to various “widescreen” resolutions and aspect ratios. I’ve only seen a few games that are widescreen friendly, so does that mean everyone else is satisfied with distorted HUDs and the like in their games?

For my particular situation I’ve decided that the only viable solution is to create a different GUI for the most common screen resolutions and aspect ratios.
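
For example (a rough sketch only; the layout object names and the 1.5 threshold are assumptions, not anything decided above): check the aspect ratio at startup and enable whichever pre-built layout fits.

```csharp
using UnityEngine;

// Enable one of several pre-built GUI layouts based on the screen's
// aspect ratio at startup.
public class LayoutChooser : MonoBehaviour
{
    public GameObject standardLayout;   // built for ~4:3 screens
    public GameObject wideLayout;       // built for 16:10 / 16:9 screens

    void Start()
    {
        float aspect = (float)Screen.width / Screen.height;
        bool wide = aspect > 1.5f;      // rough split between 4:3 and widescreen
        standardLayout.SetActiveRecursively(!wide);
        wideLayout.SetActiveRecursively(wide);
    }
}
```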

Forget the GUI that Unity provides, because it is very poor and limited. Use 3D elements attached to the camera so they are always facing it, and create a 3D GUI instead. This way you can resize and move the elements depending on the resolution without getting artifacts, and in Unity you can even drag and drop 3D content! I did a kind of window manager in Unity, but the code is pretty messed up, so I will remake it soon for the application I’m in charge of.

Omar Rojo
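
A rough sketch of the camera-attached 3D panel idea Omar describes, assuming a simple textured quad and a perspective camera; the sizing fractions are placeholders:

```csharp
using UnityEngine;

// Parent a textured quad to the camera so it always faces it, and size it
// from the camera frustum so it covers the same fraction of the view at
// any resolution or aspect ratio.
public class CameraAttachedPanel : MonoBehaviour
{
    public Camera guiCamera;
    public float distance = 1f;            // how far in front of the camera the panel sits
    public float widthFraction = 0.25f;    // fraction of the view width to cover
    public float heightFraction = 0.1f;    // fraction of the view height to cover

    void Start()
    {
        transform.parent = guiCamera.transform;
        transform.localRotation = Quaternion.identity;
        transform.localPosition = new Vector3(0f, 0f, distance);
    }

    void Update()
    {
        // frustum height at this distance, from the camera's vertical field of view
        float viewHeight = 2f * distance * Mathf.Tan(guiCamera.fieldOfView * 0.5f * Mathf.Deg2Rad);
        float viewWidth = viewHeight * guiCamera.aspect;
        transform.localScale = new Vector3(viewWidth * widthFraction, viewHeight * heightFraction, 1f);
    }
}
```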

What most people seem to be doing is that they anchor the GUI elements to screen corners. We do this in GC:Palestine - if you increase res, the elements go further apart. Works pretty well in many cases.

What are people’s thoughts on vector GUIs?

http://www.tulrich.com/geekstuff/gameswf.html

Since I often design my GUIs in a vector app first (Corel, usually), one of the things I was thinking was how much easier this would all be if I could just do it in Flash. Button roll-overs, scaled graphics, etc. would all be so much easier if I could just place a .SWF to fill a portion of the screen and have it scale with the screen size without worrying about distortion.

So, yeah, I’d like to see it!

Chiming in here; I generally like the idea of vector GUIs for what they bring (though I tend to use 3D objects myself), but perhaps there are good ‘non-resource’ related reasons not to include them in Unity at this time:

For example, as I understand the situation, there are no free (or cross-platform) tools for turning usable GUIs into .swf files. This may not affect the professional user (who shells out for Flash), but it remains yet another barrier to entry for new game makers.

Also, having to learn yet another program (with its own interface and its own scripting language) in order to create GUIs in Unity could be another barrier.

Either or both of these factors would tend to decrease Unity’s accessibility.

Just food for thought… :slight_smile:

About getting SWF support in:
I’m not really concerned about people having to buy flash. I also assume most people are using Photoshop, and I don’t feel bad about that either :wink:

However, this is not very likely to be a core Unity feature. One of the main reasons is that GameSWF “…has bugs and quirks, and is not a complete implementation of Flash…”, and I just don’t wanna have to support that kind of software. If we say we handle Flash, then we need 100% of Flash, not 70 or 80.

And since this is a hobby project from an industry pro (and I mean really pro ;-)), there’s absolutely no-one we can turn to when somebody hits something it doesn’t do. Before long we’d be spending too much time re-implementing a Flash renderer. I’d much rather spend the time on Unity :wink:

That being said, if someone wants to pick this up, it would make perfect sense as a 3rd-party plugin: it has its own complete functionality and very little contact with the main engine. If someone wants to do this, I’m sure there would be plenty of help and hints. I like the idea, but integrating pre-alpha software into core Unity is just a no-go.