(mods/UTech: please feel free to modify and extend this post)
Can I release games made with the beta?
You can, but Unity recommends that you don’t rely on it - it’s a beta, which means there are bugs. If it works well enough for what you’re doing, though, go right ahead. Note that the beta does not currently include full Web Player support, because the 4.6 Web Player is in beta as well: only other developers using the 4.6 beta Web Player will be able to play games targeting it, and such content displays a red notice.
How many more beta builds will there be before 4.6 is officially ‘finished?’
Nobody knows (no, really). It all depends on whether people keep finding bugs and what those bugs are. The UI team will continue releasing beta builds as needed to keep putting the latest changes into your hands for testing, and when the remaining issues are either too big for 4.6 or too small to worry about, they’ll call it done.
Which 4.5 changes are in the 4.6 beta and when do 4.5 changes get applied to 4.6?
You can see which 4.5 changes are in each beta on the release notes page. 4.6 will only pull changes from 4.5 when there is a 4.5.X release (e.g. 4.5.4), not patch releases (e.g. 4.5.3p3).
How do I stop clicks/touches on the UI from ‘going through it’ and being clicks in my game world?
Use EventSystem.current.IsPointerOverGameObject() to check whether the mouse is over a GUI element before you process your game world clicks. See this post for an example. Also, on mobile, you may need to specify which finger you’re asking about - see this thread for details.
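As a minimal sketch (the component name is illustrative), the check looks something like this for both mouse and touch input:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

public class WorldClickHandler : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // Ignore the click if the pointer is currently over a UI element.
            if (EventSystem.current.IsPointerOverGameObject())
                return;
            // ... handle the game-world click here ...
        }

        // On mobile, pass the finger's pointer ID explicitly.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            if (EventSystem.current.IsPointerOverGameObject(Input.GetTouch(0).fingerId))
                return;
            // ... handle the game-world touch here ...
        }
    }
}
```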
How do I make a button have events for things other than OnClick?
Attach another component to the button that provides those events. You could use the built-in EventTrigger component, or, for a lighter-weight approach, write your own component that implements the IPointerEnterHandler and IPointerExitHandler interfaces.
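A sketch of the lighter-weight approach (the component name and log messages are illustrative):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Attach this to the same GameObject as the Button.
public class HoverEvents : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler
{
    public void OnPointerEnter(PointerEventData eventData)
    {
        Debug.Log("Pointer entered " + gameObject.name);
    }

    public void OnPointerExit(PointerEventData eventData)
    {
        Debug.Log("Pointer exited " + gameObject.name);
    }
}
```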
How do I use ColorTint with more than one Graphic?
Use the Animation transition mode to animate all your color changes, instead of using ColorTint. (Alternatively, you could derive your own component from Button or Selectable, override DoStateTransition, and implement whatever transition you need in there.)
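A sketch of the override approach, assuming a hypothetical Button subclass with an array of extra Graphics to tint (the class name, colors, and fade duration are all illustrative):

```csharp
using UnityEngine;
using UnityEngine.UI;

public class MultiTintButton : Button
{
    // Extra Graphics to tint alongside the button's own target graphic.
    public Graphic[] extraTargets;

    protected override void DoStateTransition(SelectionState state, bool instant)
    {
        base.DoStateTransition(state, instant);

        Color tint = (state == SelectionState.Pressed) ? Color.grey : Color.white;
        if (extraTargets == null)
            return;
        foreach (Graphic g in extraTargets)
        {
            if (g != null)
                g.CrossFadeColor(tint, instant ? 0f : 0.1f, true, true);
        }
    }
}
```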
How do I make UI that works with the Oculus Rift?
Use ‘World Space’ mode on the Canvas so that the UI can be seen by both left-eye and right-eye cameras. You may want to make the Canvas a child of your OVRController object so that it moves with the player. For ‘look-to-point’ style input, just set the Canvas’s Event Camera to one of the two eye cameras - it’s slightly inaccurate but good enough in most cases. (If it isn’t, you’ll need to write a custom input module). See also this article.
How do I make my UI scale up and down to different screen resolutions? I tried using anchors but things become tiny at large screen sizes.
The anchor system can ensure that the ‘frames’ of graphics and text change appropriately as the screen size changes, but it doesn’t do anything about the content in those frames - it doesn’t adjust things like your font sizes, which is why they look tiny at higher resolutions.
Instead you want to use the Scale property to change the size; the simplest approach is to attach a ReferenceResolution component to your base Canvas, which will automatically scale the entire canvas up or down to fit the screen resolution. Note that you will probably still want to use the anchor system for dealing with different aspect ratios.
Please read the documentation HOWTO on this.
How do I make UI elements have non-rectangular hitboxes?
Add a component that implements ICanvasRaycastFilter. In your implementation of IsRaycastLocationValid, reject all points that are ‘outside’ your custom hitbox shape. It’s up to you how you want to do that, but for example for a circular hitbox you could measure the distance between the point and your object’s center and reject the point if that distance is greater than your circle’s radius. Or perhaps you could look up the alpha value of the pixel the hit is on, as per this script by senritsu.
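The circular-hitbox case might be sketched like this (the component name is illustrative; the circle is inscribed in the element’s rect):

```csharp
using UnityEngine;

public class CircularRaycastFilter : MonoBehaviour, ICanvasRaycastFilter
{
    public bool IsRaycastLocationValid(Vector2 screenPoint, Camera eventCamera)
    {
        RectTransform rt = (RectTransform)transform;
        Vector2 local;
        // Convert the screen point into this element's local space.
        if (!RectTransformUtility.ScreenPointToLocalPointInRectangle(rt, screenPoint, eventCamera, out local))
            return false;
        // Accept only points inside a circle inscribed in the rect.
        float radius = Mathf.Min(rt.rect.width, rt.rect.height) * 0.5f;
        return (local - (Vector2)rt.rect.center).sqrMagnitude <= radius * radius;
    }
}
```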
How do I make UI elements be ignored by raycasting?
Either implement ICanvasRaycastFilter with an IsRaycastLocationValid method that always returns false, or use a CanvasGroup component with ‘Blocks Raycasts’ turned off.
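The first option is a one-liner (component name illustrative):

```csharp
using UnityEngine;

// Makes this element and its Graphic invisible to UI raycasts.
public class IgnoreUIRaycasts : MonoBehaviour, ICanvasRaycastFilter
{
    public bool IsRaycastLocationValid(Vector2 screenPoint, Camera eventCamera)
    {
        return false;
    }
}
```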
How do I attach callbacks to events from code?
At runtime, use the AddListener method to attach your callback - e.g. myButton.onClick.AddListener(MyCallback).
In the editor, use UnityEditor.Events.UnityEventTools.AddPersistentListener(myButton.onClick, MyCallback). This creates a ‘persistent’ listener, which can be loaded and saved as normal.
Be aware that the callback function you use for a persistent listener must be a member of a MonoBehaviour, ScriptableObject, or other UnityEngine.Object-derived class - this is required for serialization to work. If you wouldn’t be able to set up your listener by hand in the Inspector, you won’t be able to set it up via AddPersistentListener either.
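The runtime case might look like this (the field and method names are illustrative; the persistent-listener call goes in editor code, as noted above):

```csharp
using UnityEngine;
using UnityEngine.UI;

public class ButtonHookup : MonoBehaviour
{
    public Button myButton;

    void Start()
    {
        // Runtime listener - not serialized, must be re-attached each run.
        myButton.onClick.AddListener(MyCallback);
    }

    void MyCallback()
    {
        Debug.Log("Button clicked");
    }
}
```

From an editor script, the equivalent persistent hookup would be UnityEditor.Events.UnityEventTools.AddPersistentListener(myButton.onClick, MyCallback), subject to the serialization restriction described above.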
How do I create a UI element from a prefab and add it to a Canvas?
Use transform.SetParent(parent, false), not transform.parent = parent. Passing false for worldPositionStays keeps the element’s local position and scale, so it isn’t offset or rescaled by the Canvas’s world transform.
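A minimal sketch (the field names are illustrative; assign the prefab and the Canvas’s transform in the Inspector):

```csharp
using UnityEngine;

public class UISpawner : MonoBehaviour
{
    public GameObject buttonPrefab;
    public Transform canvasTransform;

    void Start()
    {
        GameObject go = (GameObject)Instantiate(buttonPrefab);
        // false: keep local position/scale, so the new element
        // isn't offset or rescaled by the canvas's world transform.
        go.transform.SetParent(canvasTransform, false);
    }
}
```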