Raycast into GUI?

When I used NGUI, I used raycasts a lot to detect which GUI elements are “under” my mouse. How can I do that with the new GUI system, since there is no camera I can shoot my raycast at?

thanks for your help!


Right now I’m using a World Space Canvas and attached colliders to the GUI objects (a single button, for now) I want to hit with a raycast. It works. But now I can’t figure out how to use the Event system to broadcast selection events etc. to the UI system. Keeps freaking out over wrong type of the object I’m passing to it.


For 3D colliders attach the PhysicsRaycaster to your camera. This will then allow the objects (cubes, spheres, etc…) to receive events such as OnDrag, OnPointerDown, and all the rest.
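To make that concrete, here is a minimal sketch (the class name is illustrative, not part of any API) of a component that receives pointer events on an object with a 3D collider, assuming a PhysicsRaycaster is on the camera and an EventSystem exists in the scene:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Illustrative example: attach to any object with a 3D collider.
// Requires a PhysicsRaycaster on the camera and an EventSystem in the scene.
public class ClickableObject : MonoBehaviour, IPointerDownHandler, IPointerEnterHandler
{
    public void OnPointerDown (PointerEventData eventData)
    {
        Debug.Log ("Pointer down on " + gameObject.name);
    }

    public void OnPointerEnter (PointerEventData eventData)
    {
        Debug.Log ("Pointer entered " + gameObject.name);
    }
}
```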

I do not want 3D colliders. I just want to detect if there is, for example, a button under my mouse cursor. I added 2D BoxColliders to my GUI items and the Physics2DRaycaster to my camera, but my raycast hits stay empty.

I’ve only been poking at the 4.6 beta for a few minutes, but for that scenario it might be worth adding an Event Trigger component to the UI button, clicking its ‘Add New’ button, and choosing ‘Pointer Enter’. Then click the plus to decide what you want to communicate with when that event is triggered.

I can’t say I’ve actually done anything with the new Event components yet, but that’s my initial hunch.

Edit: tried it quickly and it worked as I expected, yay.

You want to call IsPointerOverEventSystemObject on the EventSystem. It will return true if the cursor is over an event system object.


I would still like to know how to do this with a Raycast. I am using a different input method that doesn’t use a mouse but I have screen coordinates and I want to be able to call “highlight” or “pressed” events on buttons that are raycast from those screen coordinates.

Hi,

To do this you want to write a custom InputModule for the UI. It’s the designed way to send events to objects within the UI system.

If you look here you will find our implementations for the pointer input modules:

Thanks so much for the reply. It’s unclear to me when the functions in the pointer module get called (e.g. GetMousePointerEventData(), GetTouchPointerEventData(Touch input, out bool pressed, out bool released)). Presumably, to write a custom InputModule, I’d need to have these functions registered or called somehow.

Well, you don’t need to call them yourself; you just need to create valid event data (they are mostly helper functions). For example, you need to populate one with the position and the delta since the last frame. We do it like this:

        protected virtual PointerEventData GetMousePointerEventData()
        {
            // Fetch (or create) the cached event data for the mouse pointer.
            PointerEventData pointerData;
            var created = GetPointerData (kMouseId, out pointerData, true);

            pointerData.Reset ();

            // On first use, seed the position so the initial delta is zero.
            if (created)
                pointerData.position = Input.mousePosition;

            Vector2 pos = Input.mousePosition;
            pointerData.delta = pos - pointerData.position;
            pointerData.position = pos;

            // Raycast into the UI and keep only the first (nearest) hit.
            eventSystem.RaycastAll (pointerData, m_RaycastResultCache);
            var raycast = FindFirstRaycast (m_RaycastResultCache);
            pointerData.pointerCurrentRaycast = raycast;
            m_RaycastResultCache.Clear ();
            return pointerData;
        }

So we just create a PointerEventData and populate it with position and delta, then use it for a raycast. You want to do the same, but instead of using the mouse position, use your screen point. You may not even need to support delta, but you could compute it if you wanted.
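As a concrete sketch of that idea (the class and method names are illustrative; it assumes an EventSystem exists in the scene — note that in the 4.6 beta the singleton was reached via EventSystemManager.currentSystem, while later versions use EventSystem.current, so adjust to your version):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Illustrative sketch: raycast the UI from an arbitrary screen point
// instead of the mouse position.
public class ScreenPointRaycaster : MonoBehaviour
{
    public List<RaycastResult> RaycastAt (Vector2 screenPoint)
    {
        var pointerData = new PointerEventData (EventSystem.current);
        pointerData.position = screenPoint;

        var results = new List<RaycastResult> ();
        EventSystem.current.RaycastAll (pointerData, results);
        return results; // empty if nothing was hit under that point
    }
}
```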

I’m still pretty confused about how to do what the original poster requested. With NGUI, I used RaycastAll to get an array of hits under my mouse. With the new UI, I cannot seem to do this. I’ve looked over this thread and many more, but I’m just not getting it.

In NGUI, this is what I use to get all game objects under the mouse:

int layer = 1 << 15;
Ray myray = UICamera.currentCamera.ScreenPointToRay(Input.mousePosition);
RaycastHit[] hits = Physics.RaycastAll(myray, 1000.0f, layer);

Tim, I see the links to the two scripts and the sample above, but I’m not sure how to actually implement these. Are the two scripts functionality that is coming in a future version of 4.6?


It is already in 4.6. It’s the source code of the TouchInputModule and StandaloneInputModule that are found on the “EventSystem” game object that gets created when you first add uGUI to your scene. Those classes, coupled with the EventSystem script, are responsible for the event handling of uGUI.


Okay, thanks! I almost have everything set up, but I’m getting an error. Here is my code. When I call RaycastMouse from another class, the debugger throws a System.NullReferenceException when it hits the line with GetMousePointerEventData. (I’ve not added a return value yet.)

internal class MyPointerInputModule : PointerInputModule
{
    public override void Process()
    {
    }
  
    public void RaycastMouse()
    {
        PointerEventData ped = GetMousePointerEventData(); // I get a System.NullReferenceException here
      
        eventSystem.RaycastAll(ped, m_RaycastResultCache);
        List<RaycastResult> results = m_RaycastResultCache;
    }
}

Well you would be getting the error somewhere inside that function… Do you have a stack trace?

By the way, I created a very simple C# script that tests for mouse-over using the event system. It works great on my Mac, but when I export to an iOS device it doesn’t; seems like a bug somewhere in the software.

Here is the script:

using UnityEngine;
using System.Collections;
using UnityEngine.EventSystems;

public class touchTest : MonoBehaviour {

    public int overUI = 0;

    public int ReturnUIState () {
        return overUI;
    }

    void Update () {
        if (EventSystemManager.currentSystem.IsPointerOverEventSystemObject ()) {
            // we're over a UI element... return 1
            overUI = 1;
        }
        else {
            overUI = 0;
        }
    }
}

Ah! That’s because the function takes an id for which pointer: 1 is the first finger, 2 is the second finger, etc. By default, if you don’t specify an argument, it uses the mouse left-click id (-1).
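For example, a hedged sketch using the beta-era API from the script above: on a touch device you would pass each touch’s fingerId explicitly rather than relying on the mouse default.

```csharp
void Update ()
{
    for (int i = 0; i < Input.touchCount; i++)
    {
        Touch touch = Input.GetTouch (i);
        // Pass the finger id explicitly instead of the default mouse id (-1).
        if (EventSystemManager.currentSystem.IsPointerOverEventSystemObject (touch.fingerId))
        {
            Debug.Log ("Finger " + touch.fingerId + " is over a UI element");
        }
    }
}
```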


How do you accurately determine a click event on a non-rectangular button?

That works! Thanks so much.

That code appears to be obsolete, or am I doing something wrong?

Hi,

I need to do the same: a simple raycast to find the RectTransform under a given screen position.

Does anyone have a simple code to do this?

I am horrified to see that such a simple thing is so complicated to do. I have more and more questions about the new GUI…

I hope the documentation will be more explicit in the official release, with examples…
