an equivalent of OnMouseUp/Down for Touch?

This is something I have never seen; I even checked the new docs, so maybe I'm missing something.

I would have thought there would be functions called OnTouchDown and OnTouchUp for, well, touch control, working the same way mouse control does. Is there an equivalent of OnMouseDown/Up for touches?

thanks for your help

To make this clearer, in a dummy-function sense:

void OnTouchDown(){
   // do code stuff when a finger has touched down on the screen
}

void OnTouchUp(){
   // do other touch code when pulling a finger off of the touchscreen
}

void OnTouchDrag(){
   // do code stuff when you drag your finger; could even be OnTouchMove instead
}

thanks


AFAIK OnMouseDown detects the first touch on mobile devices.
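As a quick illustration (the class name here is made up, and the object needs a Collider for OnMouse events to fire):

```csharp
using UnityEngine;

// On mobile, the first touch is reported through the same path as the
// left mouse button, so OnMouseDown fires on a tap as well.
public class TapProbe : MonoBehaviour
{
    private void OnMouseDown()
    {
        Debug.Log("Pressed: mouse click or first touch on " + gameObject.name);
    }
}
```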

The TouchPhase might interest you. I tend to make my own touch manager that reads Input.touches, then checks the phase of each touch or does any gesture detection I want, then has this manager fire off events that other objects can subscribe to.

It's a rather simple approach, but I like working with C# events for these kinds of systems.
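A rough sketch of that pattern with the old Input Manager (the class name is illustrative, not from a real project):

```csharp
using UnityEngine;

// Reads the raw touches each frame and branches on their phase.
public class TouchPhaseProbe : MonoBehaviour
{
    private void Update()
    {
        foreach (Touch touch in Input.touches)
        {
            switch (touch.phase)
            {
                case TouchPhase.Began:
                    Debug.Log($"Touch {touch.fingerId} began at {touch.position}");
                    break;
                case TouchPhase.Moved:
                    Debug.Log($"Touch {touch.fingerId} moved by {touch.deltaPosition}");
                    break;
                case TouchPhase.Ended:
                case TouchPhase.Canceled:
                    Debug.Log($"Touch {touch.fingerId} ended");
                    break;
            }
        }
    }
}
```

Gesture detection or event firing would hang off those branches.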

Oooh, thanks a bunch guys, I'll have a look-see. I forgot about TouchPhase. But I really would have thought that Unity would have something like OnTouchUp/Down/Drag etc. It seems to make a bit of sense in my mind, but probably not in everyone else's :wink:

thanks again


When I get home I could show you how to make a manager script that would allow you to subscribe any object you want to OnTouchDown and OnTouchUp.

Though you will have to decide how to handle multiple touches.

That would be more than fantastic.

Touch stuff is just not my first thing, really. Without something like, say, OnTouchUp/Down etc., it's a bit like walking in a forest blindfolded.

But that would be very nice. Thank you.

Ya, I get home to my main computer in a few hours and can just pull a sample from one of my projects.

It would help to know the purpose: UI work, or are you trying to do something like raycasting into the world whenever there is a touch event?


Hey mate, it's a little quick and dirty, but you could try this.

Put this code on an object in your scene, and make sure there is only one instance of it.

using UnityEngine;
using System.Collections;

public class TouchManager : MonoBehaviour
{
    #region SingletonStuff
    private static TouchManager _instance;
    public static TouchManager Instance
    {
        get
        {
            if (_instance == null)
                _instance = GameObject.FindObjectOfType<TouchManager>();
            return _instance;
        }
    }
    #endregion

    public delegate void TouchDelegate(Touch eventData);
    public static event TouchDelegate OnTouchDown;
    public static event TouchDelegate OnTouchUp;
    public static event TouchDelegate OnTouchDrag;

    private void Update()
    {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Began)
            {
                if (OnTouchDown != null)
                    OnTouchDown(touch);
            }
            else if (touch.phase == TouchPhase.Ended || touch.phase == TouchPhase.Canceled)
            {
                if (OnTouchUp != null)
                    OnTouchUp(touch);
            }
            else if (touch.phase == TouchPhase.Moved)
            {
                if (OnTouchDrag != null)
                    OnTouchDrag(touch);
            }
        }
    }
}

Then, in the class where you want to implement your OnTouchDown, OnTouchUp and OnTouchDrag methods, simply subscribe to the events in your Start, Awake or OnEnable method with a method that fits the delegate signature.

using UnityEngine;
using System.Collections;

public class ReadTouch : MonoBehaviour
{
    private void OnEnable()
    {
        // Subscribe to events when the object is enabled
        TouchManager.OnTouchDown += OnTouchDown;
        TouchManager.OnTouchUp += OnTouchUp;
        TouchManager.OnTouchDrag += OnTouchDrag;
    }

    private void OnDisable()
    {
        // Unsubscribe from events when the object is disabled
        TouchManager.OnTouchDown -= OnTouchDown;
        TouchManager.OnTouchUp -= OnTouchUp;
        TouchManager.OnTouchDrag -= OnTouchDrag;
    }

    private void OnTouchDown(Touch eventData)
    {
        Debug.Log("OnTouchDown!");
    }

    private void OnTouchUp(Touch eventData)
    {
        Debug.Log("OnTouchUp!");
    }

    private void OnTouchDrag(Touch eventData)
    {
        Debug.Log("OnTouchDrag");
    }
}

This should do the job for you. I added in the Touch eventData so you also get access to information about the touch in the event callbacks.

Like I mentioned before, if you are doing multi-touch you will have to decide how to handle extra touches, such as just making the manager loop over all touches and fire events for each of them, or putting some code in there to handle gestures.
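For the loop-all-touches option, the manager's Update could look roughly like this (an untested sketch; `?.Invoke` is just shorthand for the null checks in the original):

```csharp
// Hypothetical multi-touch variant of TouchManager.Update:
// fires the events once per active touch instead of only for touch 0.
private void Update()
{
    for (int i = 0; i < Input.touchCount; i++)
    {
        Touch touch = Input.GetTouch(i);
        if (touch.phase == TouchPhase.Began)
            OnTouchDown?.Invoke(touch);
        else if (touch.phase == TouchPhase.Ended || touch.phase == TouchPhase.Canceled)
            OnTouchUp?.Invoke(touch);
        else if (touch.phase == TouchPhase.Moved)
            OnTouchDrag?.Invoke(touch);
    }
}
```

Subscribers can then use touch.fingerId to tell individual touches apart.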


You saved my game, MAN!!!

Thanks passerbycmc for your wonderful code. OnTouchDown etc… didn't work for me, but your code did.
Love you and your code.
You saved me and my game.

Nice necro of an old thread.

You can implement interfaces that handle these kinds of events.
IPointerDownHandler, IPointerUpHandler, IPointerClickHandler, IDragHandler, IDropHandler

This makes sense. I saw a tutorial on interfaces, but I'm kind of lost.
Can you please give an example of, say, IPointerDownHandler to get an idea?

Unity kind of forgot the examples in the new documentation. Only IPointerClickHandler has an example there.
https://docs.unity3d.com/Packages/com.unity.ugui@1.0/api/UnityEngine.EventSystems.IPointerClickHandler.html

Older documentation gives the examples as well:
https://docs.unity3d.com/2019.1/Documentation/ScriptReference/EventSystems.IPointerDownHandler.html
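A minimal sketch along the lines of those docs (it assumes an EventSystem in the scene; for a non-UI object you also need a Collider and a Physics Raycaster on the camera):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Attach to a uGUI element, or to a 3D object with a Collider
// (the latter case needs a Physics Raycaster on the camera).
public class PointerExample : MonoBehaviour, IPointerDownHandler, IPointerUpHandler
{
    public void OnPointerDown(PointerEventData eventData)
    {
        Debug.Log("Pointer down at " + eventData.position);
    }

    public void OnPointerUp(PointerEventData eventData)
    {
        Debug.Log("Pointer up");
    }
}
```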

You can also use Unity Learn and find information about Interfaces there are a lot of C# scripting tutorials around.

Please help me!!
I want to display the name of the touched object in the console. I have a dog, cat, sheep, etc., and if I touch an animal, its name should be displayed in the console.
I provided a unique tag for each animal. I attached the TouchManager script to these animal objects and the ReadTouch script to the LeapHandController. I initially tried with one animal, but the touch is not recognized and nothing got printed to the console. I do not know what to do next. Please help me…
I am using Leap motion in unity 5.6.
Thanks.

If you wish to interact with objects within Unity using Leap motion hand tracking I recommend importing the Unity Modules, specifically the Interaction Engine.

The Interaction Engine does some or all of the work for you to allow users to be able to hover near, touch, or grasp objects in your application in some way, the documentation can be found here and the Unity Modules downloaded from here.

If you are using an older version of Unity, such as 5.6, you can find a list of previous Unity Asset releases here.

Thanks for the reply.
Will try with Interaction Engine

Ugh. This looked promising and it’s still hard to believe that coding an app for mouse/touch doesn’t just ‘work’. Well, using OnMouseDrag does work for touch, but generates errors and warnings about being deprecated or hurting performance.

The latest documentation does not seem to have IPointerClickHandler, at least as far as I can find:

https://docs.unity3d.com/2020.2/Documentation/ScriptReference/UIElements.IPointerEvent.html

I think I will just implement the TouchManager from above, and set it up to use the OnMouse functions in the editor and the TouchManager on the device. Such a common need; why is this so hard for Unity to implement properly?

The documentation you’re referring to is from “UI Elements” which has already been renamed to UI Toolkit.
And UI Toolkit is a whole different package that uses its own Input System AFAIK.

It completely depends on what your requirements are, but IPointerDownHandler, IPointerUpHandler and IPointerClickHandler are easy to implement. The documentation has been reworked since 2019.2: uGUI became a package, and the docs now refer to the package documentation. It isn't as extensive as the old documentation, but the old documentation is still valid, as not much has changed.

If you need these events on a 3D mesh game object you’ll require a Physics Raycaster on your Camera.
If you need them on a UI game object made with uGUI, they should work already.
Don’t forget to have an event system in your scene (mesh or not). “GameObject/UI/Event System” menu
I agree that the documentation isn't clear, but it isn't that hard to implement these events once you know how.

Thanks! This got me thinking in the right direction and I ultimately solved my problem.

Boy howdy, though! The documentation on everything is just so confusing to sort through. This is for an AR project, so I thought surely the AR Foundation Samples would have an example implementation using a similar method. They do: the InputSystem_PlaceOnPlane.cs script uses the ARRaycastManager that's already attached to the camera in my scene. Great!

Then, after dragging the script into my project, I remembered why I didn't already use that example: it uses the new Input System, and upgrading an existing project is painful. I was already using the UI Toolkit, so I decided to take the plunge and update the Input system.

Not only do you have to update all your events, you need to create profiles and settings, add an Input manager, double-click various settings and set up the events. It's quite an elaborate number of steps, and I have no idea where to find the documentation, i.e. UI Elements, Unity UI packages, etc. The best I could do was to compare the AR Foundation example scene and just hook everything up as they did.

Painful, but it’s all working now and I finally got rid of my OnMouse methods!