This is something I have never seen; I even checked the new docs, so maybe I'm missing something.
I would have thought there would be functions called OnTouchDown and OnTouchUp for touch control, working the same way mouse control does. Is there an equivalent of OnMouseDown/OnMouseUp for touches?
void OnTouchDown()
{
    // do code stuff when a finger has touched down on the screen
}

void OnTouchUp()
{
    // do other touch code when pulling the finger off of the touchscreen
}

void OnTouchDrag()
{
    // do code stuff when you drag your finger; could even be OnTouchMove instead
}
TouchPhase might interest you. I tend to make my own touch manager that reads Input.touches, then checks the phase of each touch or does whatever gesture detection I want; the manager then fires off events that other objects can subscribe to.
It's a rather simple approach, but I like working with C# events for these kinds of systems.
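A minimal sketch of that idea, using the legacy Input API (the class name is just for illustration):

using UnityEngine;

public class SimpleTouchReader : MonoBehaviour
{
    private void Update()
    {
        // Walk every active touch and branch on its current phase.
        foreach (Touch touch in Input.touches)
        {
            switch (touch.phase)
            {
                case TouchPhase.Began:
                    Debug.Log("Touch began at " + touch.position);
                    break;
                case TouchPhase.Moved:
                    Debug.Log("Touch moved by " + touch.deltaPosition);
                    break;
                case TouchPhase.Ended:
                case TouchPhase.Canceled:
                    Debug.Log("Touch ended");
                    break;
            }
        }
    }
}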
Oooh, thanks a bunch, guys, I'll have a look. I forgot about TouchPhase. I really would have thought Unity would have something like OnTouchUp/Down/Drag, though. It seems to make sense in my mind, but probably not in everyone else's.
Hey mate, this is a little quick and dirty, but you could try it.
Put this code on an object in your scene, and make sure there is only one instance of it.
using UnityEngine;
using System.Collections;

public class TouchManager : MonoBehaviour
{
    #region SingletonStuff
    private static TouchManager _instance;
    public static TouchManager Instance
    {
        get
        {
            if (_instance == null)
                _instance = GameObject.FindObjectOfType<TouchManager>();
            return _instance;
        }
    }
    #endregion

    // Events other objects can subscribe to; each passes along the Touch data.
    public delegate void TouchDelegate(Touch eventData);
    public static event TouchDelegate OnTouchDown;
    public static event TouchDelegate OnTouchUp;
    public static event TouchDelegate OnTouchDrag;

    private void Update()
    {
        if (Input.touchCount > 0)
        {
            // Only the first touch is handled here; see the multi-touch note below.
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Began)
            {
                if (OnTouchDown != null)
                    OnTouchDown(touch);
            }
            else if (touch.phase == TouchPhase.Ended || touch.phase == TouchPhase.Canceled)
            {
                if (OnTouchUp != null)
                    OnTouchUp(touch);
            }
            else if (touch.phase == TouchPhase.Moved)
            {
                if (OnTouchDrag != null)
                    OnTouchDrag(touch);
            }
        }
    }
}
Then, in the class where you want to implement your OnTouchDown, OnTouchUp, and OnTouchDrag methods, simply subscribe to the events in your Start, Awake, or OnEnable methods with a method that fits the delegate signature.
using UnityEngine;
using System.Collections;

public class ReadTouch : MonoBehaviour
{
    private void OnEnable()
    {
        // Subscribe to the events when the object is enabled
        TouchManager.OnTouchDown += OnTouchDown;
        TouchManager.OnTouchUp += OnTouchUp;
        TouchManager.OnTouchDrag += OnTouchDrag;
    }

    private void OnDisable()
    {
        // Unsubscribe from the events when the object is disabled
        TouchManager.OnTouchDown -= OnTouchDown;
        TouchManager.OnTouchUp -= OnTouchUp;
        TouchManager.OnTouchDrag -= OnTouchDrag;
    }

    private void OnTouchDown(Touch eventData)
    {
        Debug.Log("OnTouchDown!");
    }

    private void OnTouchUp(Touch eventData)
    {
        Debug.Log("OnTouchUp!");
    }

    private void OnTouchDrag(Touch eventData)
    {
        Debug.Log("OnTouchDrag");
    }
}
This should do the job for you, and I added in the Touch eventData so you also get access to information about the touch in the event callbacks.
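For example, the drag callback in ReadTouch could read the position and movement delta off the Touch it receives, rather than just logging:

private void OnTouchDrag(Touch eventData)
{
    // The Touch struct carries the screen position, frame delta, fingerId, etc.
    Debug.Log("Dragging at " + eventData.position + ", moved " + eventData.deltaPosition);
}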
Like I mentioned before, if you are doing multi-touch you would have to decide how to handle the extra touches, such as making the manager loop over all touches and fire events for each of them, or putting some code in there to handle gestures.
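A rough sketch of the loop-over-all-touches option, reusing the same events as the manager above:

private void Update()
{
    // Fire the events once per active touch instead of only for touch 0.
    for (int i = 0; i < Input.touchCount; i++)
    {
        Touch touch = Input.GetTouch(i);
        if (touch.phase == TouchPhase.Began)
        {
            if (OnTouchDown != null)
                OnTouchDown(touch);
        }
        else if (touch.phase == TouchPhase.Ended || touch.phase == TouchPhase.Canceled)
        {
            if (OnTouchUp != null)
                OnTouchUp(touch);
        }
        else if (touch.phase == TouchPhase.Moved)
        {
            if (OnTouchDrag != null)
                OnTouchDrag(touch);
        }
    }
}

Subscribers can then use eventData.fingerId to tell the individual touches apart.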
Thanks, passerbycmc, for your wonderful code. OnTouchDown etc. didn't work for me, but your code did.
Love you and your code.
You saved me and my game.
You can implement interfaces that handle these kinds of events.
IPointerDownHandler, IPointerUpHandler, IPointerClickHandler, IDragHandler, IDropHandler
Please help me!!
I want to display the name of the touched object in the console. I have a dog, cat, sheep, etc., and if I touch an animal, its name should be displayed in the console.
I provided a unique tag for each animal. I attached the TouchManager script to these animal objects and the ReadTouch script to the LeapHandController. I initially tried with one animal, but the touch is not recognized and nothing got printed to the console. I do not know what to do next. Please help me…
I am using Leap Motion in Unity 5.6.
Thanks.
If you wish to interact with objects within Unity using Leap Motion hand tracking, I recommend importing the Unity Modules, specifically the Interaction Engine.
The Interaction Engine does some or all of the work needed to let users hover near, touch, or grasp objects in your application; the documentation can be found here and the Unity Modules can be downloaded from here.
If you are using an older version of Unity, such as 5.6, you can find a list of previous Unity Asset releases here.
Ugh. This looked promising, and it's still hard to believe that coding an app for mouse/touch doesn't just "work". Using OnMouseDrag does work for touch, but it generates errors and warnings about being deprecated or hurting performance.
The latest documentation does not seem to have IPointerClickHandler, at least not that I can find.
I think I will just implement the TouchManager from above and set it up to use the OnMouse functions in the editor and the TouchManager on the device. Such a common need; why is this so hard for Unity to implement properly?
The documentation you're referring to is from "UI Elements", which has already been renamed to UI Toolkit.
And UI Toolkit is a whole different package that uses its own input system, AFAIK.
It completely depends on what your requirements are, but IPointerDownHandler, IPointerUpHandler, and IPointerClickHandler are easy to implement. The documentation has been reworked since 2019.2, because uGUI became a package, and the manual now refers to the package documentation. It ain't as extensive as the old documentation, but the old documentation is still valid, as not much has changed.
If you need these events on a 3D mesh game object, you'll need a Physics Raycaster on your camera.
If you need them on a UI game object made with uGUI, they should work already.
Don't forget to have an Event System in your scene (mesh or not); see the "GameObject/UI/Event System" menu.
I agree that the documentation isn't clear, but these events ain't that hard to implement once you know how.
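For instance, here is a minimal sketch (the class name is just illustrative), assuming a Physics Raycaster on the camera and an Event System in the scene:

using UnityEngine;
using UnityEngine.EventSystems;

// Attach to a 3D object that has a Collider. Requires a PhysicsRaycaster on
// the camera and an EventSystem in the scene for the events to fire.
public class PointerReceiver : MonoBehaviour, IPointerDownHandler, IPointerUpHandler
{
    public void OnPointerDown(PointerEventData eventData)
    {
        Debug.Log("Pointer down on " + gameObject.name);
    }

    public void OnPointerUp(PointerEventData eventData)
    {
        Debug.Log("Pointer up on " + gameObject.name);
    }
}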
Thanks! This got me thinking in the right direction and I ultimately solved my problem.
Boy howdy, though! The documentation on everything is just so confusing to sort through. This is for an AR project, so I thought surely the AR Foundation Samples would have an example implementation using a similar method. They do: the InputSystem_PlaceOnPlane.cs script, which uses the ARRaycastManager that's already attached to the camera in my scene. Great!
Then, after dragging the script into my project, I remembered why I didn't already use that example: it uses the new Input System, and upgrading an existing project is painful. I was already using the UI Toolkit, so I decided to take the plunge and update the Input System.
Not only do you have to update all your events, you also need to create profiles and settings, add an Input manager, double-click various settings, and set up the events. It's quite an elaborate number of steps, and I have no idea where to find the documentation, i.e. UI Elements, Unity UI packages, etc. The best I could do was compare against the AR Foundation example scene and hook everything up as they did.
Painful, but it's all working now and I finally got rid of my OnMouse methods!