Unity Remote returns OnMouseDown as well as Touch

I built my own GUI for my iOS game and I’m attempting to support both touch and mouse input so I don’t have to change anything when I switch platforms (ease of testing, sending web builds with working input, etc.). My buttons have box colliders and define OnMouseDown, so mouse input works; when I get touch input, I cast a ray from the GUI camera, and if it hits a button collider, I send that button a message that it has been pressed. It works great, but I’m running into an issue with Unity Remote: when I touch the screen on my device, I get BOTH a touch AND OnMouseDown, so I end up handling the input twice. I want to use Unity Remote to test touch input, and the best workaround I can think of is this:

void OnMouseDown()
{
    if (Input.touchCount > 0)
        return;

    ButtonDown();
}
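
For context, the touch-side handling described above can be sketched roughly like this. This is a minimal sketch, not the poster's actual code; the `guiCamera` field and the `ButtonDown` message name are assumptions taken from the description:

```csharp
using UnityEngine;

public class TouchButtonInput : MonoBehaviour
{
    public Camera guiCamera; // assumed: the separate GUI camera mentioned above

    void Update()
    {
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch touch = Input.GetTouch(i);
            if (touch.phase != TouchPhase.Began)
                continue;

            // Cast a ray from the GUI camera through the touch point;
            // if it hits a button collider, tell the button it was pressed.
            Ray ray = guiCamera.ScreenPointToRay(touch.position);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
                hit.collider.SendMessage("ButtonDown", SendMessageOptions.DontRequireReceiver);
        }
    }
}
```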

I was wondering if anyone out there has a more elegant way of handling this situation? Thanks!

I find that this approach doesn’t work for OnMouseUp though, presumably because by the time OnMouseUp fires the touch has already ended, so Input.touchCount is back to 0 and the guard never triggers.

That is actually normal and by design: on iOS/Android, touch 0 fires the mouse events along with it.

The easy and clean way to handle it is to use platform-dependent compilation (see the manual) and implement the two handlers separately: completely cut OnMouseDown() on iOS/Android, and cut the function/code that processes the touches on Windows/OS X standalone and the webplayer :slight_smile:

That means I either have to use Unity remote all the time or not at all until I’m ready to do iPhone controls exclusively though, which in my opinion, isn’t optimal. I don’t want to have to fire up Unity Remote all the time. Sometimes I just want to test other functionality with the mouse.

#if UNITY_WEBPLAYER
void OnMouseDown()
{
    ...
}

void OnMouseUp()
{
    ...
}
#endif

#if UNITY_IPHONE
void Update()
{
    foreach (Touch touch in Input.touches)
        ...
}
#endif
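
Put together in one script, the platform split might look like the sketch below. This is only an illustration of the compilation-directive approach; `ButtonDown`, `ButtonUp`, and the raycast details are assumptions, and UNITY_IPHONE is only defined in the editor when the build target is set to iOS:

```csharp
using UnityEngine;

public class PlatformButton : MonoBehaviour
{
#if UNITY_STANDALONE || UNITY_WEBPLAYER
    // Desktop/webplayer builds: rely on the mouse events only.
    void OnMouseDown() { ButtonDown(); }
    void OnMouseUp()   { ButtonUp(); }
#endif

#if UNITY_IPHONE || UNITY_ANDROID
    // Device builds: process touches directly; the mouse handlers above
    // are compiled out, so each press is handled exactly once.
    void Update()
    {
        foreach (Touch touch in Input.touches)
        {
            if (touch.phase == TouchPhase.Began)
                TryHit(touch.position);
        }
    }

    void TryHit(Vector2 screenPos)
    {
        RaycastHit hit;
        if (Physics.Raycast(Camera.main.ScreenPointToRay(screenPos), out hit)
            && hit.collider == GetComponent<Collider>())
            ButtonDown();
    }
#endif

    void ButtonDown() { /* press logic */ }
    void ButtonUp()   { /* release logic */ }
}
```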

I’m not entirely sure why Unity Remote has to return both. From the OnMouseDown documentation: “IMPORTANT: This function has no effect on iPhone.” So why should Unity Remote for an iOS device be different?

I just hit this very same problem and was very surprised to say the least. It should surely only deliver one set of inputs rather than two. I wrote my app with multitap support and tested it in the editor using a mapping of the mouse input to my control system. This control system also takes input from touches and so I set it up to have index 0 as the mouse and 1…n for the fingers. I then run it through the editor with Unity Remote and I get what is effectively three fingers - the two fingers and a third one midpoint between them. I’ve never heard of any input system sending two different events for the same action. This really is a bug or at least must be switchable.
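
One way to guard a unified mapping like the one described above is to ignore the mouse entirely whenever real touches are present. This is a sketch under stated assumptions: `controls` and its `SetPointer(index, position)` method are hypothetical stand-ins for the poster's control system:

```csharp
void Update()
{
    // Feed real touches into slots 1..n, as in the mapping described above.
    for (int i = 0; i < Input.touchCount; i++)
        controls.SetPointer(i + 1, Input.GetTouch(i).position);

    // Only treat the mouse as pointer 0 when no touches are active;
    // under Unity Remote the "mouse" position is a phantom midpoint
    // of the active touches, not a real finger.
    if (Input.touchCount == 0 && Input.GetMouseButton(0))
        controls.SetPointer(0, (Vector2)Input.mousePosition);
}
```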