Unity Remote and TouchPhase.Ended

Hi,
It looks like Unity Remote ignores TouchPhase.Ended. Or am I wrong?

Unity Remote is not very accurate when it comes to detecting input in general.
You can get better results by turning off “show image” at the bottom of the remote's start screen.

Honestly, even builds running on the actual device will drop TouchPhase.Ended pretty often. In my last game I was forced to write my own touch-phase tracking system to get something 100% reliable.

Thanks.

Now I have noticed it too. I am making simple buttons for a racing game. Is there a different way to detect the Ended phase?

Keep track of a boolean for determining the state of your button. If input is detected on the button, flip that variable to true. If input is NOT detected on your button but the variable is still true, then you know the finger has been released, so flip it back to false and do whatever TouchPhase.Ended work you would have done.
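Here's a minimal sketch of that idea, assuming a rectangular screen-space button; the names buttonRect and OnButtonReleased are mine, purely for illustration:

```csharp
using UnityEngine;

public class TouchButton : MonoBehaviour
{
    // Hypothetical button area in screen coordinates (origin bottom-left,
    // matching Touch.position).
    public Rect buttonRect = new Rect(20, 20, 120, 120);

    bool pressed; // was the button being touched last frame?

    void Update()
    {
        bool touchedNow = false;
        for (int i = 0; i < Input.touchCount; i++)
        {
            if (buttonRect.Contains(Input.GetTouch(i).position))
            {
                touchedNow = true;
                break;
            }
        }

        if (touchedNow && !pressed)
        {
            pressed = true; // finger just landed on the button
        }
        else if (!touchedNow && pressed)
        {
            // Finger released: do whatever TouchPhase.Ended work you need.
            pressed = false;
            OnButtonReleased();
        }
    }

    void OnButtonReleased()
    {
        Debug.Log("Button released");
    }
}
```

Because the release is derived from the button's own state each frame, it fires even if Unity never delivers an explicit Ended phase.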

The same idea can be expanded to handle all kinds of states. In Battleheart, I used an array of a custom “TouchState” class that kept track of the starting position, whether the touch was tapped or dragged, which fingerId it belonged to, and so on. By linking your input to bools or ints that you're changing in an Update() loop, you ensure that nothing is ever “skipped” because of a framerate hiccup or something. If there's any lag or hiccups, your code won't break, because it's in sync with the refresh of the screen.
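That's not the actual Battleheart code, but a per-finger tracker along those lines might look roughly like this (TouchTracker, SlotFor, and DragThreshold are my names):

```csharp
using UnityEngine;

// A guess at what such a per-finger record might contain.
public class TouchState
{
    public int fingerId = -1;     // which finger this slot belongs to (-1 = free)
    public Vector2 startPosition; // where the touch began
    public bool dragged;          // moved past a threshold since it began?
    public bool active;           // refreshed by Unity this frame?
}

public class TouchTracker : MonoBehaviour
{
    const float DragThreshold = 20f; // pixels before a tap counts as a drag
    readonly TouchState[] states = new TouchState[5];

    void Awake()
    {
        for (int i = 0; i < states.Length; i++)
            states[i] = new TouchState();
    }

    void Update()
    {
        // Mark every slot stale, then refresh the ones Unity still reports.
        foreach (TouchState s in states)
            s.active = false;

        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch t = Input.GetTouch(i);
            TouchState s = SlotFor(t.fingerId);
            if (s == null) continue; // more fingers than slots

            if (s.fingerId != t.fingerId) // a new touch claims a free slot
            {
                s.fingerId = t.fingerId;
                s.startPosition = t.position;
                s.dragged = false;
            }
            if ((t.position - s.startPosition).sqrMagnitude > DragThreshold * DragThreshold)
                s.dragged = true;
            s.active = true;
        }

        // Any slot that was tracking a finger but wasn't refreshed has ended,
        // whether or not a TouchPhase.Ended event was ever delivered.
        foreach (TouchState s in states)
        {
            if (!s.active && s.fingerId != -1)
            {
                Debug.Log("Finger " + s.fingerId + (s.dragged ? " drag ended" : " tapped"));
                s.fingerId = -1;
            }
        }
    }

    TouchState SlotFor(int fingerId)
    {
        TouchState free = null;
        foreach (TouchState s in states)
        {
            if (s.fingerId == fingerId) return s; // already tracking this finger
            if (s.fingerId == -1 && free == null) free = s;
        }
        return free; // claim a free slot, or null if none left
    }
}
```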

This doesn't happen for me in builds, only in Unity Remote.

I know this is an old topic, but I'm having trouble with this; even in the build, TouchPhase.Ended isn't working properly.

How can I detect this? I have no idea. :s

Can anybody help me?

Sorry for my bad English.

Edit: Oh, I got it! Input.touchCount == 0. Thanks anyway.

http://unity3d.com/support/documentation/ScriptReference/Input-touchCount.html
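For anyone else landing here, a minimal sketch of that approach: rather than waiting for an Ended phase, treat the frame where Input.touchCount drops back to zero as the release.

```csharp
using UnityEngine;

public class ReleaseDetector : MonoBehaviour
{
    bool wasTouching; // did we see at least one touch last frame?

    void Update()
    {
        bool touching = Input.touchCount > 0;

        // The release is the frame where touchCount falls back to zero,
        // regardless of whether an Ended phase was ever reported.
        if (wasTouching && !touching)
            Debug.Log("All touches released");

        wasTouching = touching;
    }
}
```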

I had the same problem until I realized my input code was placed in FixedUpdate instead of Update by mistake.

Placing the code in Update seems to have cured the problem of dropped TouchPhase.Ended events (so far).

Yes, FixedUpdate isn’t the place for testing touch events.
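To illustrate why: FixedUpdate can run zero or several times per rendered frame, while touch data is refreshed once per frame, so phases can be missed or processed twice there. A sketch of the safe placement:

```csharp
using UnityEngine;

public class TouchPolling : MonoBehaviour
{
    void Update()
    {
        // Touch data is refreshed once per rendered frame, so poll it here.
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch t = Input.GetTouch(i);
            if (t.phase == TouchPhase.Ended)
                Debug.Log("Touch " + t.fingerId + " ended");
        }
    }

    void FixedUpdate()
    {
        // Don't read touch phases here: this may run zero times in a frame
        // (the phase is missed) or several times (it's handled repeatedly).
    }
}
```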

I find that Unity is reliable for touch events when multiTouchEnabled is set to false, but unreliable when set to true.
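If your game only needs one finger anyway, disabling multitouch is a one-liner:

```csharp
using UnityEngine;

public class SingleTouchOnly : MonoBehaviour
{
    void Awake()
    {
        // From here on, only the first finger is reported.
        Input.multiTouchEnabled = false;
    }
}
```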

Now I have a variable that records whether there was a touch last frame. Since it's updated last, I can check the current touch state against it, and then I know whether touches have started or ended.

A little thread-necro here but I suspect that someone (such as myself literally just now) will have this exact same issue, and I have a bit more to add.

I too have been quite successful with TouchPhase.Ended, having no problem with the event being recognized…until today. I scratched my head, searched Google, and ultimately ended up here.

I was about to fire off some replies asking for help when I decided to do a quick check of a boolean I already had in place, as per Mika's advice. Sure enough, it was reporting just fine, even though the function I was calling in another script was misbehaving.

In other words, a quick Debug.Log(<boolean>) in the Ended phase reported back exactly what it should, yet the function I was firing off was still malfunctioning. A quick tap didn't work, but touching the screen for a bit and sliding my finger around worked perfectly every time.

It turned out that I had a while/yield nestled inside an if (busyDoing) return guard that was still “busy doing” when TouchPhase.Ended fired. My code did recognize that I had stopped touching the screen, but the coroutine was still doing its thing, ignoring the release, and that was screwing everything up.
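Reconstructed from memory (not my actual code), the shape of the bug was roughly this, with DoThing standing in for the coroutine work:

```csharp
using System.Collections;
using UnityEngine;

public class BusyGuardPitfall : MonoBehaviour
{
    bool busyDoing;

    void Update()
    {
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch t = Input.GetTouch(i);
            if (t.phase == TouchPhase.Ended)
            {
                // The guard that ate my releases: a quick tap that ended
                // while the coroutine was still running was silently dropped.
                if (busyDoing) return;
                StartCoroutine(DoThing());
            }
        }
    }

    IEnumerator DoThing()
    {
        busyDoing = true;
        yield return new WaitForSeconds(0.5f); // stand-in for the while/yield work
        busyDoing = false;
    }
}
```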

I nuked the busyDoing check and, voilà, everything is good again.

Obviously your mileage will vary, and this particular problem may not be yours, but double-checking the events that occur (or are supposed to occur) after TouchPhase.Ended could save you some headaches.

Cheers