Testing touch function on PC instead of on phone

I have a problem. I'm not able to connect my phone to my PC, run the game in the editor, and test it live on the phone, even though some people can do this.
I have to build the game, install it on my phone, and run it there. I'm currently working with Input.touches, and I wonder: is there a way to make the mouse act as a finger? That way I could "touch" the screen with the mouse and test my game in the editor, which is much quicker.

There's no obvious way to do it short of plugins; however, it may help that the mapping works the other way around: Input.GetMouseButtonDown(0), Input.GetMouseButton(0), and Input.GetMouseButtonUp(0) also register taps on a device.

I tend to write my touch scripts with a mouse version afterwards in order to test stuff. Sorry that it's not the answer you'd like to hear. :frowning:
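A minimal sketch of that pattern, assuming standard Unity APIs (the helper and its names are my own invention, not an official API): compile the mouse path in the editor and standalone builds, and the touch path everywhere else, so the same tap logic runs in both places.

```csharp
using UnityEngine;

// Hypothetical helper that lets the same tap logic run from the mouse
// in the editor and from Input.touches on a device.
public static class TapInput
{
    // True on the frame a tap (or click) begins; outputs its screen position.
    public static bool TapBegan(out Vector2 position)
    {
#if UNITY_EDITOR || UNITY_STANDALONE
        // Editor / PC: treat the left mouse button as a single finger.
        if (Input.GetMouseButtonDown(0))
        {
            position = Input.mousePosition;
            return true;
        }
#else
        // Device: use the first touch.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            position = Input.GetTouch(0).position;
            return true;
        }
#endif
        position = Vector2.zero;
        return false;
    }
}
```

Then in any Update() you can write `if (TapInput.TapBegan(out Vector2 pos)) { ... }` and test it in play mode. Note the obvious limitation: a mouse can only simulate one finger, so multi-touch logic still has to be tested on the device.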

Can you "Build and Run" directly from Unity?
If not, make sure your device has the correct driver in Device Manager
(assuming you're on Android + Windows). After that you can use the Unity Remote app on your device, but it's not very good, and the best option is always to test the app on a real device.
For simulating screen touches with the mouse I use the EasyTouch plugin; it's not free, but it works really well.

I've found there are enough differences between device and PC that you have to test a real build on the device anyway. For example: accounting for the finger being fatter than the mouse pointer and hiding more of the screen; accidental second fingers; and the general "feel" of an often slower framerate.

You can do builds from Unity and have them run, but it takes a minute or more (depending on the current project size) to compile, copy to your test phone, display the Unity logo, and actually start playing. Getting complex touch interactivity right involves hundreds of these tests. That is a lot of time staring at a progress bar.

PC developers would definitely use touch displays to develop the touch components on if the option were available. Being able to instantly test touch functionality while in play mode would dramatically cut down on development time for complex touch interactivity, primarily multi-touch programming.

As it is, all development is done on PC and Apple machines, but a huge amount of the final product is touch based. Unless there is some complication that prevents getting touch events in play mode on Windows (and on Macs, if Apple ever allows an OS X touch-enabled monitor), it would definitely be embraced by developers. Developing touch apps without a play mode to test code in is a real problem, and one that should be relatively easy to fix. The only reason I can see for not doing it is that it would currently be PC-specific, and Unity may not like the idea of giving the PC a feature the Mac doesn't have.

We aren’t looking for a perfect touch play mode here, but having it ignore touches completely seems like a giant unnecessary oversight.