Hey everyone!
I am somewhat new to Unity, and I am currently struggling with implementing touch-screen controls for my camera, which already works fine with mouse input.
To test what is going on, I wrote a very simple script and attached it to my camera:
using UnityEngine;
using System.Collections;

public class TouchinputTest : MonoBehaviour
{
    private Camera cam;

    void Start()
    {
        cam = GetComponent<Camera>();

        if (Input.multiTouchEnabled)
        {
            // this happens in the editor, but not in the build?!
            cam.backgroundColor = Color.magenta;
            Debug.Log("Multi-touch reported as enabled");
        }
    }

    void Update()
    {
        int fingerCount = 0;
        foreach (Touch touch in Input.touches)
        {
            if (touch.phase != TouchPhase.Ended
                && touch.phase != TouchPhase.Canceled)
            {
                fingerCount++;
            }
        }

        if (fingerCount > 0)
        {
            // this never happens
            cam.backgroundColor = Color.red;
        }

        if (Input.GetMouseButtonDown(0))
        {
            // this works fine
            cam.backgroundColor = Color.green;
        }

        if (Input.GetMouseButtonDown(1))
        {
            // this works fine as well
            cam.backgroundColor = Color.blue;
        }
    }
}
So when I run the above code in the Unity Editor, the backdrop turns magenta, even though my development machine is a Windows 8 laptop that does not have a touch screen (but it has a multi-touch capable touch pad, if that matters). When I run the code in the actual release (or development) build, it never turns magenta, not even on a test Windows 8 device with a (multi-)touch screen.
The red color change (the fingerCount check) never happens on either of my two test devices, neither in the Editor nor in the build.
I added the color changes for mouse button presses as a sanity check, and those work fine. Most interestingly, the mouse-input color change also gets executed when I touch the screen on the touch-screen laptop.
But even though the touch seems to be interpreted as a mouse button press, touch-drag does not work in my actual camera control code.
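For reference, this is roughly the kind of touch-drag camera pan I was trying to get working. This is only a minimal sketch; the `panSpeed` value and the translation axes are placeholders, not my actual camera code:

```csharp
using UnityEngine;

// Minimal sketch of a one-finger touch-drag camera pan.
// panSpeed and the axis mapping are placeholder assumptions.
public class TouchDragPan : MonoBehaviour
{
    public float panSpeed = 0.01f;

    void Update()
    {
        // One finger moving across the screen pans the camera.
        if (Input.touchCount == 1)
        {
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Moved)
            {
                // deltaPosition is the finger movement since the last frame, in pixels.
                Vector2 delta = touch.deltaPosition;
                // Move the camera opposite to the finger so the scene follows the drag.
                transform.Translate(-delta.x * panSpeed, -delta.y * panSpeed, 0f);
            }
        }
    }
}
```

On the Windows 8 laptop, `Input.touchCount` never goes above zero in my desktop builds, so this code never fires.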
Even more interestingly, the default(?) pinch multi-touch behavior for the zoom seems to work somehow (I guess it gets mapped to the mouse wheel?).
Anyway, all this is super bizarre and I can't make any sense of it. Could it be that the touch device on my Windows 8 test machine is not configured correctly? Am I missing something? Is there a build or input setting that needs to be set?
Thanks in advance for your help!
Thanks for the quick reply! Unfortunately, your suggestion did not change anything. The screen still only turns green when I touch it (as if it were a mouse input and not a touch input).
What happens if you remove everything but the two mouse button checks? I believe you should get a blue screen with two touches on a touch screen, since two simultaneous touches simulate a right-click.
The code works fine in my tests. The screen does go magenta in the editor, but the touch is detected on my Android device (red screen). If I do as I suggested above, I get a green screen with one touch and a blue screen with two. I also get a magenta screen on startup.
Hmm, interesting. Maybe it just does not work on Windows 8 touch screens? I can never get a red screen when I touch the Windows 8 laptop!
Removing everything but the mouse inputs would not help: I started implementing the touch controls in the first place because my mouse controls would not work with a touch screen. This is leading me around in circles, and I am starting to wonder whether I am missing something fundamental.
I can get left- and right-click mouse button presses from the touch screen (green and blue screen changes). Right-click works when I touch the screen and keep holding for a few seconds. I can even do a pinch, which somehow is interpreted as the scroll-wheel movement I set up for my camera zoom. What does not work is left-click plus drag: the LMB press registers when I touch the screen, but the drag does not do anything. That is why I tried to implement real touch controls, but they don't work either, because the device does not even seem to register as a touch screen (even though it clearly is one).
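To show what I mean by "real touch controls" for the zoom: instead of relying on the pinch being mapped to the mouse wheel, I wanted to read the two fingers directly. A rough sketch of that idea (the `zoomSpeed` value and the orthographic-camera assumption are mine, not from my actual project):

```csharp
using UnityEngine;

// Sketch of a two-finger pinch zoom read directly from Input.touches,
// instead of relying on the OS mapping the pinch to the mouse wheel.
// Assumes an orthographic camera; zoomSpeed is a placeholder value.
public class TouchPinchZoom : MonoBehaviour
{
    public float zoomSpeed = 0.02f;
    private Camera cam;

    void Start()
    {
        cam = GetComponent<Camera>();
    }

    void Update()
    {
        if (Input.touchCount == 2)
        {
            Touch t0 = Input.GetTouch(0);
            Touch t1 = Input.GetTouch(1);

            // Finger distance now versus one frame ago
            // (previous position = current position minus this frame's delta).
            float prevDistance = ((t0.position - t0.deltaPosition) -
                                  (t1.position - t1.deltaPosition)).magnitude;
            float currDistance = (t0.position - t1.position).magnitude;

            // Fingers moving apart -> smaller orthographic size -> zoom in.
            cam.orthographicSize = Mathf.Max(0.1f,
                cam.orthographicSize - (currDistance - prevDistance) * zoomSpeed);
        }
    }
}
```

But again, since `Input.touchCount` stays at zero in my desktop builds on the Windows 8 machine, none of this ever runs.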
Hmm, if that were the case, then I would be able to control my camera with the touch screen just fine using the very code I have been using for mouse controls, but it does not work.
Alright, so I got it to work! The correct answer is: you have to build and deploy it as a Windows Store App.
Then it works with the touch screen on the Windows 8 laptop. The whole process of building for the Windows Store is a major pain, to say the least! It makes development (and deployment of limited, personal releases) very slow and painful. So if there were a way to make touch work with Windows desktop applications, I would be much happier. Can anyone here add something to this?
Thanks!
So, another update: I am happy to report that I got the touch inputs to work with Valentin Simonov's awesome (and free) TouchScript library.
It is much better than the native capabilities, and it works with desktop applications on Windows 7 and Windows 8 (and on all other devices).
So for everyone who discovers this thread while looking for a solution to the same problem: this is the best solution I have found so far.
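To give an idea of how TouchScript is used: instead of polling `Input.touches`, you add gesture components to your objects and subscribe to their events. A rough sketch, based on the TouchScript version I used (component and event names may differ in other versions, so treat this as an outline rather than copy-paste code; it assumes a `TapGesture` component is attached to the same GameObject):

```csharp
using UnityEngine;
using TouchScript.Gestures;

// Outline of TouchScript's gesture-event pattern: a TapGesture component
// on this GameObject raises an event when it recognizes a tap.
// Names are from the TouchScript version I used and may vary.
public class TouchScriptTapExample : MonoBehaviour
{
    void OnEnable()
    {
        GetComponent<TapGesture>().Tapped += OnTapped;
    }

    void OnDisable()
    {
        GetComponent<TapGesture>().Tapped -= OnTapped;
    }

    private void OnTapped(object sender, System.EventArgs e)
    {
        Debug.Log("Tap recognized via TouchScript");
    }
}
```

The important part is that TouchScript handles the Windows 7/8 desktop touch input itself, so the gestures fire in plain desktop builds, which is exactly what the native `Input.touches` API would not do for me.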
Thanks to everyone who tried to help! I think this thread can be closed now.