Hi all. In the Linux editor, my game works perfectly. Once I build for 64-bit or universal, the build doesn’t seem to register touch-screen events. When I touch the screen, a keyboard comes up instead.
Did I miss a build flag somewhere?
Thank you in advance,
I guess you should provide some more information, for example how you handle touch events. It works for me: I’m using mouse-up/down conditionals in the update function, and I’ve successfully built an Android app that works without issues. What version of the editor are you using? I’ve also built the universal Linux version and it works as well, so…
I’m using the 2018.1.0b8 Linux build on a Fedora 27 OS.
I inherited this game, and I’m still wading through the code, but found this comment in the shot manager code:
// looks for input on mobile or standalone/editor
void checkInput()
{
#if UNITY_ANDROID || UNITY_IOS
    if (Input.touchCount > 0)
    {
        Ray ray = c.ScreenPointToRay(Input.touches[0].position);
        RaycastHit hit = new RaycastHit();
        if (Physics.Raycast(ray, out hit, 1000))
        {
            Debug.DrawLine(cpos, hit.point);
            // cache oneSpawn object in spawnPt, if not cached yet
            GameObject projectile = Instantiate(shotPrefab, new Vector3(cpos.x, cpos.y, cpos.z), Quaternion.identity);
            // turn the projectile to hit.point
            projectile.transform.LookAt(hit.point);
            // accelerate it
            projectile.GetComponent<Rigidbody>().velocity = projectile.transform.forward * 50;
        }
    }
#endif
#if UNITY_STANDALONE || UNITY_EDITOR
    if (Input.GetMouseButtonDown(0))
    {
        Ray ray = c.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit = new RaycastHit();
        if (Physics.Raycast(ray, out hit, 1000))
        {
            Debug.DrawLine(cpos, hit.point);
            // cache oneSpawn object in spawnPt, if not cached yet
            GameObject projectile = Instantiate(shotPrefab, new Vector3(cpos.x, cpos.y, cpos.z + 1f), Quaternion.identity);
            // turn the projectile to hit.point
            projectile.transform.LookAt(hit.point);
            // accelerate it
            projectile.GetComponent<Rigidbody>().velocity = projectile.transform.forward * 50;
        }
    }
#endif
}
This looks like the cause of my problem. I was not expecting it to be intentional.
What is the proper precompile directive for the Unity Linux variant, and what’s the proper way to activate this code path for a Linux build?
Thanks again,
Gary
If I were you, I’d comment out the Android/iOS part for now, remove all compiler directives, and see whether it works or not. Just make a backup first. Besides, 90% of the code is repeated, so I’d wrap only the if statement, or create a flag. I do it that way; it’s easier to debug.
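To illustrate the flag idea, here is a rough, untested sketch (it assumes your existing `c`, `cpos`, and `shotPrefab` fields): only the input read is wrapped in directives, and the raycast/spawn logic lives in one place.

```csharp
void checkInput()
{
    bool fired = false;
    Vector3 screenPos = Vector3.zero;

#if UNITY_ANDROID || UNITY_IOS
    // mobile: take the first touch
    if (Input.touchCount > 0)
    {
        fired = true;
        screenPos = Input.touches[0].position;
    }
#else
    // standalone/editor: take the left mouse button
    if (Input.GetMouseButtonDown(0))
    {
        fired = true;
        screenPos = Input.mousePosition;
    }
#endif

    if (!fired)
        return;

    // shared path: raycast from the camera and fire a projectile
    Ray ray = c.ScreenPointToRay(screenPos);
    RaycastHit hit = new RaycastHit();
    if (Physics.Raycast(ray, out hit, 1000))
    {
        Debug.DrawLine(cpos, hit.point);
        GameObject projectile = Instantiate(shotPrefab, cpos, Quaternion.identity);
        projectile.transform.LookAt(hit.point);
        projectile.GetComponent<Rigidbody>().velocity = projectile.transform.forward * 50;
    }
}
```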
Thanks for the refactoring tips. I will take care of the inefficiencies later.
Can you please answer the question: What are the specific LINUX build precompile directives that I can use going forward, here or elsewhere in code? Is there a UNITY_LINUX precompile directive?
I guess in your case you don’t need a specific directive; a simple #else would be enough. I don’t know if there is one… By the way, my code is a total work in progress; these calls are probably random stuff from Stack Overflow or the Unity forums, though I got the “rotateAround” call from the official docs.
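For what it’s worth, Unity does define per-platform symbols for standalone builds, and as far as I know UNITY_STANDALONE_LINUX is one of them (alongside UNITY_STANDALONE_WIN and UNITY_STANDALONE_OSX), so if you ever do need a Linux-only branch, something like this should compile:

```csharp
#if UNITY_STANDALONE_LINUX
    // code that only runs in the Linux standalone player
#else
    // everything else (editor, other platforms)
#endif
```

That said, for your case the generic UNITY_STANDALONE your code already uses should cover the Linux player too.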
I believe that touch input is not supported in the Linux player.
I think this may be coming (at least mouse emulation for touch input) in Unity 2018.2.
I am using Unity 2018.3.0 with an eGalaxTouch in Linux, and just like the original post stated, I can’t seem to get touch events in a build running on Ubuntu. Did I miss anything?
@Schubkraft What is the current status of touch support in Linux?
Similar to dilmer, we use an eGalaxTouch display with Linux (Linux Mint 17). We would like to port our software from Unity 5.5.0 to Unity 2018.3 but can’t do so at the moment, because dragging does not work at all with the touch display. We do not actually need touch events; instead, we would like the touch display to behave exactly like a mouse. However, as stated before, dragging (press & hold) does not work properly.
I am also suffering from the same issue. Pressed (default button script) works; however, touch (as click) does not work properly. Surprisingly, while I am moving the mouse and touching the screen at the same time, the touch (as click) event works.
It seems we are suffering from the same issue. Our (2017.4 LTS) apps are broken on Linux on a Latitude 3340 with an ELAN touchscreen, I suspect because of the above.
I’ve used xev to verify that the trackpad and the touchscreen produce different X11 events, and while clicks work, any kind of dragging doesn’t. This seems very similar to @erengokgur’s description.
I’m also suffering from this problem. The company I work for sells touch-screen tables for kids with educational games; they currently run Windows 8 IoT, and we want to change the OS to Linux, but most of our partners are developing games with Unity 2018.3+. Will this be fixed in Unity 2019.3, or is it a 2020+ issue for Unity?
Hey again, I looked into where this got stuck. We unplugged the pipe and it is now moving forward again. Will keep you updated as to when it will happen. Sorry (yet again) for the delay.