Many indie games built with Unity are broken on Windows touch-screen devices (MS Surface, HP x360, Lenovo Yoga, etc.). In these games touch input (taps, drags, etc.) produces no response at all, not even the emulated mouse events you would expect. While many games do require hover events or both left- and right-clicking, quite a few (more than half, in my estimation) would be fully playable if taps and drags simply produced the equivalent left-mouse-button events. But somehow Unity prevents this sensible default behavior. Now, a professional team would have the resources to test on touch-screen devices and even to write new control schemes that properly support touch input. But indie developers need sensible defaults, because they are unlikely to even have access to a touch screen.
Indeed, on some versions of Unity touch “Just Works”, but I suspect progress has actually gone in the wrong direction, with the new Input System shipping defaults that break touch! In my own game, which uses legacy input, UI elements work fine with touch, receiving the appropriate “click” events. But I can’t be completely sure of the cause, as I haven’t been able to reproduce the bug in my own code, even though tons of games on Steam have the issue. Looking at the build files, the games with this problem all use Unity.InputSystem.dll (the exact version doesn’t seem to matter; 1.7.0 is one example). The games that work properly (mine included) use UnityEngine.InputModule.dll and UnityEngine.InputLegacyModule.dll.
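For anyone curious which backend their own project builds against, Unity sets compile-time symbols from the “Active Input Handling” player setting, and those line up with the DLLs above. A small sketch (the class name InputBackendProbe is mine; note both symbols are defined if a project selects “Both”):

```csharp
using UnityEngine;

public static class InputBackendProbe
{
    // Logs which input backend this build was compiled against.
    // ENABLE_INPUT_SYSTEM / ENABLE_LEGACY_INPUT_MANAGER are defined by
    // Unity from the "Active Input Handling" player setting.
    public static void LogActiveBackend()
    {
#if ENABLE_INPUT_SYSTEM
        Debug.Log("New Input System package (Unity.InputSystem.dll) is active.");
#endif
#if ENABLE_LEGACY_INPUT_MANAGER
        Debug.Log("Legacy Input Manager (UnityEngine.InputLegacyModule.dll) is active.");
#endif
    }
}
```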
From my free time as a gamer, I would estimate that about 30% of Unity games on Steam are broken in this way. Best of all would be if Unity themselves fixed this and released a new DLL. But failing that, if there is a simple configuration change, or a few lines of code that properly configure the input system on startup, then many, many games could be fixed.
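For what it’s worth, here is the kind of startup code I have in mind. It is only a sketch under assumptions I can’t verify, since I haven’t reproduced the bug myself: it assumes the affected game uses the Input System package, and all it does is log the devices the system sees and re-enable the Touchscreen device if it exists but is disabled. The class name and the choice of load hook are illustrative.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.EnhancedTouch;

public static class TouchStartupCheck
{
    // Runs once, automatically, after the first scene loads.
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.AfterSceneLoad)]
    static void Init()
    {
        // List every device the Input System picked up, so a tester with
        // a touch screen can confirm whether a Touchscreen device exists.
        foreach (var device in InputSystem.devices)
            Debug.Log($"Input device: {device.displayName} (enabled: {device.enabled})");

        // If a touchscreen exists but was created disabled, enable it.
        if (Touchscreen.current != null && !Touchscreen.current.enabled)
            InputSystem.EnableDevice(Touchscreen.current);

        // EnhancedTouch exposes touches for polling via
        // UnityEngine.InputSystem.EnhancedTouch.Touch.activeTouches.
        EnhancedTouchSupport.Enable();
    }
}
```

If a tester’s log shows no Touchscreen device at all, the problem is upstream of any in-game configuration, which would point back at the DLL itself.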