Hi guys,
In Short: I want to use touch events on a world-space canvas independently of the viewing camera.
More Details:
- A virtual tablet (world-space Canvas) lies on a table in the virtual room.
- The user controls it through a real tablet (kind of working).
- The touch inputs are sent from the tablet to the desktop PC and piped into the Unity input system via a custom touch input module, which then drives the Unity UI. This works fine (see the sketch after this list).
- The touch visualization on the canvas also works fine.
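For context, the general idea behind the module is roughly this (a stripped-down sketch, not my exact code; `RemoteTouchInputModule` and `ProcessRemoteTap` are placeholder names):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: forward a remote touch (as a screen position) into the
// EventSystem so it behaves like a local click on the Unity UI.
public class RemoteTouchInputModule : MonoBehaviour
{
    public void ProcessRemoteTap(Vector2 screenPosition)
    {
        var eventData = new PointerEventData(EventSystem.current)
        {
            position = screenPosition
        };

        var results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(eventData, results);

        if (results.Count > 0)
        {
            // Hand the tap to the topmost UI element that was hit.
            ExecuteEvents.Execute(results[0].gameObject, eventData,
                ExecuteEvents.pointerClickHandler);
        }
    }
}
```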
The main problem is that the touches are mapped to the canvas using the main camera as the reference system. If a button is rendered in the corner of the screen, I also have to touch in the corner of the tablet, even though the button actually sits in the middle of the canvas.
How can I map the touch coordinates directly onto the canvas, independently of the camera? Something like the following is what I have in mind.
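A rough sketch of the mapping I'm after, assuming the real tablet sends normalized coordinates in [0, 1] (`canvasRect` and `eventCamera` are placeholder fields, and the conversion back to screen space is an assumption on my part):

```csharp
using UnityEngine;

// Sketch: map a normalized tablet touch onto the world-space canvas rect,
// then project it back into screen space for the event system, so the
// mapping no longer depends on where the canvas is rendered on screen.
public class CanvasTouchMapper : MonoBehaviour
{
    public RectTransform canvasRect; // the world-space canvas
    public Camera eventCamera;       // camera used by the GraphicRaycaster

    public Vector2 NormalizedToScreenPoint(Vector2 normalized)
    {
        // (0,0) = bottom-left of the canvas, (1,1) = top-right.
        Rect rect = canvasRect.rect;
        var local = new Vector2(
            Mathf.Lerp(rect.xMin, rect.xMax, normalized.x),
            Mathf.Lerp(rect.yMin, rect.yMax, normalized.y));

        Vector3 world = canvasRect.TransformPoint(local);
        return eventCamera.WorldToScreenPoint(world);
    }
}
```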
I already tried adding a second camera fixed on the canvas. That works in the editor, but as soon as I build the project, the second camera seems to be ignored, which once again results in wrong touch mappings.
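To clarify what I mean by the second camera: roughly this kind of setup, where the camera is assigned as the event camera of the world-space canvas at runtime (a simplified sketch; `canvasCamera` is a placeholder name):

```csharp
using UnityEngine;

// Sketch: assign a dedicated camera as the world-space canvas's
// "Event Camera" at runtime, so UI raycasts go through it instead
// of the main camera.
[RequireComponent(typeof(Canvas))]
public class AssignEventCamera : MonoBehaviour
{
    public Camera canvasCamera; // camera pointed straight at the canvas

    void Awake()
    {
        GetComponent<Canvas>().worldCamera = canvasCamera;
    }
}
```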
Thanks for your help!
I am very curious about your solutions and ideas.