We are creating multiple installations with multiple touchscreens attached to a single Windows machine, and so far we have had no luck getting multiple touch-enabled displays to work with Unity’s input systems.
Reading the Managing multiple touchscreens setup thread suggests the new Input System doesn’t support this, and the old input system doesn’t handle multiple touchscreens well either.
What would be a good route to develop this ourselves with a native plugin we write ourselves? Would creating Touchscreen state structs and feeding them into the event buffer with InputSystem.QueueStateEvent(...) still be the best approach?
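A minimal sketch of what I have in mind, assuming the native plugin reports raw touches per display through a callback (the device naming, the two-display count, and the int-to-TouchPhase mapping here are all hypothetical placeholders):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.LowLevel;

public class NativeTouchBridge : MonoBehaviour
{
    // One virtual Touchscreen device per physical display (hypothetical mapping).
    Touchscreen[] screens;

    void OnEnable()
    {
        screens = new Touchscreen[2]; // assumption: two attached displays
        for (int i = 0; i < screens.Length; i++)
            screens[i] = InputSystem.AddDevice<Touchscreen>($"NativeTouchscreen{i}");
    }

    // Hypothetical entry point invoked with raw touch data from the native plugin.
    public void OnNativeTouch(int displayIndex, int touchId, int phase, float x, float y)
    {
        // Touchscreen accepts state events in TouchState format and merges
        // them into its touch array.
        InputSystem.QueueStateEvent(screens[displayIndex], new TouchState
        {
            touchId = touchId,
            phase = (TouchPhase)phase, // assumes native phase values match the enum
            position = new Vector2(x, y),
        });
    }

    void OnDisable()
    {
        foreach (var s in screens)
            if (s != null)
                InputSystem.RemoveDevice(s);
    }
}
```

The open question is whether routing touches through per-display virtual Touchscreen devices like this is the intended path, or whether there is a better hook for multi-display touch.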
I’m almost done building this with TouchScript and a custom native backend (it still needs some work in the editor, but at runtime we have multi-touch across multiple displays), but proper InputSystem support would be nice!