Multiple touchscreens: multitouch on multiple input devices

Hi! First of all, I tried posting this on Unity Answers, but since I think it’s related to stuff that’s still in preview, and Answers is more about user-to-user help, I thought I might get more help here.

Here is the question, for reference, though I’ve copy-pasted everything here anyway:
https://answers.unity.com/questions/1704768/multiple-touchscreens-multitouch-on-multiple-input.html

Anyway…

I have been trying to get input from two touchscreens at the same time and detect which one each touch is coming from, in order to manage UI elements on two separate Canvases on two different displays, but it doesn’t work properly. The platform is Windows.

Basic premise of the application: it’s an interactive app that will be used simultaneously on both screens by kids, so both screens need to respond at the same time and independently. I don’t really care about multitouch as such, just that the screens respond properly and one doesn’t block the other (which is what happens if you simply use the UI with, say, a vanilla UI Button component). As it is, it feels unresponsive.

I’m using the new Input System. I first tried 0.2 since I was on Unity 2018.4 (the latest LTS available, so the more reliable option), but I also tried 1.0, which comes with Unity 2019.3. The results are the same in both.

Here’s the code from my current test. This one comes from my tests with Input System 1.0.0-preview:

using System.Collections;
using System.Collections.Generic;
using System.Linq; // for ToArray() on the Input System's device list
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.Controls;
using UnityEngine.UI;


public class TestTouch : MonoBehaviour {
    [SerializeField] int deviceId;
    [SerializeField] Text debugText;

    Image selfImage;
    Text label;
    RectTransform rt;
    //Touchscreen selfDevice;

    bool pressed;

    void Start() {
        selfImage = GetComponent<Image>();
        label = GetComponentInChildren<Text>();
        rt = (RectTransform)transform;

        // Dump every Touchscreen the Input System reports to the debug label.
        InputDevice[] allDevices = Touchscreen.all.ToArray();

        string devicesStr = "";
        foreach(InputDevice d in allDevices) {
            devicesStr += d.name + "::" + d.displayName + "::" + d.shortDisplayName + System.Environment.NewLine;
        }
        debugText.text = devicesStr;

        /*Touchscreen selfDevice = Touchscreen.current;
        selfDevice.activeTouches*/
    }

    void Update() {
        // Horizontal offsets for this screen in desktop coordinates.
        // Assumes 1920px-wide displays sitting side by side, so screen N
        // covers the x range [N * 1920, (N + 1) * 1920).
        float startOffset = 1920f * deviceId;
        float endOffset = startOffset + 1920f;
        // Check touch devices first.
        if (Touchscreen.current != null) {
            foreach (TouchControl tc in Touchscreen.current.touches) {
                // Read the touch position (desktop coordinates).
                Vector2 touchPos = tc.position.ReadValue();
                //debugText.text = touchPos.ToString();
                // Check if it's on this script's screen.
                if (touchPos.x >= startOffset && touchPos.x < endOffset) {
                    touchPos.x -= startOffset;
                    RefreshPressed(true, touchPos);
                    return;
                }
            }
        }
        // Fall back to the mouse (old input manager, so Active Input
        // Handling needs to be set to "Both" for this to work).
        bool mp = false;
        Vector2 pos = Input.mousePosition;
        if (pos.x >= startOffset && pos.x < endOffset) {
            mp = Input.GetMouseButton(0);
            pos.x -= startOffset; // same offset correction as the touch path
        }
        RefreshPressed(mp, pos);
    }

    void RefreshPressed(bool press, Vector2 position) {
        pressed = false;
        if (!press) {
            selfImage.color = Color.white;
            return;
        }
        // Manual hit test against this element's rect (assumes a
        // Screen Space - Overlay canvas, so rt.position is in pixels).
        Rect r = rt.rect;
        Vector2 start = rt.position + new Vector3(r.x, r.y);
        Vector2 end = start + new Vector2(r.width, r.height);
        if (position.x > start.x && position.x < end.x
            && position.y > start.y && position.y < end.y) {
            pressed = true;
        }
        selfImage.color = pressed ? Color.gray : Color.white;
        label.text = position.ToString();
    }
}
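
(For context on the test setup: this script is attached to an Image on each canvas, with deviceId set to 0 or 1 depending on the screen; the 1920 offsets are because both displays are Full HD and sit side by side in the extended desktop.)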

Now, the thing is, as far as I can see, the UI uses the default mouse abstraction (which I think is handled by Windows?), so any touch gets interpreted as mouse input. That lets me interact with both canvases with either mouse or touch, but not with both at the same time.
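
In case it helps to show what I’m after: with the EnhancedTouch API (which, if I’m reading the docs right, tags each active touch with the Touchscreen it came from), routing input per screen would look roughly like this. This is just a sketch; the class name is mine, and it obviously only helps if Windows actually reports two Touchscreen devices:

using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public class PerScreenTouchProbe : MonoBehaviour {
    void OnEnable() {
        // EnhancedTouch is opt-in; it isn't enabled by default.
        EnhancedTouchSupport.Enable();
    }

    void OnDisable() {
        EnhancedTouchSupport.Disable();
    }

    void Update() {
        foreach (Touch touch in Touch.activeTouches) {
            // Each touch knows its source device, so two screens could be
            // told apart here -- if they showed up as separate Touchscreens.
            Debug.Log(touch.screen.name + ": " + touch.screenPosition);
        }
    }
}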

I also tried printing all available InputDevices by iterating over Touchscreen.all, and it only shows one Touchscreen. I don’t know if that’s because Windows abstracts both screens into a single device or what, but the point is, only one screen seems to get touch support. I’m trying the new Input System precisely because better touch support was one of the things I read it was meant to bring.
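
For anyone who wants to reproduce the device listing, here’s the dump I’m using, more or less (the script name is mine; the onDeviceChange hook is there in case the second Touchscreen only appears after it’s touched):

using UnityEngine;
using UnityEngine.InputSystem;

public class InputDeviceDump : MonoBehaviour {
    void OnEnable() {
        // Log everything the Input System currently knows about.
        foreach (InputDevice device in InputSystem.devices)
            Debug.Log(device.name + " (" + device.layout + ")");

        // Also log devices that appear or disappear later.
        InputSystem.onDeviceChange += OnDeviceChange;
    }

    void OnDisable() {
        InputSystem.onDeviceChange -= OnDeviceChange;
    }

    void OnDeviceChange(InputDevice device, InputDeviceChange change) {
        Debug.Log(change + ": " + device.name);
    }
}

On my machine this only ever shows a single Touchscreen, no matter which screen I touch.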

If you help me, I’ll give you a virtual cookie. Thanks!

Did you find a solution?