Unity 5 multitouch input on Windows 7 touchscreen

I’m building a UI-heavy app in Unity 5. I was under the impression that multitouch for Unity UI would just work out of the box, but that doesn’t seem to be the case: once one touch is down, subsequent touches on the touchscreen don’t register. (Note: I am testing a build on the target device, not in-editor.)

Am I missing something simple? I tried disabling the Standalone Input Module and setting the Touch Input Module to allow activations on Standalone, but that didn’t seem to help.

Unfortunately, touchscreen input is not yet supported by the UI on Standalone.

OK. So, what is the purpose of the “Allow Activation On Standalone” flag on the Touch Input Module? It sounded like it was designed for exactly this situation.

Also, is this something that could be easily made to work by modifying the open source UI code? Or is it a lot more involved than I think it is?
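For anyone digging into the workaround route: before modifying the UI source, it’s worth checking whether Unity’s raw touch API even reports multiple touches on your target build, since the UI input modules sit on top of it. Here’s a minimal diagnostic sketch, assuming `Input.touchCount` is actually populated on a Windows 7 standalone build (on some Unity 5 configurations it is not, and touches arrive only as emulated mouse input, which would be the root of the problem). The class name `RawTouchLogger` is just for illustration.

```csharp
using UnityEngine;

// Hypothetical diagnostic: attach to any GameObject in the scene and
// watch the log while placing several fingers on the screen.
public class RawTouchLogger : MonoBehaviour
{
    void Update()
    {
        // Input.touchCount / Input.GetTouch are Unity's raw multitouch API,
        // independent of the EventSystem's input modules.
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch t = Input.GetTouch(i);
            if (t.phase == TouchPhase.Began)
                Debug.Log("Touch " + t.fingerId + " began at " + t.position);
        }
    }
}
```

If this logs distinct `fingerId`s for simultaneous touches, the data is there and the limitation is in the input modules, so a UI-source modification is plausible; if it only ever reports zero or one touch, the gap is lower in the platform layer and editing the UI code alone won’t fix it.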