Desktop and VR Event System?

Hey! So I’m working on a VR project that also needs a second user to interact with UI elements on the 2D monitor.
Everything works fine in VR (I followed this tutorial), but when I add a Screen Space - Overlay canvas for the desktop UI elements, along with a new default event system, the desktop UI stops receiving mouse input.

I know this has to do with the fact that I’m using two event systems (the desktop works if I disable the VR pointer), but I need both running and working at the same time. Is this possible? Is there a better way to handle VR pointer input so that both will work?
I can’t find any info about this, which is strange because I know there are games with similar functionality. I already have a ton of UI (buttons, sliders, etc.) in VR, so I would really rather not swap it all out for something new.

Thanks!

This is old but I found a solution that may help others:

Right after posting my comment on the OP, I had a thought and started looking at what was on my event system, and at what happened when the scene started without an event system in place. It turns out that something in XR Interaction Toolkit 2.0.0 (in Unity 2020.3.29, at least), or another script in the scene, was generating an event system that conflicted with the one created alongside the canvas.

The event system generated by the GUI had an “InputSystemUIInputModule” script on it, while the auto-generated one uses “XRUIInputModule”. Noting this, I changed my event system to use the proper module. Now when I open the scene it no longer spawns a second, conflicting event system, and I can place the event system in the scene myself without relying on Unity to generate one.
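For anyone who wants to enforce this from a script rather than the Inspector, here is a minimal sketch of the fix described above. It assumes XR Interaction Toolkit 2.0.0 with the new Input System package installed; the component name `EventSystemModuleFix` is hypothetical, but `XRUIInputModule` and `InputSystemUIInputModule` are the real module classes mentioned in the post:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.InputSystem.UI;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Hypothetical helper: attach to the single EventSystem object you place
// in the scene, so Unity/XRIT has no reason to generate a second one.
[RequireComponent(typeof(EventSystem))]
public class EventSystemModuleFix : MonoBehaviour
{
    void Awake()
    {
        // Remove the module the Canvas creation wizard tends to add.
        var uiModule = GetComponent<InputSystemUIInputModule>();
        if (uiModule != null)
            Destroy(uiModule);

        // Ensure the XR module is present instead, so the VR ray
        // interactors drive the UI through this one event system.
        if (GetComponent<XRUIInputModule>() == null)
            gameObject.AddComponent<XRUIInputModule>();
    }
}
```

This only swaps the input module, exactly as in the manual fix; whether mouse input on the desktop canvas then flows through `XRUIInputModule` as well depends on its input settings in your XRIT version, so verify that in the Inspector.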