Previously it was possible to add Unity XR platform support for mixed reality, so that the application could be launched inside other MR applications and UI interaction would go through the XR input system (e.g. wands/controllers).
Now a desktop application built with Unity can no longer be built without adding XR support (the UI interaction was recently moved into the XR package). However, this now FORCES the application to launch inside the Mixed Reality Portal, even when the user just wants to run the application without MR.
What can be done to get the old behaviour back, i.e. support MR use, but also allow running the application without MR?
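In case it helps to clarify what I am after: I assume the answer involves something like disabling "Initialize XR on Startup" under Project Settings > XR Plug-in Management and starting XR manually only when the user opts in, roughly like the sketch below (using the UnityEngine.XR.Management lifecycle API; the component name and the opt-in flag are just placeholders for illustration):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

// Sketch: start XR only when the user explicitly opts in,
// otherwise keep running as a plain desktop application.
public class OptionalXRBootstrap : MonoBehaviour
{
    // Placeholder for however MR mode would be chosen
    // (command-line argument, settings menu, etc.).
    public bool userWantsMR;

    IEnumerator Start()
    {
        if (!userWantsMR)
            yield break; // stay in normal desktop mode, no MR portal

        // Assumes "Initialize XR on Startup" is disabled in
        // Project Settings > XR Plug-in Management.
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();

        if (XRGeneralSettings.Instance.Manager.activeLoader == null)
        {
            Debug.LogError("XR loader initialization failed; continuing without MR.");
            yield break;
        }

        XRGeneralSettings.Instance.Manager.StartSubsystems();
    }

    void OnDestroy()
    {
        var manager = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager
            : null;

        // Shut XR down cleanly if it was started.
        if (manager != null && manager.isInitializationComplete)
        {
            manager.StopSubsystems();
            manager.DeinitializeLoader();
        }
    }
}
```

Is this the intended approach, or does the dependency on the XR package force MR/Portal launch regardless of whether XR is initialized at startup?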