I am currently migrating my project’s UI from UGUI to UI Toolkit. We’re planning an incremental migration, so the two need to co-exist until the entire UI is migrated. Currently the UGUI canvas is rendered in Screen Space - Camera mode using a UI FX camera. I converted some of the UI to UI Toolkit, but it always seems to be rendered in front of the existing UGUI canvas, and I want it to be rendered behind the canvas. I tried playing with the sorting order of the canvas, the Panel Settings asset, and the UIDocument, and none of it helped.
As far as I know, the Sort Order settings are compatible with UGUI only in “Overlay” mode, not in “Camera” mode.
I believe a workaround might be to render the UI Toolkit content through a Render Texture first, then render it on a quad in the Scene. The event ordering might be tricky to get right though. Do you need the UI Toolkit UI to receive events that aren’t handled by UGUI first?
For most cases, the UGUI objects would be hidden when I need to interact with the HUD (which is what I’m rendering with the UI Toolkit now). And, when they’re visible, they would be blocking the HUD anyway and there isn’t a need to interact with the HUD. So, I think that won’t be a problem.
And, I’m not sure what you mean by a quad? I’ve only used the Render Texture on a RawImage component, and I’m guessing the RawImage component cannot receive any events. Is there a way to render the UI through a Render Texture and also make it receive all the events?
Sorry, I should have said render it on a “plane”, like the primitive from Unity.
There are 2 main issues with asking UI Toolkit to render to a texture that you need to handle yourself:
Event processing order
Converting the screen space positions
Event processing order
It all depends on what you need to achieve for this. If UGUI and UI Toolkit never overlap on screen or if they don’t render at the same time, you’re fine.
Otherwise, my conclusion, after spending quite a bit of time looking at the use case of “UI Toolkit behind UGUI when using Screen Space - Camera”, is that there is no way to make it work properly for the general use case.
I think you’d have to find ways to disable the event processing in UI Toolkit temporarily when UGUI is on top (for example by using SetEnabled(false) or adding an overlay with pickingMode=Ignore on top of everything).
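As a minimal sketch of the SetEnabled approach (the component and method names here are assumptions, not part of any built-in API; note that SetEnabled(false) also applies a disabled visual style to the panel, which may or may not be acceptable):

```csharp
using UnityEngine;
using UnityEngine.UIElements;

// Sketch: temporarily disable a UI Toolkit panel's event processing while
// a UGUI screen is shown on top of it, and restore it when that closes.
public class HudEventToggle : MonoBehaviour
{
    public UIDocument hudDocument;

    // Call with 'true' when a full-screen UGUI panel opens, 'false' when it closes.
    public void SetUguiOnTop(bool uguiOnTop)
    {
        // SetEnabled(false) stops the root (and all its children) from
        // receiving events; it also greys out the panel's visuals.
        hudDocument.rootVisualElement.SetEnabled(!uguiOnTop);
    }
}
```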
Converting the screen space positions
This is necessary to turn points in screen space into the local space of the UI Toolkit panel, so that events can be processed.
For example, if the RenderTexture size doesn’t match the size of the UI, or if the RenderTexture is applied to some mesh in the scene.
This is achieved by implementing a function like Vector2 ScreenCoordinatesToRenderTexture(Vector2 screenPosition) and assigning it with panelSettings.SetScreenToPanelSpaceFunction(ScreenCoordinatesToRenderTexture);.
Here is an example implementation of this function, which assumes you have a MeshRenderer set up to display the render texture:
using System;
using UnityEngine;
using UnityEngine.UIElements;

public class PanelOnPlaneHelper : MonoBehaviour
{
    public PanelSettings panelSettings;
    public Camera targetCamera;

    void OnEnable()
    {
        if (panelSettings != null)
        {
            panelSettings.SetScreenToPanelSpaceFunction(ScreenCoordinatesToRenderTexture);
        }
    }

    void OnDisable()
    {
        if (panelSettings != null)
        {
            panelSettings.SetScreenToPanelSpaceFunction(null);
        }
    }

    // Maps a screen position onto the panel's render texture. Returns NaN
    // coordinates when the position does not hit the mesh displaying it,
    // which tells the panel to treat the position as outside of it.
    private Vector2 ScreenCoordinatesToRenderTexture(Vector2 screenPosition)
    {
        var invalidPosition = new Vector2(float.NaN, float.NaN);

        // Event positions use a top-left origin; ScreenPointToRay expects
        // bottom-left, so flip y before raycasting.
        screenPosition.y = Screen.height - screenPosition.y;
        var cameraRay = targetCamera.ScreenPointToRay(screenPosition);

        RaycastHit hit;
        if (!Physics.Raycast(cameraRay, out hit))
        {
            return invalidPosition;
        }

        // Only accept hits on the mesh that displays this panel's texture.
        var targetTexture = panelSettings.targetTexture;
        MeshRenderer rend = hit.transform.GetComponent<MeshRenderer>();
        if (rend == null || rend.sharedMaterial.mainTexture != targetTexture)
        {
            return invalidPosition;
        }

        // Convert the hit's UV coordinates into pixel coordinates on the texture.
        Vector2 pixelUV = hit.textureCoord;

        // Since y screen coordinates are usually inverted, we need to flip them.
        pixelUV.y = 1 - pixelUV.y;
        pixelUV.x *= targetTexture.width;
        pixelUV.y *= targetTexture.height;

        return pixelUV;
    }
}
Hope this helps.
I am working on a more general sample to explain this with assets, a Scene, etc.
Hi, thanks for this. But I felt this was too complex for a temporary workaround. So I ended up switching the UI Toolkit UI to a kind of disabled mode by rendering it on a RawImage whenever UGUI screens need to be overlaid over the UI Toolkit panel. But unfortunately it has its own set of problems.
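For reference, a minimal sketch of that RawImage approach (the component and field names are my own; it assumes the PanelSettings asset has a RenderTexture assigned as its Target Texture, and that a panel displayed this way receives no events):

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.UIElements;

// Sketch: display a UI Toolkit panel's render texture on a UGUI RawImage.
// Assumes the PanelSettings asset renders into a RenderTexture via its
// Target Texture setting.
public class PanelToRawImage : MonoBehaviour
{
    public PanelSettings panelSettings;
    public RawImage rawImage;

    void OnEnable()
    {
        // Show whatever the panel renders into its target texture.
        rawImage.texture = panelSettings.targetTexture;
    }
}
```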
So, I decided to switch the UGUI canvas to Screen Space - Overlay mode and migrate the problematic areas first. The sorting order works as expected, but the UGUI objects don’t receive any events at all. This is the setup I have so far:
Even when a UGUI panel with sort order 4 is visible (covering the entire screen), it does not receive any events; the UI Toolkit panel beneath it receives them instead. Is there a way to get the events to work properly?
If not, what’s the preferred way to do the UI migration? Is there no way for UGUI and UI Toolkit to co-exist?
If you can confirm your event setup matches what is described on this page under “UI Toolkit elements and uGUI components with …”, then it looks like a bug.
Possibly UI Toolkit panels are blocking events for UGUI panels below them because they have a fully transparent root taking the whole screen. This should be set as pickingMode=Ignore.
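As a sketch of that (the component name is an assumption), marking the panel’s full-screen root as non-pickable looks like:

```csharp
using UnityEngine;
using UnityEngine.UIElements;

// Sketch: let pointer events pass through a UI Toolkit panel's full-screen
// root so that UGUI canvases behind it can receive them. Child elements
// keep their own picking mode, so interactive controls still work.
public class IgnoreRootPicking : MonoBehaviour
{
    void OnEnable()
    {
        var root = GetComponent<UIDocument>().rootVisualElement;
        root.pickingMode = PickingMode.Ignore;
    }
}
```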
But that wouldn’t explain why the UGUI Canvas on top wouldn’t get events.
Hey, turns out my child canvases were missing a GraphicRaycaster component. I didn’t know each individual canvas required its own GraphicRaycaster. It’s working fine after I added them. Thanks for the help!
Hi, I’ve run into another problem unfortunately. The system works completely fine in the Editor, but the UI Toolkit elements do not receive any events at all in the standalone build. I noticed that the PanelSettings game object with the PanelEventHandler and PanelRaycaster components is not created at startup in the standalone build. It only gets created if I disable and re-enable the UIDocument after startup. How can I make the PanelSettings game object always be created at startup? Or is it possible to create it manually and just keep it in the scene?
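Based on that observation, one possible workaround sketch (the component and field names are assumptions) is to toggle the UIDocument once at startup so the runtime panel objects get created in the build:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UIElements;

// Workaround sketch: force the runtime panel objects (PanelEventHandler /
// PanelRaycaster) to be created in a standalone build by disabling and
// re-enabling the UIDocument one frame after startup.
public class UIDocumentKick : MonoBehaviour
{
    public UIDocument document;

    IEnumerator Start()
    {
        document.enabled = false;
        yield return null; // wait one frame before re-enabling
        document.enabled = true;
    }
}
```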
If each world space UI Toolkit panel needs its own Render Texture and GraphicRaycaster component, isn’t that going to cause performance issues fairly fast? Would it be better to use UI Toolkit for a static overlay HUD while using UGUI in world space behind the HUD? Is the performance the same either way? It seems like world space and z-index need to be worked out before the new UI Toolkit can really replace UGUI.
Hi, was this ever done? I’m looking for something similar: I’d like to create sprite animations on a canvas in front of a fullscreen UIDocument. The animations don’t need to be interactable, so I’m not worried about the canvas receiving events. My knowledge of render textures and the like is quite limited, so a full-blown example would be fantastic. Thanks
I came across this thread when having problems getting a Canvas and a UIDocument on a component to work properly. Thanks to this post, I discovered that the sort order on the UIDocument component does not affect the overlay sort order; it seems to only control the ordering of UIDocuments within the same panel. The setting that interacts with the canvas is the Sort Order in the Panel Settings asset. The Unity documentation indicates that sort order is a factor, but doesn’t make it clear where the setting is.
RenderTexture only works satisfactorily if you have no UI elements following a world position; otherwise it will create a lag that will be very noticeable, sadly.