Is it possible to render UI toolkit elements behind a UGUI canvas that is in screen space camera?

Sorry, I should have said render it on a “plane”, like the primitive from Unity.

There are two main issues with asking UI Toolkit to render to a texture that you need to handle yourself:

  • Event processing order
  • Converting the screen space positions

Event processing order

It all depends on what you need to achieve. If UGUI and UI Toolkit never overlap on screen, or if they don’t render at the same time, you’re fine.

Otherwise, my conclusion, after spending quite a bit of time looking at the use case of “UI Toolkit behind UGUI when using Screen Space - Camera”, is that there is no way to make it work properly for the general use case.
You’d have to find a way to temporarily disable event processing in UI Toolkit while UGUI is on top, for example by calling SetEnabled(false) on the root element, or by adding a full-screen overlay element on top of everything (note that the overlay only blocks events when it is pickable, i.e. the default pickingMode = Position; pickingMode = Ignore would let events pass through it).
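Here is a minimal sketch of the SetEnabled approach; the uguiIsOnTop flag is a hypothetical stand-in for whatever logic your project uses to detect that the UGUI canvas is covering the panel:

using UnityEngine;
using UnityEngine.UIElements;

public class UIToolkitEventToggle : MonoBehaviour
{
    public UIDocument uiDocument;

    // Hypothetical flag: set it from your own logic whenever the
    // UGUI canvas is covering the UI Toolkit panel.
    public bool uguiIsOnTop;

    void Update()
    {
        if (uiDocument == null || uiDocument.rootVisualElement == null)
            return;

        // Disabling the root element stops the whole hierarchy from
        // reacting to pointer events; note that SetEnabled(false) also
        // applies the :disabled styling while it is active.
        uiDocument.rootVisualElement.SetEnabled(!uguiIsOnTop);
    }
}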

Converting the screen space positions

This is necessary to turn points in screen space into the UI Toolkit panel’s local space so that events can be processed correctly, for example when the RenderTexture size doesn’t match the size of the UI, or when the RenderTexture is applied to a mesh in the scene.

This is achieved by implementing a function like Vector2 ScreenCoordinatesToRenderTexture(Vector2 screenPosition) and assigning it with panelSettings.SetScreenToPanelSpaceFunction(ScreenCoordinatesToRenderTexture);.

Here is an example implementation of this function, which assumes you have a MeshRenderer set up to display the render texture:

using System;
using UnityEngine;
using UnityEngine.UIElements;

public class PanelOnPlaneHelper : MonoBehaviour
{
    public PanelSettings panelSettings;
    public Camera targetCamera;
  
    void OnEnable()
    {
        if (panelSettings != null)
        {
            panelSettings.SetScreenToPanelSpaceFunction(ScreenCoordinatesToRenderTexture);
        }
    }

    void OnDisable()
    {
        if (panelSettings != null)
        {
            panelSettings.SetScreenToPanelSpaceFunction(null);
        }
    }

    private Vector2 ScreenCoordinatesToRenderTexture(Vector2 screenPosition)
    {
        var invalidPosition = new Vector2(float.NaN, float.NaN);

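        // The position passed in has its origin at the top-left of the
        // screen, while ScreenPointToRay expects a bottom-left origin,
        // so flip y before casting the ray.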
        screenPosition.y = Screen.height - screenPosition.y;
        var cameraRay = targetCamera.ScreenPointToRay(screenPosition);

        if (!Physics.Raycast(cameraRay, out var hit))
        {
            return invalidPosition;
        }

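        // Check that the object we hit is the one displaying this
        // panel's render texture.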
        var targetTexture = panelSettings.targetTexture;
        MeshRenderer rend = hit.transform.GetComponent<MeshRenderer>();

        if (rend == null || rend.sharedMaterial.mainTexture != targetTexture)
        {
            return invalidPosition;
        }

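        // textureCoord is only valid when the ray hits a MeshCollider;
        // other collider types return (0, 0) here.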
        Vector2 pixelUV = hit.textureCoord;

        //the UV origin is at the bottom-left while panel space starts at the top-left, so we flip y
        pixelUV.y = 1 - pixelUV.y;

        pixelUV.x *= targetTexture.width;
        pixelUV.y *= targetTexture.height;

        return pixelUV;
    }
}
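To use it, assign the PanelSettings asset (with its Target Texture set to your RenderTexture) and the camera in the Inspector, and give the plane a MeshCollider, since the textureCoord lookup above only works with mesh colliders. The plane’s material must also use the render texture as its mainTexture, because that is how the script matches the hit object to the panel.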

Hope this helps.

I am working on a more general sample to explain this with assets, a Scene, etc.
