Use UI that is displayed only by a RenderTexture

Hello Community,

Due to the need for some special effects, I'd like to render UGUI elements onto a render texture and then display that render texture at some other location on the screen (potentially with scaling).
For that I’ve added:

  • An additional camera (Cam2) which renders to the render texture; this camera is the render camera of the Canvas which holds the relevant UI. The Canvas has a Graphic Raycaster.
  • An additional Canvas with some Rect Transform chain and finally a Raw Image component which displays the RenderTexture. That canvas targets the main camera (Cam1) and also has a Graphic Raycaster.

The rendering part works flawlessly.

Mouse input, however, practically does not work at all. Really strangely, I get hover events on some of the UI elements when I move the mouse up to the menu bar of the Unity editor (completely outside the game window).

How does one do this? How can I tell the Graphic Raycaster how to transform the mouse input correctly? It seems like it (understandably) does not know where the canvas will end up being displayed via the Render Texture.

Huge thanks in advance!

You will probably have to write a customized raycaster (one that matches the interface expected by Unity) which remaps the position to take into account the extra information you have. I recently did that for non-canvas objects displayed using a render texture, but the overall idea should be similar. Raycasters do quite a bit of stuff, so in my case, instead of writing a custom raycaster fully from scratch, it was easier to just override the Raycast method and temporarily modify the position of the mouse event before calling the base implementation. But that was for Physics2DRaycaster; I would have to read the source code of GraphicRaycaster again to check what's simpler there and whether it can be tricked into the desired remapping without modifying it. I can probably give you more details a bit later.

I have seen a few simpler solutions that people sometimes use for world objects and simple computer monitors displayed in a 3D world, based on manually calling the raycaster's Raycast or RaycastAll methods, but that usually handles basic clicks at most (a sketch of this approach follows below). Properly handling all the mouse interactions with different UGUI widgets is difficult unless you make the customized raycaster and let Unity do the rest of the work.
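To make that concrete, here is a minimal, untested sketch of the manual-click approach; uiRaycaster and the RemapPosition mapping are placeholders you would have to wire up for your own setup:

// Minimal sketch of the "manual raycast" approach (untested).
// uiRaycaster is the GraphicRaycaster on the canvas rendered into the texture;
// RemapPosition is a placeholder for your screen-to-texture mapping.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

public class ManualClickForwarder : MonoBehaviour
{
    public GraphicRaycaster uiRaycaster;

    void Update()
    {
        if (!Input.GetMouseButtonDown(0) || EventSystem.current == null)
            return;

        var pointer = new PointerEventData(EventSystem.current)
        {
            position = RemapPosition(Input.mousePosition) // your mapping goes here
        };
        var results = new List<RaycastResult>();
        uiRaycaster.Raycast(pointer, results);
        if (results.Count > 0) // forward a basic click to the topmost hit
            ExecuteEvents.ExecuteHierarchy(results[0].gameObject, pointer, ExecuteEvents.pointerClickHandler);
    }

    Vector2 RemapPosition(Vector2 screenPos) { /* your mapping */ return screenPos; }
}

As noted above, this only covers simple clicks; hover, drag and similar interactions won't work this way.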

Just in case, before pointing you into the wrong rabbit hole → the easy cases. If the size of the render texture matches your game window resolution and the Raw Image covers the whole window, then everything should just work. That doesn't mean the meaningful content needs to cover the whole screen, since you can leave the empty parts transparent. At the cost of wasting a little bit of memory and GPU resources processing the empty parts, you can avoid most of the complexity of mouse input remapping. That shouldn't be a problem on PC; I'm not sure about mobile platforms. This approach also doesn't work when the whole reason for the render texture is to render at a different resolution (for custom pixel-art scaling or something like that).

Interestingly, things seem to just work even if the render texture is smaller than the window, as long as the Raw Image component is positioned in the bottom left corner and scaled so that render texture pixels map 1:1 to actual screen pixels (a setup sketch follows below).
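For reference, a small untested sketch of that bottom-left 1:1 setup; it assumes a Screen Space canvas with a scale factor of 1, and the field names are mine:

// Pins the Raw Image to the bottom-left corner, sized 1:1 to the render texture.
// Assumes the containing canvas has a scale factor of 1.
using UnityEngine;
using UnityEngine.UI;

public class BottomLeftOneToOne : MonoBehaviour
{
    public RawImage rawImage;
    public RenderTexture renderTexture;

    void Start()
    {
        var rt = rawImage.rectTransform;
        rt.anchorMin = rt.anchorMax = rt.pivot = Vector2.zero; // bottom-left corner
        rt.anchoredPosition = Vector2.zero;
        rt.sizeDelta = new Vector2(renderTexture.width, renderTexture.height); // 1:1 pixel mapping
        rawImage.texture = renderTexture;
    }
}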

So now back to the more difficult case of actually doing the position remapping. The first thing that's good to know is how it all fits together. In the following text, by Raycaster I will usually mean a class inheriting from UnityEngine.EventSystems.BaseRaycaster, like the commonly used GraphicRaycaster and PhysicsRaycaster.

when a Raycaster is enabled, it registers itself in a list containing all active raycasters

a1) when the InputModule (StandaloneInputModule from the old input system, its equivalent from the new Input System, or Rewired) wants to do some stuff with the mouse event information, it prepares a pointer event and calls EventSystem.RaycastAll
b1) EventSystem.RaycastAll goes through the list of all active (registered) Raycasters and calls Raycaster.Raycast on each to create a list of all objects corresponding to the mouse position from the previous step
c1) each Raycaster decides how to map the position from the pointer event position, based on its type, the canvas and camera it is attached to, and stuff like that
c2) the Raycaster then does the actual lookup of objects at the corresponding position; in the case of PhysicsRaycaster it reuses the corresponding physics raycast API, but the canvas one more or less just iterates over all the canvas graphics
b2) once RaycastAll has collected all the results, it sorts them to decide what should be on top; see EventSystem.RaycastComparer for more details, in case you end up with one Canvas blocking mouse events for another Canvas in a way you don't want
a2) the InputModule then takes the first raycast result from the sorted list and sends pointer enter, exit, up, down, drag start and drag end events to the correct game objects
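If you want to see steps b1) and b2) in action, a small untested debug helper like the following (names are mine) dumps the sorted raycast results for the current mouse position:

// Dumps the sorted results of EventSystem.RaycastAll for the current mouse
// position, which is handy when debugging the ordering from step b2).
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

public class RaycastDebugDump : MonoBehaviour
{
    void Update()
    {
        if (!Input.GetKeyDown(KeyCode.F1) || EventSystem.current == null)
            return;

        var pointer = new PointerEventData(EventSystem.current)
        {
            position = Input.mousePosition
        };
        var results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(pointer, results); // steps b1) and b2) from above
        foreach (var result in results)
            Debug.Log($"{result.gameObject.name} (depth {result.depth}, sortingOrder {result.sortingOrder})");
    }
}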

The part that is relevant for you, if you want to remap the position, is c1). One option is to write your own class which inherits from BaseRaycaster or GraphicRaycaster, implemented more or less from scratch. Reminder that you have access to the UGUI source code and can use the existing GraphicRaycaster as a reference for most of the stuff it needs to do. On one hand there is a lot of stuff going on; on the other hand most of it is there to support all the possible ways Unity can be used, and in your specific situation you can probably simplify a lot of it by making assumptions based on your specific setup.

An alternative, somewhat hacky but simpler solution is to do something like this:

// Warning: don't copy this code as-is, it will not work without implementing the missing parts
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

public class RenderTextureRaycastRemapper : GraphicRaycaster
{
    // all the stuff you need for remapping the mouse position
    public RectTransform targetSurface;
    public Camera targetCamera;
    public Canvas canvas;

    public override void Raycast(PointerEventData eventData, List<RaycastResult> resultAppendList)
    {
        var pos = eventData.position;              // save the original mouse position
        eventData.position = RemapPosition(pos);   // this is the part you need to implement
        base.Raycast(eventData, resultAppendList); // let the normal GraphicRaycaster logic run with the remapped position
        eventData.position = pos;                  // restore the original position so the rest of the raycasters don't break
    }

    private Vector2 RemapPosition(Vector2 screenPosition)
    {
        // TODO: map the screen position over the Raw Image to the matching
        // position inside the render texture (see the sketch below)
        return screenPosition;
    }
}
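In case it helps, one possible (untested) way to fill in RemapPosition, assuming the Raw Image (targetSurface) displays the full render texture; rawImageCamera would be an extra field holding the camera of the canvas that contains the Raw Image (null for a Screen Space - Overlay canvas):

// One possible RemapPosition (untested). Assumes targetSurface shows the full
// render texture and targetCamera is the camera rendering into that texture.
// rawImageCamera is an extra field: the camera of the canvas containing the
// Raw Image, or null for a Screen Space - Overlay canvas.
private Vector2 RemapPosition(Vector2 screenPosition)
{
    RectTransformUtility.ScreenPointToLocalPointInRectangle(
        targetSurface, screenPosition, rawImageCamera, out Vector2 local);

    Rect rect = targetSurface.rect;
    // normalize to 0..1 across the Raw Image rect
    var normalized = new Vector2(
        (local.x - rect.xMin) / rect.width,
        (local.y - rect.yMin) / rect.height);

    // scale up to the pixel resolution of the render texture camera
    return new Vector2(
        normalized.x * targetCamera.pixelWidth,
        normalized.y * targetCamera.pixelHeight);
}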

With all that done, there are two more potential pitfalls, depending on the complexity of the stuff you want to use it for.

The first one is correct pointer blocking behavior when interacting with the rest of your Canvases. In the simplest case, the canvas displayed in the Raw Image is fully behind or fully on top of the rest of the UI elements in the Canvas that contains the Raw Image. Just don't forget to disable Raycast Target on the Raw Image if the content represented by it is supposed to be behind. Otherwise you will have to look carefully at the implementation of RaycastComparer to see which factors you can use to achieve the desired order. There is also always the option to throw away some results in your implementation of Raycast, by calling Raycast on the canvas containing the Raw Image to see if it would hit the Raw Image (a rough fragment follows below).
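As a rough, untested fragment of that last idea — containerRaycaster and rawImage would be extra fields pointing at the raycaster of the canvas holding the Raw Image and at the Raw Image itself — the start of the Raycast override could look like:

// Ask the outer canvas whether the pointer actually hits the Raw Image;
// if something else covers that spot, skip raycasting into the texture.
var blockCheck = new List<RaycastResult>();
containerRaycaster.Raycast(eventData, blockCheck);
bool overRawImage = blockCheck.Count > 0 && blockCheck[0].gameObject == rawImage.gameObject;
if (!overRawImage)
    return; // another element of the outer canvas is on top here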

The other potential problem is any UI element that uses the position of the mouse from the events it receives. In most cases, like PointerEnter/PointerExit, PointerDown and PointerUp, the UI widgets don't really care about the position; it's mostly the kind of event that matters. But for stuff like scrollbars and dragging the background of a Scroll Rect it might matter, and depending on the size and position of your render texture they might act slightly weird. For any of your custom UI elements which react to mouse events, you can always remap the coordinates manually yourself (see the small sketch below).
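For illustration, a tiny untested sketch of that manual remapping in a custom widget; RemapToTexture is a hypothetical helper doing the same mapping as the raycaster above:

// A custom widget remapping the event position itself before using it.
using UnityEngine;
using UnityEngine.EventSystems;

public class MyDraggable : MonoBehaviour, IDragHandler
{
    public void OnDrag(PointerEventData eventData)
    {
        // use the remapped position instead of eventData.position
        Vector2 positionInTexture = RemapToTexture(eventData.position);
        // ... move the widget based on positionInTexture ...
    }

    Vector2 RemapToTexture(Vector2 screenPos) { /* same mapping as the raycaster */ return screenPos; }
}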
