Just in case, before pointing you down the wrong rabbit hole, the easy case first: if the render texture size matches your game window resolution and the Raw Image covers the whole window, everything should just work. That doesn't mean the meaningful content has to cover the whole screen, since you can leave the empty parts transparent. At the cost of wasting a bit of memory and GPU time processing those empty parts, you avoid most of the complexity around remapping mouse input. That shouldn't be a problem on PC; I'm not sure about mobile platforms. This approach also doesn't help when the whole reason for the render texture is to render at a different resolution (for custom pixel-art scaling or something like that).
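For reference, the easy case setup could look something like this (uiCamera and rawImage here are just placeholders for your own references, nothing Unity-specific):

using UnityEngine;
using UnityEngine.UI;

public class FullscreenRenderTextureSetup : MonoBehaviour
{
    public Camera uiCamera;   // camera rendering the offscreen content
    public RawImage rawImage; // RawImage stretched over the whole window

    void Start()
    {
        // Render texture matches the window resolution, so pixels map 1:1.
        var rt = new RenderTexture(Screen.width, Screen.height, 24);
        uiCamera.targetTexture = rt;
        rawImage.texture = rt;

        // Stretch the RawImage over the entire parent canvas.
        var rect = rawImage.rectTransform;
        rect.anchorMin = Vector2.zero;
        rect.anchorMax = Vector2.one;
        rect.offsetMin = Vector2.zero;
        rect.offsetMax = Vector2.zero;
    }
}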
Interestingly, things also seem to just work when the render texture is smaller than the window, as long as the Raw Image is positioned in the bottom-left corner and scaled so that render texture pixels map 1:1 to actual screen pixels.
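If you want to rely on that, here is a sketch of pinning the RawImage to the bottom-left corner at native size (it assumes the containing Canvas has a scale factor of 1, otherwise the 1:1 mapping breaks):

using UnityEngine;
using UnityEngine.UI;

public class PinRawImageBottomLeft : MonoBehaviour
{
    public RawImage rawImage; // RawImage whose texture is the render texture

    void Start()
    {
        var rt = (RenderTexture)rawImage.texture;
        var rect = rawImage.rectTransform;

        // Anchor and pivot at the bottom-left corner of the parent.
        rect.anchorMin = Vector2.zero;
        rect.anchorMax = Vector2.zero;
        rect.pivot = Vector2.zero;
        rect.anchoredPosition = Vector2.zero;

        // Size the RawImage so one render texture pixel maps to one screen pixel
        // (only holds while the canvas scale factor is 1).
        rect.sizeDelta = new Vector2(rt.width, rt.height);
    }
}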
So now back to the harder case of actually doing the position remapping. The first thing that's good to know is how it all fits together (the debug sketch after the list below shows how to poke at this flow yourself). In the following text, by Raycaster I usually mean a class inheriting from UnityEngine.EventSystems.BaseRaycaster, like the commonly used GraphicRaycaster and PhysicsRaycaster.
when a Raycaster is enabled it registers itself in a list containing all active raycasters
a1) when the InputModule (StandaloneInputModule from the old input system, the new input system's module, or Rewired's) wants to process mouse event information, it prepares a pointer event and calls EventSystem.RaycastAll
b1) EventSystem.RaycastAll goes through the list of all active (registered) Raycasters and calls Raycaster.Raycast on each to build a list of all objects corresponding to the mouse position from the previous step
c1) each Raycaster decides how to interpret the position from the pointer event based on its type, the canvas and camera it is attached to, and so on
c2) the Raycaster then does the actual lookup of objects at the corresponding position; PhysicsRaycaster reuses the corresponding physics raycast API, while the canvas one more or less just iterates over all the graphics on the canvas
b2) once RaycastAll has collected all the results, it sorts them to decide what should be on top. See EventSystem.RaycastComparer for more details in case you have one Canvas blocking mouse events for another Canvas in a way you don't want
a2) the InputModule then takes the first raycast result from the sorted list and sends pointer enter, exit, up, down, drag start and drag end events to the correct game objects
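If you want to see steps a1) to b2) for yourself, here is a small throwaway debug component that calls RaycastAll manually and logs what ends up on top (it uses the old Input class for the mouse position, adjust if you are on the new input system):

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

public class RaycastAllDebugger : MonoBehaviour
{
    void Update()
    {
        if (EventSystem.current == null)
            return;

        // Same thing the InputModule does in a1): build a pointer event
        // for the current mouse position.
        var eventData = new PointerEventData(EventSystem.current)
        {
            position = Input.mousePosition
        };

        // b1) + b2): collect and sort results from all active raycasters.
        var results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(eventData, results);

        if (results.Count > 0)
            Debug.Log("Topmost object under mouse: " + results[0].gameObject.name);
    }
}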
The part that is relevant if you want to remap the position is c1). One option is to write your own class which inherits from BaseRaycaster or GraphicRaycaster more or less from scratch. Remember that you have access to the UGUI source code and can use the existing GraphicRaycaster as a reference for most of what it needs to do. On one hand there is a lot of stuff going on in there; on the other hand most of it exists to support all the possible ways Unity can be used, and in your specific situation you can probably simplify a lot of it by making assumptions based on your specific setup.
An alternative, somewhat hacky but simpler, solution is to do something like this:
// Warning: don't copy this code blindly, it will not work until RemapPosition is implemented (a possible sketch follows below)
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

public class RenderTextureRaycastRemapper : GraphicRaycaster
{
    // all the stuff you need for remapping the mouse position
    public RectTransform targetSurface; // RectTransform of the RawImage showing the render texture
    public Camera targetCamera;         // camera rendering into the render texture
    public Canvas canvas;               // Canvas containing the RawImage

    public override void Raycast(PointerEventData eventData, List<RaycastResult> resultAppendList)
    {
        var pos = eventData.position;              // save the old mouse position
        eventData.position = RemapPosition(pos);   // this is the part you need to implement
        base.Raycast(eventData, resultAppendList); // let the normal GraphicRaycaster run with the remapped position
        eventData.position = pos;                  // restore the old position so the rest of the raycasters don't break
    }
}
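As for the missing RemapPosition, here is a possible sketch. It assumes targetSurface is the RectTransform of the RawImage, canvas is the Canvas containing that RawImage, and targetCamera is the camera rendering into the render texture; if your setup differs, the math needs adjusting:

public Vector2 RemapPosition(Vector2 screenPosition)
{
    // Camera used to interpret screen coordinates for the outer canvas;
    // null for Screen Space - Overlay.
    var outerCamera = canvas.renderMode == RenderMode.ScreenSpaceOverlay
        ? null
        : canvas.worldCamera;

    // Screen position -> local position inside the RawImage rect.
    RectTransformUtility.ScreenPointToLocalPointInRectangle(
        targetSurface, screenPosition, outerCamera, out var local);

    // Local position -> normalized 0..1 coordinates within that rect.
    var rect = targetSurface.rect;
    var normalized = new Vector2(
        (local.x - rect.xMin) / rect.width,
        (local.y - rect.yMin) / rect.height);

    // Normalized coordinates -> pixel coordinates of the render texture camera,
    // which is the space the raycasters behind the texture work in.
    return new Vector2(
        normalized.x * targetCamera.pixelWidth,
        normalized.y * targetCamera.pixelHeight);
}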
With all that done, there are two more potential pitfalls depending on the complexity of what you want to use it for.
The first one is correct pointer blocking behavior when interacting with the rest of your Canvases. In the simplest case, the canvas displayed in the Raw Image is either fully behind or fully on top of the rest of the UI elements in the Canvas that contains the RawImage. Just don't forget to disable Raycast Target on the Raw Image if the content it represents is supposed to be behind. Otherwise you will have to look carefully at the implementation of RaycastComparer to see which factors you can use to achieve the desired order. There is also always the option of throwing away some results in your implementation of Raycast, for example by calling Raycast on the canvas containing the Raw Image to see whether it would hit the RawImage.
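As a cheap version of that pre-check (not a full replacement for looking at what the outer canvas would actually hit first), you can bail out early when the pointer isn't over the RawImage at all. A helper to add to the class above, reusing its fields:

// Returns true when the pointer is over the RawImage, i.e. events should be
// forwarded to the UI behind the render texture at all.
private bool PointerIsOverSurface(PointerEventData eventData)
{
    var outerCamera = canvas.renderMode == RenderMode.ScreenSpaceOverlay
        ? null
        : canvas.worldCamera;

    return RectTransformUtility.RectangleContainsScreenPoint(
        targetSurface, eventData.position, outerCamera);
}

Call it at the top of Raycast and return immediately when it is false.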
The other potential problem is any UI element that uses the mouse position from the events it receives. In most cases, like PointerEnter/PointerExit, PointerDown and PointerUp, UI widgets don't really care about the position; it's mostly the kind of event that matters. But for things like a scrollbar or dragging the background of a scroll rect it can matter, and depending on the size and position of your render texture they might act slightly weird. For any of your custom UI elements that react to mouse events, you can always remap the coordinates manually yourself.
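For example, a custom widget living inside the render texture UI could remap the position itself before using it (this assumes RemapPosition is exposed publicly on the remapper, like in the sketch above):

using UnityEngine;
using UnityEngine.EventSystems;

public class RenderTextureAwareDragHandler : MonoBehaviour, IDragHandler
{
    public RenderTextureRaycastRemapper remapper;

    public void OnDrag(PointerEventData eventData)
    {
        // eventData.position is still in window space here; convert it into
        // the render texture's space before doing anything with it.
        Vector2 remapped = remapper.RemapPosition(eventData.position);
        Debug.Log("Dragging at render texture position " + remapped);
    }
}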