Render texture blocking raycast from camera

In my project, I have a render texture that is shown on a Raw Image under another camera, used to scale the game’s resolution. The problem is that raycasts from the main camera no longer work. I have tried ignoring the UI layer, but that does not seem to help.

My quick solution to this problem was as follows:

// Store RenderTexture temporarily.
var tempTex = camera.targetTexture;

// Stop camera from rendering to a texture.
camera.targetTexture = null;

// Do the raycast here.

// Put texture back on camera.
camera.targetTexture = tempTex;

For anyone who runs into this problem, the fix was to move the object displaying the render texture onto its own layer and pass the raycast a layer mask that excludes that layer.
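
A rough sketch of that layer-mask fix (the layer name "RenderOutput" and the mainCamera reference are just example names, not from the post above):

// Everything except the layer the render texture object sits on.
int layerMask = ~LayerMask.GetMask("RenderOutput");

Ray ray = mainCamera.ScreenPointToRay(Input.mousePosition);
if (Physics.Raycast(ray, out RaycastHit hit, Mathf.Infinity, layerMask))
{
    Debug.Log("Hit " + hit.collider.name);
}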

Hi! I am a super noob at Unity/coding in general, but I have found another solution. For me, it looks like the problem was that the Raw Image didn’t allow the rays to be cast from the camera (maybe a bug? I don’t know).

  1. I created a duplicate of the main camera and set it as a child of the main one. Then I set the target texture ONLY on the main camera.

  2. I then edited the raycast script so that the child camera is the one that casts the rays.

Everything now works perfectly!
I know this post is a bit old, but I imagine a lot of people (including me until a few minutes ago) are still looking for an answer.
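
A rough sketch of that two-camera setup (the class and field names are placeholders, not from the post above; the parent camera has the target texture assigned, the child duplicate does not):

using UnityEngine;

public class ChildCameraRaycaster : MonoBehaviour
{
    // Child duplicate of the main camera. It has no target texture,
    // so its view matches the actual screen and Input.mousePosition maps correctly.
    public Camera raycastCamera;

    void Update()
    {
        Ray ray = raycastCamera.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            Debug.Log("Hit " + hit.collider.name);
        }
    }
}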

An alternative quick solution:
(It seems to be a coordinate problem: when the camera has a target RenderTexture assigned, ScreenPointToRay expects coordinates in the texture’s pixel space, so the mouse position has to be rescaled from screen size to render texture size.)

var renderTexture = Game.renderTexture;

// Remap the mouse position from screen space to the render texture's pixel space:
// divide by the screen size and multiply by the render texture size.
Vector2 mousePos = Input.mousePosition;
mousePos = mousePos / new Vector2(Screen.width, Screen.height) * new Vector2(renderTexture.width, renderTexture.height);

var ray = Game.mainCamera.ScreenPointToRay(mousePos);
RaycastHit hit;
bool didHit = Physics.Raycast(ray, out hit, float.PositiveInfinity);