Raycast to plane of rendertexture (solved)

Hi.

I have a tricky problem:
I have a main plane with a rendertexture on it. The rendertexture shows a plane with a button which is far away from the main plane.
Now if I do a raycast from the render texture camera with the mouse screen coordinates, my button script won’t react.

Code for the raycast:

Physics.Raycast(m_RenderTextureCam.ScreenPointToRay(Input.mousePosition), out hit, 100);
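
For reference, a fuller version of that attempt might look like the sketch below (m_RenderTextureCam is assumed to be the camera that renders into the RenderTexture; the 100-unit range is taken from the line above):

// Sketch of the attempt above: the mouse position is in the main display's screen space,
// but the ray is built from the render texture camera, so it usually misses the button.
RaycastHit hit;
Ray ray = m_RenderTextureCam.ScreenPointToRay(Input.mousePosition);
if (Physics.Raycast(ray, out hit, 100f))
{
    Debug.Log("Hit: " + hit.collider.name);
}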

Anyone a hint? Thx for help.

What does your button code look like?
What does the routine look like that calculates the virtual coordinates based on the RT?
How far away is the button? If the point of the raycast is to hit it physically, make sure you cast the ray far enough to actually reach it.

The button script works if I put it on the main plane and make the raycast from the main camera, so the screen coords and the script are correct.

Well… I don’t calculate them. I pass them over to the camera of the render texture, as shown in the code.

Both the main cam and the render texture cam are at the same height, and the button is at the same distance from its camera in both setups.

I have attached a small picture that explains my problem better.

[Attached image: problem2_145.jpg]

Hm… seems to be a more complex problem :slight_smile:

The main problem seems to be the calculation of the mouse position:

The main camera is at world position (0, 0, 0).
The render camera is at world position (20, 0, 0).

Now how can I apply this difference to the mouse position? How many pixels correspond to 1.0 in world units?

Does anyone have a hint? :slight_smile:

Does the button have a collider of some type associated with it, so that the ray has something to hit?

Well yes. As I mentioned before, the button works fine when I use it on the main plane. It is the same button (duplicated). :slight_smile:

edit:
I think the problem is the calculation of the mouse input. If anyone knows how to convert the distance between the two camera positions into pixel coordinates, I would be very thankful :smile:

How are you drawing your button: Unity GUI or your own code?

Is the MainPlane a flat piece of geometry with a MainPlaneTexture associated with it, and are you copying the RenderTexture into the MainPlaneTexture?

- OR -

Is the RenderTexture on a separate piece of geometry that floats above the MainPlane?

- OR -

Is the RenderTexture being used to texture the whole MainPlane?

It seems you want to render your UI to a separate texture so you can manipulate it in cool ways, but you still want the user to be able to interact with the UI as if it were being drawn directly to the display. Is that correct?

I apologize if the questions overlap or you feel you’ve answered them already; I’m just trying to get a better idea of the problem before I attempt a solution. Because I’m looking at it and thinking, “Well, depending on what he wants to do, it’s either a simple solution or a kinda’ tricky one” … :wink:

Unity GUI. I used the editor to drag the buttons into the scene. Both buttons work well if they are on the main plane. Of course the button moved to the render texture doesn’t work anymore, as it doesn’t get the raycast :frowning:

The main plane has its own background texture. Above the main plane there is another plane with the render texture on it. The main plane should stay unchanged, as the real game will be shown on the render texture.

Exactly! :smile: The user can click what he can see, but manipulates the scene which is far away.

No problem, I meant no offense (sorry if it sounded that way)! The problem seems to be a tricky one; otherwise you would have spared me a hard workaround :slight_smile:

Ok, I solved the problem myself :slight_smile:

For those who run into the same problem:

The trick was not to change the mouse coords; instead, a second ray was created whose origin was translated by the delta distance.
With this second ray I did a second raycast and checked whether it hit anything (this time in the scene shown on the render texture).
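
A minimal sketch of how that might look in code, assuming the "delta distance" is the offset between the main camera's and the render texture camera's positions (as in the earlier posts) and that both cameras share the same orientation; m_MainCam and m_RenderTextureCam are placeholder names, not the original code:

// Sketch of the described workaround: build the mouse ray from the main camera,
// then shift its origin by the offset (delta) between the two cameras and cast again.
Ray mainRay = m_MainCam.ScreenPointToRay(Input.mousePosition);
Vector3 delta = m_RenderTextureCam.transform.position - m_MainCam.transform.position;
Ray secondRay = new Ray(mainRay.origin + delta, mainRay.direction);

RaycastHit hit;
if (Physics.Raycast(secondRay, out hit, 100f))
{
    // This hit belongs to the far-away scene that the render texture shows.
    Debug.Log("Second raycast hit: " + hit.collider.name);
}

Note that this only lines up when both cameras use the same rotation and projection settings; otherwise the textureCoord-based approach further down may be the safer route.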

Congrats! :slight_smile:

I’m facing a similar problem. You say you translate the second ray by the delta distance. What is the delta distance, from what to what?

@ianjosephfischer If the first raycast hits the screen object, then you get hit.textureCoord and pass it as a parameter to the render texture camera’s ViewportPointToRay method:

Ray ray = RenderTextureCamera.ViewportPointToRay(hit.textureCoord);
Physics.Raycast(ray, out hit, Mathf.Infinity);

hit.textureCoord only returns a proper value when it hits a MeshCollider. In my test scene I have the render texture applied to a plane that has a MeshCollider with a quad mesh.
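
Putting those two pieces together, a self-contained sketch of the UV-based approach might look like this (RenderTextureCamera and the class name are placeholders, not code from this thread; the plane displaying the render texture needs a MeshCollider so textureCoord is valid):

using UnityEngine;

// Two-step raycast sketch: first hit the plane that displays the render texture,
// then reuse the UV coordinate of that hit as a viewport point on the render texture camera.
public class RenderTextureClick : MonoBehaviour
{
    public Camera RenderTextureCamera; // the camera that renders into the RenderTexture

    void Update()
    {
        if (!Input.GetMouseButtonDown(0))
            return;

        RaycastHit hit;
        // Step 1: raycast from the main camera against the plane showing the render texture.
        // textureCoord is only filled in when the collider that was hit is a MeshCollider.
        if (Physics.Raycast(Camera.main.ScreenPointToRay(Input.mousePosition), out hit))
        {
            // Step 2: interpret the UV (0..1) as a viewport point of the render texture camera
            // and raycast into the far-away scene that it renders.
            Ray ray = RenderTextureCamera.ViewportPointToRay(hit.textureCoord);
            RaycastHit secondHit;
            if (Physics.Raycast(ray, out secondHit, Mathf.Infinity))
            {
                Debug.Log("Hit in render texture scene: " + secondHit.collider.name);
            }
        }
    }
}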