I have a tricky problem:
I have a main plane with a render texture on it. The render texture shows a plane with a button that is far away from the main plane.
Now if I do a raycast from the render texture camera with the mouse screen coordinates, my button script won’t react.
Code for the raycast:
RaycastHit hit;
Physics.Raycast(m_RenderTextureCam.ScreenPointToRay(Input.mousePosition), out hit, 100);
What does your button code look like?
What does the routine look like that calculates the virtual coordinates based on the RT?
How far away is the button? If the raycast’s purpose is to hit it physically, make sure you cast the ray far enough to even be able to reach it.
Well yes. As I mentioned before, the button works fine when I use it on the main plane. It is the same button (duplicated).
edit:
I think the problem is the calculation of the mouse input. If anyone knows how to convert the distance between the two vectors into pixel coordinates, I would be very thankful.
How are you drawing your button: Unity GUI or your own code?
Is the MainPlane a flat piece of geometry with a MainPlaneTexture associated with it, and are you copying the RenderTexture into the MainPlaneTexture?
OR -
Is the RenderTexture on a separate piece of geometry that floats above the MainPlane?
OR -
Is the RenderTexture being used to texture the whole MainPlane?
It seems you want to render your UI to a separate texture so you can manipulate it in cool ways, but you still want the user to be able to interact with the UI as if it were being drawn directly to the display. Is that correct?
I apologize if the questions overlap or you feel you’ve already answered them; I’m just trying to get a better idea of the problem before I attempt a solution. Because I’m looking at it and thinking, “Well, depending on what he wants to do, it’s either a simple solution or a kinda tricky one” …
Unity GUI. I used the editor to drag the buttons into the scene. Both buttons work well if they are on the main plane. Of course, the button moved to the render texture won’t work anymore, as it doesn’t receive the raycast.
The main plane has its own background texture. Above the main plane, there is another plane with the render texture on it. The main plane should stay unchanged, as the real game will be shown on the render texture.
Exactly! The user clicks what he can see, but manipulates the scene that is far away.
No problem, I meant no offense (sorry if it sounded that way)! The problem seems to be a tricky one; if not, you’d be sparing me a hard workaround.
The trick was not to change the mouse coordinates; instead, a second ray was created whose origin vector was translated by the delta distance between the two planes.
With the second ray, I performed a second raycast and checked whether it hit something (this time on the render texture plane).
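In code, the idea looks roughly like this. This is only a minimal sketch: m_MainCam, m_DeltaOffset, and the SendMessage call are placeholders for your own setup, not the exact project code.

using UnityEngine;

public class OffsetRayClick : MonoBehaviour
{
    public Camera m_MainCam;        // camera the user actually looks through
    public Vector3 m_DeltaOffset;   // offset between the visible plane and the far-away plane

    void Update()
    {
        if (!Input.GetMouseButtonDown(0))
            return;

        // First ray: cast from the main camera with the unchanged mouse coordinates.
        Ray firstRay = m_MainCam.ScreenPointToRay(Input.mousePosition);

        // Second ray: same direction, but the origin is translated by the delta distance.
        Ray secondRay = new Ray(firstRay.origin + m_DeltaOffset, firstRay.direction);

        RaycastHit hit;
        if (Physics.Raycast(secondRay, out hit, 100))
        {
            // hit.collider is the duplicated button on the far-away plane; forward the
            // click to its script (this assumes the button reacts to OnMouseDown).
            hit.collider.SendMessage("OnMouseDown", SendMessageOptions.DontRequireReceiver);
        }
    }
}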
@ianjosephfischer If the first raycast hits the screen object, you take the hit.textureCoord and pass it as a parameter to the render texture camera’s ViewportPointToRay method:
Ray ray = RenderTextureCamera.ViewportPointToRay(hit.textureCoord);
Physics.Raycast(ray, out hit, Mathf.Infinity);
hit.textureCoord only returns a proper value when the ray hits a MeshCollider. In my test scene I have the RenderTexture applied to a plane with a MeshCollider using a quad mesh.
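Put together, a minimal version of that two-step cast could look like this (camera names are placeholders, and the SendMessage call assumes the button script implements OnMouseDown):

using UnityEngine;

public class RenderTextureRaycaster : MonoBehaviour
{
    public Camera m_MainCam;            // camera the user actually looks through
    public Camera m_RenderTextureCam;   // camera that renders into the RenderTexture

    void Update()
    {
        if (!Input.GetMouseButtonDown(0))
            return;

        RaycastHit hit;

        // Step 1: cast against the plane that shows the RenderTexture. The plane
        // needs a MeshCollider, otherwise hit.textureCoord stays (0, 0).
        if (Physics.Raycast(m_MainCam.ScreenPointToRay(Input.mousePosition), out hit, 100))
        {
            // Step 2: the UV coordinate on the plane maps directly to the viewport
            // of the render texture camera, so reuse it for the second ray.
            Ray ray = m_RenderTextureCam.ViewportPointToRay(hit.textureCoord);

            if (Physics.Raycast(ray, out hit, Mathf.Infinity))
            {
                // hit.collider is now the object “behind” the texture, e.g. the button.
                hit.collider.SendMessage("OnMouseDown", SendMessageOptions.DontRequireReceiver);
            }
        }
    }
}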