In my game, there are two screens, each represented by its own camera. The second camera renders the displays that read touch input and handle interactions. Both cameras look at the same world coordinates, but through a layer system each camera has its own exclusive rendering layers. I use the following to read touch input for objects on the second camera:
// Convert the touch/mouse position into world space through the second camera
Vector3 pos = cam2.ScreenToWorldPoint(Input.mousePosition);
// Zero-direction raycast: checks whether a 2D collider sits at that exact point
RaycastHit2D hit = Physics2D.Raycast(pos, Vector2.zero);
if (hit.collider != null) {
    // do stuff here
    Debug.Log(hit.transform.name);
}
This is where the problem occurs. When I touch the object on the second camera, I get no feedback. However, if I touch the same area on the first camera, the Debug.Log message appears. The raycast seems to be hitting cam1's objects even though I reference cam2. In addition, even if I disable the cam1 object entirely and touch its area in the Game view, I still get touch debug messages from it as if it were still there. So the real question is: why is the touch raycast not working on the second camera? The second camera is the only area where touch input will be read during gameplay, so getting this working matters.
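For reference, Physics2D.Raycast can take a layerMask parameter, so one thing I have been looking at is restricting the raycast to the layer the second camera renders. Below is a minimal sketch of that idea; the layer name "Screen2" and the field names are placeholders, not my actual setup, and I am not sure whether this is the right way to separate the two screens.

using UnityEngine;

public class Screen2TouchInput : MonoBehaviour
{
    public Camera cam2;

    // "Screen2" is a placeholder; substitute whatever layer the second
    // camera's interactive objects actually live on.
    private int screen2Mask;

    void Awake()
    {
        screen2Mask = LayerMask.GetMask("Screen2");
    }

    void Update()
    {
        if (!Input.GetMouseButtonDown(0)) return;

        // Convert the screen position into world space through cam2
        Vector3 pos = cam2.ScreenToWorldPoint(Input.mousePosition);

        // Zero-direction raycast at that point, limited to the Screen2 layer,
        // so colliders that only cam1 renders should never be returned
        RaycastHit2D hit = Physics2D.Raycast(pos, Vector2.zero, Mathf.Infinity, screen2Mask);

        if (hit.collider != null)
        {
            Debug.Log(hit.transform.name);
        }
    }
}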