Raycast hit detection goes wrong when using ScreenPointToRay coordinates

Hello guys, after two days of searching I have finally decided to give up on my attempts to make this work and ask for help from those who most certainly know more than me :stuck_out_tongue: What I am trying to achieve is simple at its core, but it is still driving me mad: my final goal is to change some object attributes by clicking on them in the Scene view. So far I have only managed to make it work on some clicks, and most of the time it fires even when I do not click in the right position.

So far I have a simple plane mesh (imported from Blender) with a script attached that turns it blue when I click it. The plane is scaled on both the x and y axes and has a box collider that fits the mesh (checked in both ortho and perspective mode):

using UnityEngine;
using System.Collections;
using UnityEditor;

[ExecuteInEditMode]
public class TileHandler : MonoBehaviour {
	public void colorChange () {
		GetComponent<Renderer>().sharedMaterial.color = Color.blue;
		Debug.Log("Color changed");
	}
}

In the Editor folder I have another script that checks whether the mouse click hits the target and, if it does, calls the proper method:

using UnityEngine;
using UnityEditor;

[InitializeOnLoad]
[CustomEditor(typeof(TileHandler))]

public class MouseHelper : Editor
{
    //static TileHandler tile;
    //public override void OnInspectorGUI() {
    //    tile = (TileHandler) target;
    //}

    static MouseHelper()
    {
        SceneView.onSceneGUIDelegate += UpdateView;
    }

    private static void UpdateView(SceneView sceneView)
    {
        if (Event.current.type == EventType.mouseDown){
            RaycastHit hit;
            Ray ray = Camera.current.ScreenPointToRay(Event.current.mousePosition);
            if (Physics.Raycast(ray, out hit, 20000)){
                TileHandler g = (TileHandler) hit.transform.gameObject.GetComponent<TileHandler>();
                Undo.RecordObject(g.GetComponent<Renderer>().sharedMaterial, "Undo Color Change");
                g.colorChange();
                EditorUtility.SetDirty(g);
            }
        }
    }
}

I know the code is not optimized, but I was just trying to make it work. Can you spot what I am doing wrong? The scene is a 3D one.

You are mixing GUI coordinates with screen coordinates. Screen coordinates start at the bottom left, while GUI coordinates start at the top left. So don’t feed “Event” data, which belongs to GUI space, into ScreenPointToRay.

ScreenPointToRay should be used at runtime with Input.mousePosition, which is in screen space and does start at the bottom left.
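
For comparison, a runtime version would look something like this (just a sketch; “RuntimeClickExample” is a made-up name, and it assumes a camera tagged MainCamera and a collider on the clicked object):

using UnityEngine;

// Runtime (play mode) click detection, not editor code
public class RuntimeClickExample : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // Input.mousePosition is in screen space (origin at the bottom left),
            // which is exactly what ScreenPointToRay expects
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
            {
                Debug.Log("Hit " + hit.transform.name);
            }
        }
    }
}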

In the editor, however, you should simply use HandleUtility.GUIPointToWorldRay, which does pretty much the same as ScreenPointToRay but works with GUI coordinates.
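
In your case the UpdateView handler could look roughly like this (only the ray construction changes, the rest of your MouseHelper class stays the same):

private static void UpdateView(SceneView sceneView)
{
    if (Event.current.type == EventType.mouseDown)
    {
        // GUIPointToWorldRay takes GUI coordinates, so Event.current.mousePosition goes in directly
        Ray ray = HandleUtility.GUIPointToWorldRay(Event.current.mousePosition);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, 20000))
        {
            // ... same TileHandler / Undo / SetDirty handling as in your script ...
        }
    }
}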

You might also want to have a look at HandleUtility.PickGameObject, which is able to pick objects that don’t have a collider attached. This is what the editor uses for its native object selection.
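
For example, something along these lines (sketch only, inside the same mouse-down handling as above):

if (Event.current.type == EventType.mouseDown)
{
    int materialIndex;
    // Picks whatever the editor itself would select under the mouse, collider or not;
    // returns null if there is nothing under the cursor
    GameObject picked = HandleUtility.PickGameObject(Event.current.mousePosition, out materialIndex);
    if (picked != null)
    {
        Debug.Log("Picked " + picked.name);
    }
}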

Apart from that, you really should consider what @Eudaimonium said in the comment above. If you click on an object that doesn’t have your “TileHandler” script attached, your code will throw a NullReferenceException.
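
A simple guard like this (sketch) avoids it:

TileHandler g = hit.transform.gameObject.GetComponent<TileHandler>();
if (g == null)
{
    // Clicked object has no TileHandler attached, nothing to do
    return;
}
Undo.RecordObject(g.GetComponent<Renderer>().sharedMaterial, "Undo Color Change");
g.colorChange();
EditorUtility.SetDirty(g);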

Thanks, this is exactly what I needed. I was pretty sure I was messing up the API, but I couldn’t find the right function by myself. Right now I only have a grid made of these objects that are supposed to change color on click in the editor; now that this is fixed I’ll continue with the procedural level generation. I will definitely follow both your and @Eudaimonium’s suggestions, thanks a lot guys!