Full disclosure: I am a complete Unity noob.
I think I can get the position of the mouse/pen/touch using a Position action, which returns a Vector2 that I can read from the context. And I think I can get a click/touch using a Left Button action, which returns a float (I presume it’s the “button axis” value). What I really want to do is respond to a mouse click/pen touch/screen touch by looking at the position, since I don’t really care about the “button value”.
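Here is roughly what I have wired up so far (simplified; the map and action names “Gameplay”, “Point”, and “Click” are just placeholders I’m using for this post):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Rough sketch of my current setup: a "Point" action (Value / Vector2, bound to
// <Pointer>/position) and a "Click" action (Button, bound to the left button /
// pen tip / touch press). The map and action names are just my placeholders.
public class PointerProbe : MonoBehaviour
{
    [SerializeField] private InputActionAsset actions;

    private InputActionMap map;
    private InputAction pointAction;
    private InputAction clickAction;

    private void OnEnable()
    {
        map = actions.FindActionMap("Gameplay");
        pointAction = map.FindAction("Point");
        clickAction = map.FindAction("Click");

        pointAction.performed += OnPoint;
        clickAction.performed += OnClick;

        map.Enable();
    }

    private void OnDisable()
    {
        pointAction.performed -= OnPoint;
        clickAction.performed -= OnClick;
        map.Disable();
    }

    private void OnPoint(InputAction.CallbackContext ctx)
    {
        Vector2 position = ctx.ReadValue<Vector2>();   // fires as the pointer moves
    }

    private void OnClick(InputAction.CallbackContext ctx)
    {
        float buttonValue = ctx.ReadValue<float>();    // the “button axis” value
        // ...but what I actually want here is the position at the moment of the press.
    }
}
```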
After much searching, I wondered whether a Composite binding would do what I want, so I tried one that combines a Left Button with a Position, but I can’t seem to get the types set correctly.
I tried setting the top-level Action Type to Pass Through, but that didn’t work. So I changed it to Vector2, but the code still complains.
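For reference, the reading side is essentially this (a simplified sketch; “ClickAt” is just my placeholder name for the composite action, hooked up through an InputActionReference):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Simplified sketch of the composite attempt; "ClickAt" is only a placeholder
// name for the composite action described above.
public class ClickAtProbe : MonoBehaviour
{
    [SerializeField] private InputActionReference clickAt;

    private void OnEnable()
    {
        clickAt.action.performed += OnClickAt;
        clickAt.action.Enable();
    }

    private void OnDisable()
    {
        clickAt.action.performed -= OnClickAt;
        clickAt.action.Disable();
    }

    private void OnClickAt(InputAction.CallbackContext ctx)
    {
        // This read of the combined value is where things seem to go wrong for me.
        Vector2 position = ctx.ReadValue<Vector2>();
        Debug.Log($"Clicked/touched at {position}");
    }
}
```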
Obviously, I’m doing something wrong, but I have no idea what. Any help would be appreciated.
I realize that I could just use the Button event and look at the mouse directly, but then I would have to check the mouse, the pen, and the touchscreen separately. I thought the whole point of the new InputSystem was to abstract all of those for me via events, so I’m trying to do things “The Right Way”, without really being sure what that means.
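For completeness, this is the kind of per-device polling I’m hoping to avoid (a sketch; I’d have to check every pointer device that might be present):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// The per-device fallback I'd rather not write: poll each pointer device
// that happens to exist and check its own "pressed" control every frame.
public class PollingFallback : MonoBehaviour
{
    private void Update()
    {
        if (Mouse.current != null && Mouse.current.leftButton.wasPressedThisFrame)
            HandlePress(Mouse.current.position.ReadValue());

        if (Pen.current != null && Pen.current.tip.wasPressedThisFrame)
            HandlePress(Pen.current.position.ReadValue());

        if (Touchscreen.current != null &&
            Touchscreen.current.primaryTouch.press.wasPressedThisFrame)
            HandlePress(Touchscreen.current.primaryTouch.position.ReadValue());
    }

    private void HandlePress(Vector2 position)
    {
        Debug.Log($"Pressed at {position}");
    }
}
```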