Using the Unity Event System for VR

Hello!

I really like how the GoogleVR SDK easily utilizes events like PointerEnter, PointerExit, and PointerClick, and was hoping to extend that functionality to a desktop VR experience that I had already built. Unfortunately, it looks like PointerClick doesn't follow the typical "Fire1" input mappings beyond a mouse click, so I can't get PointerClick to respond to a button press on the VR game controller.

I’m trying to keep my scripts as simple as possible since I plan on using this project to teach others. Does anyone know the easiest way to either:

  1. Extend the functionality of PointerClick to accept additional inputs (like, say, the B button on an Oculus Touch controller)
    or
  2. Create a new event that can be handled by the Event System (like, say, VRPointerClick, which could then map to everything Fire1 does as well as GVRController.ClickButton).

Thank you!

I think what you need to do is extend the Input Module of the EventSystem, because that's the part that does the input mapping. Check the Unity docs for more.

Once you find what input the VR controller responds to, you can manually inject a PointerClick event via script. I show that in this tutorial at about 13:30.
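
For anyone following along, a rough sketch of that injection idea (the class name and the "VRClick" input axis are placeholders, not something from the tutorial):

using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: when a chosen button is pressed, raycast forward from the
// camera and fire a PointerClick on whatever the ray hits.
public class ManualPointerClick : MonoBehaviour
{
    public Camera vrCamera;

    void Update()
    {
        if (!Input.GetButtonDown("VRClick")) return;

        Ray ray = new Ray(vrCamera.transform.position, vrCamera.transform.forward);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // ExecuteHierarchy bubbles up to the first parent that handles the event.
            ExecuteEvents.ExecuteHierarchy(hit.collider.gameObject,
                new PointerEventData(EventSystem.current),
                ExecuteEvents.pointerClickHandler);
        }
    }
}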

Thanks @Selzier! Love your YouTube channel btw. That line of code with the Execute command definitely helps me a lot. However, I'm trying to have something that works across mobile and desktop within the same scene, so as far as I know, using the GVR SDK isn't an option, since a lot of that functionality depends on having your platform set to Android.

If I don't use the GVR SDK, can I achieve similar results with the standard Unity Event System and a Physics Raycaster component to read PointerEnter and PointerExit? I just tested and it seems to almost work… but the raycast seems very inconsistent, and definitely not centered on my view. It also seems like Unity by default uses the location of the mouse in the Game window to drive PointerEnter and PointerExit. Am I missing something?

Thanks!

You would need to write your own custom Input Module (a gaze input module) to replace the Standalone Input Module from Unity. That's what I did with Mobile VR Interaction Pack, so I don't have to use the GoogleVR SDK anymore.

That doesn't require Android or anything; it should work with any Unity project. I don't have a Vive or Rift so I can't test with those devices, but it should function the same: casting a ray out from the camera's position for PointerEnter/Exit events, etc.
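
Stripped down, the core of such a module looks something like this (a simplified sketch, not the actual pack code; it uses the plain screen center, see the note below about VR mode):

using UnityEngine;
using UnityEngine.EventSystems;

// Sketch of a gaze input module. Replace StandaloneInputModule on the
// EventSystem with this; it tracks the object under the gaze ray and
// sends PointerEnter/PointerExit as the target changes.
public class GazeInputModule : BaseInputModule
{
    private GameObject currentTarget;
    private PointerEventData pointerData;

    public override void Process()
    {
        if (pointerData == null)
            pointerData = new PointerEventData(eventSystem);

        // Cast from the middle of the view, ignoring the mouse position.
        pointerData.position = new Vector2(Screen.width * 0.5f, Screen.height * 0.5f);
        eventSystem.RaycastAll(pointerData, m_RaycastResultCache);
        RaycastResult result = FindFirstRaycast(m_RaycastResultCache);
        pointerData.pointerCurrentRaycast = result;
        m_RaycastResultCache.Clear();

        GameObject newTarget = result.gameObject;
        if (newTarget != currentTarget)
        {
            // HandlePointerExitAndEnter sends the exit/enter pair for us.
            HandlePointerExitAndEnter(pointerData, newTarget);
            currentTarget = newTarget;
        }
    }
}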

Different Unity versions put the "center of screen" in different places when VR mode is enabled in the build settings, so you may need to get the screen-center position via the screen width and height, or via the VR eye texture width and height; it just depends on the Unity version and whether VR mode is enabled.
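
Something along these lines, assuming the newer XRSettings API (older Unity versions expose the same values under UnityEngine.VR.VRSettings):

using UnityEngine;
using UnityEngine.XR;

// Sketch: pick the pointer "center" depending on whether VR is active.
public static class PointerCenter
{
    public static Vector2 Get()
    {
        if (XRSettings.enabled)
            return new Vector2(XRSettings.eyeTextureWidth * 0.5f,
                               XRSettings.eyeTextureHeight * 0.5f);
        return new Vector2(Screen.width * 0.5f, Screen.height * 0.5f);
    }
}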

Gotcha. Thanks so much for the help!

As far as the mouse goes, if you still have the Standalone Input Module on your event system, the mouse will still drive your pointer events from the main camera. Writing your own input module and removing the standard one would prevent this from happening. You'll of course need to raycast from either the controller or your head instead, and call ExecuteEvents for the pointer events yourself.
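
As a rough illustration of that last part (names are placeholders; pointerOrigin is whatever transform you aim with, controller or head):

using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: once StandaloneInputModule is gone, raycast from a controller
// (or head) transform and send the enter/exit events yourself.
public class ControllerPointer : MonoBehaviour
{
    public Transform pointerOrigin;
    private GameObject lastHit;

    void Update()
    {
        GameObject newHit = null;
        if (Physics.Raycast(pointerOrigin.position, pointerOrigin.forward, out RaycastHit hit))
            newHit = hit.collider.gameObject;

        if (newHit != lastHit)
        {
            var data = new PointerEventData(EventSystem.current);
            if (lastHit != null)
                ExecuteEvents.Execute(lastHit, data, ExecuteEvents.pointerExitHandler);
            if (newHit != null)
                ExecuteEvents.Execute(newHit, data, ExecuteEvents.pointerEnterHandler);
            lastHit = newHit;
        }
    }
}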

Ahhh that helps to clarify a few things. Thank you @greggtwep16!

Have you taken a look at the raycast/interaction scripts included in the Unity VR Sample pack to see how they handled this?

Thanks. Just what I was looking for. Now I need to figure out how to get the click action to work with several different UI and 3D object raycast systems.

In ExecuteEvents.Execute I was able to use

ExecuteEvents.Execute(gameObject, new OVRPointerEvents(EventSystem.current), ExecuteEvents.pointerClickHandler);

which I assume replaces the Unity pointer with the HMD's pointer. If I want to simply click the button or raycast to the 3D object I am looking at by pressing A on the gamepad, how is that done?
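
One way that could look, assuming an Xbox-style pad where A maps to JoystickButton0 (a sketch combining the Execute call above with a camera raycast; names are placeholders):

using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: on the gamepad's A button, raycast from the camera and execute
// pointerClickHandler on whatever is hit. UI needs a GraphicRaycaster on
// its canvas; 3D objects need a collider plus a PhysicsRaycaster on the camera.
public class GamepadClick : MonoBehaviour
{
    public Camera eyeCamera;

    void Update()
    {
        if (!Input.GetKeyDown(KeyCode.JoystickButton0)) return;

        Ray gaze = new Ray(eyeCamera.transform.position, eyeCamera.transform.forward);
        if (Physics.Raycast(gaze, out RaycastHit hit))
        {
            ExecuteEvents.ExecuteHierarchy(hit.collider.gameObject,
                new PointerEventData(EventSystem.current),
                ExecuteEvents.pointerClickHandler);
        }
    }
}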
