XR: how do I show Oculus Quest controller models at the player's hand locations?

I am using the XR Plugins for Quest. I can test in play mode and such, and am just starting out, so bear with me if this is a silly question.

I have been trying to figure out how I can show a touch controller like the other Quest games. I understand it involves picking a prefab object on the XR Rig > LeftHand Controller (and RightHand Controller) objects. I get that I can put a sphere or other object and let the players see their hands that way. However, I am trying to figure out if there is a way to show a model/live feedback of the real Quest Touch controllers.

I have tried downloading the Oculus package from the Asset Store to comb it for models. Most of the controllers there are for the Rift or other models. (I only own a Quest, so not sure what those really look like, but they don’t match the ones I have in real meat space.)

The closest I can find are in an obscure folder, Oculus > SampleFramework > Core > TouchControllers > NewTouchControllers, which contains two models of what look like my Touch controllers. However, these are just unshaded 3D models; while they line up when dropped onto the LeftHand Controller, they don't indicate any button presses and don't even look right, since they lack any texture/material setup.

One directory above, there is a “LeftControllerPf” (and right one), but these look like Rift/other controllers when dropped into the scene.

The latest YouTube tutorials all point me to hand tracking and placing modeled hands, as does the Asset Store. This is great and all, but for world design and logical control reasons I want the player to be holding Quest Touch controllers.

How do I get started helping Quest players see their controllers? The Quest games I've played (mostly tutorial or mild exploration games) all seem to use the same 3D model, with some animation for lighting up when a button is held/touched. For example, if I place my finger over a button, most games will light up that button to let me know I'm touching that part of the controller in meat space.
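For the touch-highlight effect specifically, a minimal sketch using Unity's XR device API might look like this. The `buttonRenderer` reference and the highlight color are my own placeholders, not part of any official prefab; you'd point the Renderer at the button sub-mesh of whatever model you use:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical sketch: light up a button mesh on the controller model when
// the player's thumb rests on the physical button (capacitive touch).
public class ButtonTouchHighlight : MonoBehaviour
{
    public XRNode hand = XRNode.RightHand;
    public Renderer buttonRenderer;        // placeholder: the A/X button mesh on your model
    public Color highlight = Color.cyan;

    Color baseColor;

    void Start()
    {
        baseColor = buttonRenderer.material.GetColor("_EmissionColor");
        buttonRenderer.material.EnableKeyword("_EMISSION");
    }

    void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(hand);
        if (device.TryGetFeatureValue(CommonUsages.primaryTouch, out bool touched))
            buttonRenderer.material.SetColor("_EmissionColor",
                touched ? highlight : baseColor);
    }
}
```

`CommonUsages.primaryTouch` reports the capacitive touch state of the primary button on devices that support it, which the Quest Touch controllers do.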

I’d like to know how to recreate that level of controller feedback/representation for the users of my own game. Where can I get started? Surely everyone isn’t making these from scratch every time for every platform?

I'm willing to buy an asset pack if it solves this issue and lets me keep using the XR systems. Most packs, however, are not clear about whether they are built for the Oculus framework or the XR input framework. If someone has a recommendation, I'm open to spending some cash to get an intuitive controller working.


After some further poking around, I have found that Oculus > VR > Prefabs contains an OVRControllerPrefab. However, it doesn't seem to be working: no matter what I set the controller to, the in-game representation simply turns off all the controller models. Adding an empty game object with an OVRManager script makes the controller show up, but introduces many other problems; I suspect most of the OVRManager functions clash with the Unity XR calls. Doing so also disconnected my headset and crashed it out of Link mode, so I'm honestly not sure which problems were caused by what. I'm not familiar with everything OVRManager is supposed to do, besides being how the OVRControllerPrefab reads the controller model.

Still feels like I’m stumbling in the dark here. Would appreciate any advice on the matter from anyone who has this stuff working. For now I think for my sanity I’ll just use some sphere placeholders and move on.

Yeah, the XR plugin and the Oculus Integration don't really work together at all, especially the Avatar SDK (which most people use to get controller models).

There are lots of open-ended threads around here and on the Oculus developer forums regarding this; either they are currently busy or just won't give us the time of day on it.

I could try animating and creating the controllers myself based on the included models, but I don't think I would be allowed to sell such a kit on the Asset Store, since I would be using the models from the Oculus kit.

The problem is that these controller representations are an important part of the Quest store publishing checklist. So if Unity really expects its XR implementation to matter in the real publishing world, stuff like this needs to be easier to solve, especially with the Quest 2 on the way. My game's publishing scope is currently limited to sideloading via the SideQuest store and itch.io due to unintuitive controls; hardly a professional output worth contacting Oculus/Facebook about. :(

Anyone know if this kit has a solution for tutorial controllers? VR Interaction Framework | Systems | Unity Asset Store

Its description is a bit vague about what is actually contained, or whether it's based on XR input code.

Has anyone found an elegant solution for this? I’m sure we’re just missing something…

I have been chasing the missing controllers for a few days now. I have a solution, if people are still looking.

The Oculus > VR > Scenes folder has a sample ControllerModels scene to check things quickly. This scene works.

It appears that you must select one specific controller model on the prefab. It defaults to "mixed", which doesn't work and doesn't indicate what is selected. "All" doesn't appear to work at the moment either.
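If you'd rather force the selection from code, something like the following might work. This is an assumption-heavy sketch: it assumes the v20-era Oculus Integration's OVRControllerHelper script and its public `m_controller` field, so verify both names against your installed version:

```csharp
using UnityEngine;

// Hypothetical sketch: lock the prefab to one specific controller model
// instead of the default "mixed" setting. OVRControllerHelper and
// m_controller are assumed from the Oculus Integration; check your version.
public class ForceTouchModel : MonoBehaviour
{
    void Awake()
    {
        var helper = GetComponent<OVRControllerHelper>();
        helper.m_controller = OVRInput.Controller.LTouch; // or RTouch for the right hand
    }
}
```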

Oculus Integration is v20.1.


Has anyone made progress with this? I’ve just started prototyping using the XR Interaction Toolkit and don’t want to get too far into it if it’s going to be a problem showing the Quest models.

Exactly the same request as above:

I’d like to stick with XR Interaction Toolkit which sounds very promising.

For now, I found these options:

  • The Unity XR Toolkit basic rig, which is nice (sweet teleportation and interaction rays), except I have no controllers shown

  • The XR-compatible asset Auto Hand - VR (currently sold at 13 bucks on the Asset Store!) for Alyx-like hands with crazy physical poses, but NO moving fingers like the Oculus or SteamVR SDKs can handle natively

  • The VR-XR hands GitHub by VR With Andrew (with three nice YouTube tutorials; thanks Andrew, btw)

But the Holy Grail for a non-dev like me would be:

  • The Unity XR Toolkit basic rig with moving hands and a function to enable controllers / hands / both for HTC controllers, Knuckles, Touch v1, Touch v2, and Windows MR controllers. Or at least moving hands using the grip / trigger / joystick inputs.
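For what it's worth, the "enable controllers / hands / both" part of that wish can be faked with a few lines once you have both representations parented under the hand controller. This is just a sketch with placeholder GameObject references:

```csharp
using UnityEngine;

// Hypothetical toggle between controller and hand representations.
// controllerModel and handModel are placeholder references that you would
// assign in the Inspector to the two child objects of the hand anchor.
public class HandControllerToggle : MonoBehaviour
{
    public enum Mode { Controllers, Hands, Both }

    public GameObject controllerModel;
    public GameObject handModel;
    public Mode mode = Mode.Both;

    void Update()
    {
        controllerModel.SetActive(mode != Mode.Hands);
        handModel.SetActive(mode != Mode.Controllers);
    }
}
```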

EDIT: Found a good tutorial from Valem, with controller and hand presence.
The whole code is on his Patreon for 5 bucks; it took 30 minutes to solve this thanks to him.
The only two remaining issues are:

  • controllers won't animate (like they do with the SteamVR or Oculus SDKs)
  • hands are animated with grip and trigger, but some poses are missing, like pointing at something (which the SteamVR and Oculus SDKs handle)
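The grip/trigger-driven hand animation mentioned above is usually just two Animator blend parameters fed from the XR input device. A minimal sketch, assuming a hand Animator with float parameters named "Grip" and "Trigger" (those names are illustrative, not standard):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: drive a hand Animator's blend parameters from the analog
// grip and trigger values of the tracked controller.
public class HandPose : MonoBehaviour
{
    public XRNode hand = XRNode.LeftHand;
    public Animator animator; // hand rig with "Grip" and "Trigger" float parameters

    void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(hand);
        if (device.TryGetFeatureValue(CommonUsages.grip, out float grip))
            animator.SetFloat("Grip", grip);
        if (device.TryGetFeatureValue(CommonUsages.trigger, out float trigger))
            animator.SetFloat("Trigger", trigger);
    }
}
```

A pointing pose could be layered on the same way by checking a touch usage (e.g. whether the index finger is resting on the trigger) and blending to an extended-finger animation.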

Thanks for the feedback, everyone. If you don't mind, can I ask that you submit these requests on our public roadmap so the team can properly track and prioritize them in our planning? Thank you!


I, like many, am just starting out on this journey with an Oculus Quest, so this is how I did it.

First I watched Valem's videos, which are great and, if you are so inclined, include controller prefabs.

He also mentions the controllers you can download from here.

So I downloaded the package and dragged it into my Unity project.

[screenshot: the imported controller package in the Project window]

I don't know what div0, div1, and div2 refer to.

Next I dragged the div1 prefab into the Hierarchy, right-clicked it, and chose Prefab > Unpack.
Then I dragged the right_quest2_mesh to my Assets > Prefabs folder, making it a prefab.
Then I dragged the left_quest2_mesh to my Assets > Prefabs folder, making it a prefab.

[screenshot: the two controller mesh prefabs in the Project window]

Then I deleted the quest_2_controllers_div1 object (with all child objects) from the hierarchy.

Now I have one left prefab and one right prefab.

Then in the Hierarchy I opened XR Rig > Camera Offset > LeftHand Controller and dragged the prefab into the Model Prefab field of the XR Controller (Action-based) component.

Did it work?

No

It took me a while, but then I discovered I needed to add an Input Action Manager component to the XR Rig.
[screenshot: the Input Action Manager component on the XR Rig]

Now it works!

[screenshot: the controller models showing up in Play mode]
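For anyone curious why the Input Action Manager step matters: its job is essentially to enable the input action assets, and without it the action-based XR Controller's actions never fire, so the model gets no tracking data. A rough sketch of the equivalent behaviour, assuming the new Input System package:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch of what the Input Action Manager component roughly does:
// enable every assigned input action asset when the object comes alive,
// and disable them again when it goes away.
public class EnableActions : MonoBehaviour
{
    public List<InputActionAsset> actionAssets; // assigned in the Inspector

    void OnEnable()
    {
        foreach (var asset in actionAssets)
            asset.Enable();
    }

    void OnDisable()
    {
        foreach (var asset in actionAssets)
            asset.Disable();
    }
}
```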


Absolutely amazed this isn't included properly in the integration, and that no one has made an asset for it. One scene does include controllers, and they animate too, but I cannot seem to find the controllers AND the hands AND the laser all together.

So what needs to be done to animate them?
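One hedged way to animate an imported controller model without a full Animator setup is to move the sub-meshes directly from input values, e.g. rotating the trigger mesh by the analog trigger amount. The `triggerTransform` reference and the angle below are my own guesses and would need tuning against the actual mesh:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: rotate the trigger sub-mesh of a controller model to match
// the real trigger's analog value.
public class ControllerTriggerAnim : MonoBehaviour
{
    public XRNode hand = XRNode.RightHand;
    public Transform triggerTransform;   // placeholder: the trigger child of the model
    public float maxAngle = 20f;         // degrees at full pull; tune per model

    Quaternion restRotation;

    void Start()
    {
        restRotation = triggerTransform.localRotation;
    }

    void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(hand);
        if (device.TryGetFeatureValue(CommonUsages.trigger, out float value))
            triggerTransform.localRotation =
                restRotation * Quaternion.AngleAxis(maxAngle * value, Vector3.right);
    }
}
```

The same pattern extends to the thumbstick (two axes from `CommonUsages.primary2DAxis`) and button depressions (small local translations).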