OpenVR sample Input code?

Can somebody write some sample OpenVR input code? Something like:
if ("vive_controller_left_trigger")
if ("vive_touchpad.axes.y")
if (Input.What???) {}

The lack of sample code in the Unity documentation makes it hard to understand the whole concept of OpenVR.

Yes please,

Just a few lines of example code showing how to get the buttons, position, and rotation of the controllers and trackers would be HIGHLY appreciated.


Get VTRK, or otherwise this is how you set it up yourself. You basically go into the Input Manager in your Unity project:

https://i.ibb.co/bNxTL79/tutorial-1.png

For me I didn't need to distinguish between the right and left triggers, but basically the three buttons you can use are the index trigger, the hand trigger, and the menu button.

The index trigger and hand trigger are floats from 0 to 1 rather than buttons, so if you want something like an if (HandTriggerPressed()) check, you have to set that up yourself.

//I named it VR_LeftController_Horizontal, but you'll be naming it
//whatever you like in your Input Manager...
//(comparing against a threshold is safer than an exact float match)
if (Input.GetAxis("VR_LeftController_Horizontal") >= 0.9f) {
    LeftHandTriggerPressed = true;
} else {
    LeftHandTriggerPressed = false;
}

Ultimately, it will end up looking something like that in your Update() method. Then in whatever script you want, you'd just access your class and use it:

if (LeftHandTriggerPressed)
    MoveBox();

Or whatever.
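
To tie that together, here's a minimal sketch of what such a wrapper class could look like. The class name (VRInputState), the right-hand axis name, and the 0.9 press threshold are placeholders of mine; use whatever you configured in your Input Manager:

using UnityEngine;

//Hypothetical wrapper that polls the Input Manager axes once per frame
//and exposes simple booleans the rest of the project can read.
public class VRInputState : MonoBehaviour {
    public static bool LeftHandTriggerPressed { get; private set; }
    public static bool RightHandTriggerPressed { get; private set; }

    void Update() {
        //Triggers report a float from 0 to 1; treat "nearly fully pulled" as pressed
        LeftHandTriggerPressed = Input.GetAxis("VR_LeftController_Horizontal") >= 0.9f;
        RightHandTriggerPressed = Input.GetAxis("VR_RightController_Horizontal") >= 0.9f;
    }
}

Any other script can then just check VRInputState.LeftHandTriggerPressed without touching the Input Manager itself.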

These are good settings for your horizontal or vertical axis:

https://i.ibb.co/MMBcF9y/Horizontal.png

For the menu button you have to put this in 'Positive Button':

joystick button 0

for the left-hand controller, I believe, and then:

joystick button 2

for the right-hand controller.
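
Once that button is defined, reading it in code is a normal Input Manager query; the button name VR_LeftController_Menu below is just a placeholder for whatever you called yours:

//Assuming an Input Manager button named "VR_LeftController_Menu"
//with Positive Button set to "joystick button 0" as described above.
if (Input.GetButtonDown("VR_LeftController_Menu")) {
    //Menu button went down this frame
}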

Remember that on Windows Mixed Reality and the Oculus the joystick is used, while on the Vive the trackpad is used in its place.
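
If you also want the stick/trackpad values, you can map their X and Y axes in the Input Manager the same way (Type: Joystick Axis) and read them as floats; the axis names here are placeholders:

//Reads the thumbstick on Rift/WMR and the trackpad on the Vive,
//via hypothetical Input Manager axes mapped to the controller's X and Y axes.
float padX = Input.GetAxis("VR_LeftController_PadX");
float padY = Input.GetAxis("VR_LeftController_PadY");
if (padY > 0.5f) {
    //Pushed forward on the stick, or touching the top of the trackpad
}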

I'll be honest with you, OpenVR is pretty great. While the input is just a tad more limited, the fact that it just works across the Vive, Rift, and WMR is a godsend for us. Also, with my Lenovo Explorer my app loads almost instantly into the headset for playtesting as a native Windows app, instead of needing to switch over to the Universal Windows Platform.

In other words, everything is really instant, and it keeps the project much lighter weight than if you end up importing bulky packages and APIs.

The low-level code to get the raw data from OpenVR is something like this:

//These two fields are shared with the GetPress helpers below,
//so press/release transitions can be detected frame to frame.
static VRControllerState_t state;
static VRControllerState_t prevState;

void OnNewPoses(TrackedDevicePose_t[] poses) {
    //Loop through each current pose
    for (uint i = 0; i < poses.Length; i++) {
        //Query SteamVR for controller position & rotation (aka pose)
        var pose = poses[i];
        var worldPose = new SteamVR_Utils.RigidTransform(pose.mDeviceToAbsoluteTracking);
        //Valid tracked device at this index?
        if (pose.bDeviceIsConnected && pose.bPoseIsValid) {
            var deviceClass = OpenVR.System.GetTrackedDeviceClass(i);
            if (deviceClass == ETrackedDeviceClass.Controller) {
                //Get rotation and position data
                var pos = worldPose.pos;
                var rot = worldPose.rot.eulerAngles;
                //Get input state, keeping the previous state for press/release detection
                prevState = state;
                OpenVR.System.GetControllerState(i, ref state, (uint)System.Runtime.InteropServices.Marshal.SizeOf(typeof(VRControllerState_t)));
                print(GetPress(ButtonMask.Trigger));
            }
        }
    }
}


//Helpers that compare the current and previous controller state captured above
static bool GetPress(ulong buttonMask) { return (state.ulButtonPressed & buttonMask) != 0; }
static bool GetPressDown(ulong buttonMask) { return (state.ulButtonPressed & buttonMask) != 0 && (prevState.ulButtonPressed & buttonMask) == 0; }
static bool GetPressUp(ulong buttonMask) { return (state.ulButtonPressed & buttonMask) == 0 && (prevState.ulButtonPressed & buttonMask) != 0; }

public static class ButtonMask
{
    public const ulong System = (1ul << (int)EVRButtonId.k_EButton_System); // reserved
    public const ulong ApplicationMenu = (1ul << (int)EVRButtonId.k_EButton_ApplicationMenu);
    public const ulong Grip = (1ul << (int)EVRButtonId.k_EButton_Grip);
    public const ulong Axis0 = (1ul << (int)EVRButtonId.k_EButton_Axis0);
    public const ulong Axis1 = (1ul << (int)EVRButtonId.k_EButton_Axis1);
    public const ulong Axis2 = (1ul << (int)EVRButtonId.k_EButton_Axis2);
    public const ulong Axis3 = (1ul << (int)EVRButtonId.k_EButton_Axis3);
    public const ulong Axis4 = (1ul << (int)EVRButtonId.k_EButton_Axis4);
    public const ulong Touchpad = (1ul << (int)EVRButtonId.k_EButton_SteamVR_Touchpad);
    public const ulong Trigger = (1ul << (int)EVRButtonId.k_EButton_SteamVR_Trigger);
}

OnNewPoses is called automatically via an event dispatched from the OpenVR SDK; just stick it in any MonoBehaviour.
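
For completeness, here is roughly how that method can be hooked up. This is a sketch assuming the SteamVR Unity plugin's SteamVR_Events helper (plugin 1.2+); older plugin versions used SteamVR_Utils.Event.Listen("new_poses", ...) instead, and the PoseListener class name is just a placeholder:

using UnityEngine;
using Valve.VR;

public class PoseListener : MonoBehaviour {
    void OnEnable() {
        //Subscribe to the new-poses event the SteamVR plugin fires every frame
        SteamVR_Events.NewPoses.Listen(OnNewPoses);
    }

    void OnDisable() {
        SteamVR_Events.NewPoses.Remove(OnNewPoses);
    }

    void OnNewPoses(TrackedDevicePose_t[] poses) {
        //...the pose/controller-state code from above goes here...
    }
}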

There are other ways to get the info, such as using Valve's helper components or checking the Input axes, but at a low level this is what's happening under the hood.
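
For example, with the 1.x SteamVR plugin's helper components you can skip the event handling entirely. This sketch assumes a SteamVR_TrackedObject component on the controller object (the 2.x plugin replaced these helpers with the SteamVR_Input action system):

//On a GameObject that also has a SteamVR_TrackedObject component
var trackedObject = GetComponent<SteamVR_TrackedObject>();
var device = SteamVR_Controller.Input((int)trackedObject.index);
if (device.GetPressDown(SteamVR_Controller.ButtonMask.Trigger)) {
    //Trigger was pressed this frame
}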
