Mixed reality controllers won't give position

I have been using the InteractionManager to connect the controllers (which is tricky: it seems I have to press the Windows button, go to the Mixed Reality Portal, and then come back to Unity for them to connect; any clues on how to make them connect reliably are appreciated), and no matter what I try, the source pose never returns a position.

Can anyone get this to work, and if so, what is the trick?

Can you give us more information on your script please?

I can get the controller to be detected and register button presses (but only if I press the Windows button on the controller, go to the Mixed Reality Portal, and then come back to Unity, which doesn't seem right).

I can’t get any position information.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.WSA.Input;

public class ControllerInput : MonoBehaviour {

    // Use this for initialization
    void Start () {
      
        InteractionManager.InteractionSourceDetected += InteractionManager_SourceDetected;
        InteractionManager.InteractionSourceUpdated += InteractionManager_SourceUpdated;
        InteractionManager.InteractionSourceLost += InteractionManager_SourceLost;
        InteractionManager.InteractionSourcePressed += InteractionManager_SourcePressed;
        InteractionManager.InteractionSourceReleased += InteractionManager_SourceReleased;
    }

    void OnDestroy()
    {
      
        InteractionManager.InteractionSourceDetected -= InteractionManager_SourceDetected;
        InteractionManager.InteractionSourceUpdated -= InteractionManager_SourceUpdated;
        InteractionManager.InteractionSourceLost -= InteractionManager_SourceLost;
        InteractionManager.InteractionSourcePressed -= InteractionManager_SourcePressed;
        InteractionManager.InteractionSourceReleased -= InteractionManager_SourceReleased;
    }
    // Update is called once per frame
    void Update ()
    {
        //Vector3 leftPosition = InputTracking.GetLocalPosition(XRNode.LeftHand);

        //Debug.Log("LP" + leftPosition);


        var interactionSourceStates = InteractionManager.GetCurrentReading();
        Debug.Log("Num Interaction Source States: " + interactionSourceStates.Length);
        // selectPressed lives on each state in the array, not on the array itself
        foreach (var state in interactionSourceStates)
        {
            if (state.selectPressed)
            {
                Debug.Log("Press " + Time.time);
            }
        }
        if (Input.GetButton("Fire1"))
        {
            Debug.Log("Fire1");
        }
      
    }

    void InteractionManager_SourcePressed(InteractionSourcePressedEventArgs args)
    {

        Debug.Log("Source pressed");
    }
    void InteractionManager_SourceDetected(InteractionSourceDetectedEventArgs args)
    {

        Debug.Log("Source detected");
    }
    void InteractionManager_SourceUpdated(InteractionSourceUpdatedEventArgs args)
    {
        Vector3 p;
        // TryGetPosition returns false when no position data is available
        if (args.state.sourcePose.TryGetPosition(out p))
        {
            Debug.Log("Source updated " + p);
        }
        // positionAccuracy is an InteractionSourcePositionAccuracy enum, not a float:
        // var a = args.state.sourcePose.positionAccuracy;
    }
    void InteractionManager_SourceLost(InteractionSourceLostEventArgs args)
    {

        Debug.Log("Source lost");
    }
    void InteractionManager_SourceReleased(InteractionSourceReleasedEventArgs args)
    {

        Debug.Log("Source released");
    }

}

Should I have the Mixed Reality Portal open? I wondered if that was the issue, but I am not sure how to start the headset without it.

I seem to be able to get all the button presses, etc. I am just struggling to get the position of the controller.

Also, args.state.sourcePose.positionAccuracy returns "None", so I assume that means it isn't getting any data. Is there something I need to do to enable it?

I tried

Vector3 leftPosition = InputTracking.GetLocalPosition(XRNode.LeftHand);
Debug.Log("LP" + leftPosition);

Vector3 rightPosition = InputTracking.GetLocalPosition(XRNode.RightHand);
Debug.Log("RP" + rightPosition);

and still no luck.

Interestingly, one time I started it before it had run the setup, and it actually seemed to read positions in; however, I have tried to replicate that and can't.

What build of Unity are you using? Are you using the MRTP builds?

Current release build 2017.2.0f3

I don’t have access to MRTP builds

edit: I figured out how to access those builds and it now works fine. I guess it is just 2017.2 that is broken.

edit2: New problem. The controllers track, but they are significantly offset. Is that normal?

Controllers should track correctly. What exactly do you mean by “significantly offset”?

If I use the reported position to place a 3D object (a sphere) by simple assignment, it appears off in the distance. However, when I move the controller right, the ball in the distance moves right. So it appears to be tracking, but it is nowhere near where my hand actually is (and it is reporting high accuracy).

That sounds like a bug, haven’t seen that myself. Would you mind filing a bug for us?

I will do that as soon as I get back from work.

Can I confirm that I should just be able to poll the position, apply it to an object, and have it appear where the controller is?

Also, can I confirm that the main camera should start at (0,0,0)?

There aren't a lot of examples (do you have a working example?), so I wanted to check that I am not doing anything wrong.

Yes, you should just be able to apply the position to an object and have it appear where the controller is, regardless of tracking space type.
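A minimal polling sketch of that approach (my own untested code against the 2017.2 UnityEngine.XR.WSA.Input API; the component name is made up):

```csharp
using UnityEngine;
using UnityEngine.XR.WSA.Input;

// Hypothetical example: each frame, read the current interaction sources
// and move this GameObject to wherever a controller reports a position.
public class FollowController : MonoBehaviour
{
    void Update()
    {
        foreach (var state in InteractionManager.GetCurrentReading())
        {
            Vector3 p;
            // TryGetPosition returns false when no position is available
            if (state.source.kind == InteractionSourceKind.Controller &&
                state.sourcePose.TryGetPosition(out p))
            {
                transform.position = p;
            }
        }
    }
}
```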

The main camera should start at (0,0,0) if your tracking space type is stationary (or if you're running on a HoloLens); we also fall back to stationary if the floor can't be found at start-up. Normally, though, the default tracking space type is room-scale, which places the floor at y=0, so your camera would have a y-value of however many meters up in the air it is.

But like I said, the controller positions should be reporting in the right space regardless of tracking space type, so having your controllers appear way out in the distance in either case is definitely wrong.
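If it helps, here is a rough, untested sketch of how the tracking space type can be checked and set in 2017.2 via XRDevice (the fallback logic mirrors what was described above; the component name is made up):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical example: request room-scale tracking, and fall back to
// stationary (recentred at the camera) if the floor can't be found.
public class TrackingSpaceCheck : MonoBehaviour
{
    void Start()
    {
        Debug.Log("Current space: " + XRDevice.GetTrackingSpaceType());

        // SetTrackingSpaceType returns false if the space can't be established
        if (!XRDevice.SetTrackingSpaceType(TrackingSpaceType.RoomScale))
        {
            XRDevice.SetTrackingSpaceType(TrackingSpaceType.Stationary);
            // In stationary space, recentre so the camera starts at (0,0,0)
            InputTracking.Recenter();
        }
    }
}
```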

I don’t know of any example of displaying controller objects in the scene, I’ll add that to my to-do list.

Okay thanks.

If I am trying to do room-scale, where should the camera be?

The exact position depends on how far the headset is from where it was when the user ran setup in Mixed Reality Portal and clicked the “Center” button. If the HMD is exactly where it was during the setup’s “Center” button-click, you should get a position of (0,y,0), where y is how many meters above the floor the HMD is. The x and z coordinates should be the number of meters away from that starting position in either direction.

The default orientation should have the HMD facing towards your monitor(s) when “Center” was clicked, so your main camera’s rotation should be based on that, regardless of where it’s looking at app startup. For example, if you place a cube 5 meters down the z-axis and you look way to the left of your monitor(s) (or wherever you clicked the “Center” button during setup), you would need to look to the right a bit (towards your monitor) to see the cube.

So basically I should have to do nothing special, and Unity will handle that?

Is there a way to get the boundaries?

Thank you for your time replying; it is great to get some info rather than guessing.

Correct, everything I mentioned above should be taken care of automatically by Unity.

You can in fact get boundaries - we still haven’t moved that API out of the Experimental namespace, and to my knowledge, it hasn’t been documented… we really should take care of both of those. But the API you want is:

using UnityEngine.Experimental.XR;
// ...
List<Vector3> listOfPointsYouWantPopulated = new List<Vector3>();
Boundary.TryGetGeometry(listOfPointsYouWantPopulated, Boundary.Type.TrackedArea);
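Wrapped into a component, that could look something like this (untested sketch; the class name is made up, and note that TryGetGeometry returns false when no boundary is available):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Experimental.XR;

// Hypothetical example: fetch the tracked-area outline once at startup
// and log each boundary point.
public class BoundaryLogger : MonoBehaviour
{
    void Start()
    {
        var points = new List<Vector3>();
        if (Boundary.TryGetGeometry(points, Boundary.Type.TrackedArea))
        {
            Debug.Log("Boundary has " + points.Count + " points");
            foreach (var p in points)
                Debug.Log(p);
        }
        else
        {
            Debug.Log("Boundary geometry not available");
        }
    }
}
```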

Let me know if you need anything else!

Well, I finally got it working. I thought I would include the code for anyone in the future.

It is still not entirely reliable, but at least it tracks. Also, it always shows the boundary mesh for my room-scale setup whether I want it to or not.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.WSA.Input;

public class ControllerInput : MonoBehaviour {

    public GameObject leftHand;
    public GameObject rightHand;


    // Use this for initialization
    void Start () {
       
        InteractionManager.InteractionSourceDetected += InteractionManager_SourceDetected;
        InteractionManager.InteractionSourceUpdated += InteractionManager_SourceUpdated;
        InteractionManager.InteractionSourceLost += InteractionManager_SourceLost;
        InteractionManager.InteractionSourcePressed += InteractionManager_SourcePressed;
        InteractionManager.InteractionSourceReleased += InteractionManager_SourceReleased;
    }

    void OnDestroy()
    {
       
        InteractionManager.InteractionSourceDetected -= InteractionManager_SourceDetected;
        InteractionManager.InteractionSourceUpdated -= InteractionManager_SourceUpdated;
        InteractionManager.InteractionSourceLost -= InteractionManager_SourceLost;
        InteractionManager.InteractionSourcePressed -= InteractionManager_SourcePressed;
        InteractionManager.InteractionSourceReleased -= InteractionManager_SourceReleased;
    }
    // Update is called once per frame
    void Update ()
    {

       
    }

    void InteractionManager_SourcePressed(InteractionSourcePressedEventArgs args)
    {

        Debug.Log("Source pressed");
    }
    void InteractionManager_SourceDetected(InteractionSourceDetectedEventArgs args)
    {

        Debug.Log("Source detected");
    }
    void InteractionManager_SourceUpdated(InteractionSourceUpdatedEventArgs args)
    {
        Vector3 p;
        // Only move the hand objects if a position was actually reported
        if (!args.state.sourcePose.TryGetPosition(out p))
            return;

        Debug.Log("Hand " + args.state.source.handedness);

        // Compare against the enum rather than its string form
        if (args.state.source.handedness == InteractionSourceHandedness.Left)
            leftHand.transform.position = p;
        else if (args.state.source.handedness == InteractionSourceHandedness.Right)
            rightHand.transform.position = p;

        Debug.Log("Source updated " + p);
    }
    void InteractionManager_SourceLost(InteractionSourceLostEventArgs args)
    {

        Debug.Log("Source lost"); 
    }
    void InteractionManager_SourceReleased(InteractionSourceReleasedEventArgs args)
    {

        Debug.Log("Source released");
    }

}

Actually, it is still buggy, but I think I have figured out the cause. If the Mixed Reality headset hasn't loaded its tracking by the time the app starts, the controllers are offset. If I restart a few times, eventually one run will track correctly. It is as if I need to reset the controllers on load, but I am not sure how to do that.

I did file it with the bug reporter (962382).

Yup, I've noticed something similar: getting the controller position through the InteractionManager using TryGetPosition on an InteractionSourcePose stops returning positions in the Editor, despite everything else working correctly. Tracking is correct in the home space, and in the Editor on the occasions when it does actually return positions.