VR UI Slider Issues

Hello! I'm using the Unity XR Plugin for Oculus Quest.
Everything works great, except for slider UI.
There's nothing in the way of the slider (no UI on top of it, no objects blocking it), and it does highlight when I hover over it. But when I click to drag the slider, nothing happens. Occasionally it does move, but it snaps to the start or end of the slider; I can never get it to stop in the middle. Other UI works fine: dropdowns, buttons, and so on.
Just sliders.

Any ideas?

Yup, same problem here. It appears to be a fairly recent bug, since I’ve seen videos of people using sliders with the XR Interaction Toolkit before.

You can click on the bar part, but if you click on the thumb part you can’t drag it.

I’ll file a bug report on it.

Thanks! For me, even the bar part is highly unreliable: 90% of the time it does nothing, and when it does do something, it snaps to either the start or the end.

Hi @BernieRoehl , can you share the case # so I can make sure it’s in our queue? Thanks!

Can someone help me get UI sliders working on the HTC Vive? I am using SteamVR 2.0.
With the laser pointer, nothing happens. It would be a great help if someone could offer some input.

So, Vive support is actually directly handled by HTC. Please share this on their developer forums and their teams will help.

Chiming in with the same issue. Using the latest XR Interaction Toolkit 1.0.0-pre.4 and Oculus XR Plugin 1.9.1 / OpenXR Plugin 1.1.1. Neither sliders nor scroll rects work properly.

Same slider issue here too. Any updates on a fix?

I never managed to fix it; I just worked around it in my game by not using sliders.

I can confirm the problem. The buttons sometimes refuse to work as well.
I mean the buttons that are located inside a ScrollView.

If you are using the XRUIInputModule, you may need to crank up the Tracked Device Drag Threshold Multiplier.
By default, the UI system considers an input a drag rather than a tap once it moves 10 pixels in UI space for 2D input (mouse and touch), and 20 pixels for tracked controllers. Since UI can be far away and controller input can be jittery, that feels like a much tighter threshold for tracked devices than for 2D input. Once the input counts as a drag rather than a tap, I suspect the scroll view takes ownership of the event and consumes it.

The Event System has a Drag Threshold value that sets the initial 2D value; for tracked devices, that value is then multiplied by the Tracked Device Drag Threshold Multiplier. Cranking that number up is a compromise, though: you'll get more reliable clicks, but drags may feel less responsive, so you may need to find a sweet spot that fits your game design or rig setup.

Could also be something else, but I feel like this is at least a thread to pull on.
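As a sketch of how those two values can be tweaked from code rather than the Inspector, assuming the XRI 1.0-era API (the `trackedDeviceDragThresholdMultiplier` property name is my guess based on the Inspector label, so verify it against your toolkit version):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.XR.Interaction.Toolkit.UI;

public class DragThresholdTuner : MonoBehaviour
{
    void Start()
    {
        // Base 2D drag threshold, in UI pixels (the Event System's "Drag Threshold").
        EventSystem.current.pixelDragThreshold = 10;

        // Multiplier applied on top of the base threshold for tracked devices.
        // Raising it makes taps more reliable but drags less responsive.
        var module = EventSystem.current.currentInputModule as XRUIInputModule;
        if (module != null)
            module.trackedDeviceDragThresholdMultiplier = 4f;
    }
}
```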

Thanks! That did the trick.

I can confirm that the UI slider doesn't work at all when I press Play inside the Unity editor.
However, when I build for PC, it works just fine. So it must be a bug within the Unity editor.

(Also, changing the Tracked Device Drag Threshold Multiplier as proposed earlier does nothing to solve the UI slider issue.)
When can we expect it to be fixed?

@Leoss
You are right, I was only responding to the person above, and not the main reported issue in this thread.

As for sliders, I started looking at it today, and I know why it behaves that way, but I don’t know if I have a fast fix for you at the moment.

TL;DR: It's always been a bug, but there are some situations where it works correctly (when there is UI backing the slider, such that the pointer doesn't fall off to infinity between UI elements), so you may have seen it working well in other demos. Please file a bug and ideally attach this thread/info in case it's not me who ends up looking at it.

The reason sliders don't behave as well in VR has to do with where pointers start from compared to screen-based input. Mouse and touch input always originates at the screen, so when you raycast into the world, the position on the screen is the same whether the hit is 1 m, 5 m, or 100 m from the origin. The slider knows where in screen space the cursor is, so it can figure out how far along the slider it should be. For example, in this first image, it's easy to figure out where the slider handle should be even though the cursor isn't on the actual slider:
[attached screenshot: upload_2021-7-5_11-24-15.png]

For VR, it’s different. The origin of the pointer is totally separated from the view. This means that the screen-space location of the cursor is dependent on how far along the ray we are checking. For example in this image, where the red line is the pointer ray:

The screen space position of the pointer could be anywhere along that red line, depending on how deep in the world you want to check. That’s just not the case with mouse and touch.

Normally, in VR, I solved the need for a screen-space position in UI events by finding the hit point of the XR pointer and converting it to screen space. This is backwards from mouse and touch, where we have the screen-space position and use it to find the world-space hit point. This means the slider works well as long as some other UI object is being hit on the same plane as the slider itself, keeping a consistent depth and therefore a consistent screen-space position. But when the ray hits gaps that pass through to infinity, or other UI at a very different distance, it starts to misbehave.
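The two directions of that mapping can be sketched like this (a minimal illustration, not XRI's actual code; `uiCamera` is assumed to be the canvas's event camera, and the physics raycast stands in for whatever UI raycaster is in use):

```csharp
using UnityEngine;

public static class PointerSpaceSketch
{
    // Mouse/touch: the screen position is known up front;
    // raycast into the world to find the hit point.
    public static bool WorldFromScreen(Camera uiCamera, Vector2 screenPos,
                                       out Vector3 worldHit)
    {
        var ray = uiCamera.ScreenPointToRay(screenPos);
        if (Physics.Raycast(ray, out var hit))
        {
            worldHit = hit.point;
            return true;
        }
        worldHit = Vector3.zero;
        return false;
    }

    // XR pointer: the world-space hit point is known;
    // project it back into screen space. If the ray misses all UI,
    // there is no stable hit point, which is exactly where the
    // slider starts to misbehave.
    public static Vector2 ScreenFromWorld(Camera uiCamera, Vector3 worldHit)
    {
        return uiCamera.WorldToScreenPoint(worldHit);
    }
}
```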

I don't have an immediate fix for this, sadly. The XR UI system was set up to integrate as seamlessly as possible with the existing UI so we didn't need an XR Slider, XR Dropdown, etc., but in this situation I may need to either modify the slider, or find a good way to adjust how screen space is determined for pointers, though I think the second option would break UI elsewhere.

Thank you for this insightful reply @StayTalm_Unity . We have a similar situation and your explanation has helped clarify some aspects of it.

In our VR app, our main UI exists all together on a single panel. This panel is buried deep underground, out of the way, and a render texture displays this UI on several “TV screens” scattered about our play space. We detect interactions with the TV screen, and put a second XR Ray Interactor on our (underground) UI at the exact same place. Sounds hacky, but it works great! Only issues so far are:

  1. Multiple XR Ray Interactors don’t reliably send button actions on clicks
  2. Scrollviews and Slider bars have unreliable characteristics

Item 2 is exactly the scenario we're discussing here. I've noticed that if I select and hold a slider or scroll view and move my head left or right, the slider/scroll view moves as well! This is highly exacerbated by having a UI several hundred meters underground, which I now understand much more clearly, thank you very much.

Even though it's undesirable, in order to ship reliable VR UI to our customers, do you have a quick recommendation for building an interim XR Slider and XR Scrollview? I love where the XR kit is going and think it's absolutely the right idea, but explaining the complications of these problems doesn't satisfy customers who just want functioning VR UI :smile:

100% with you. Theory is nice, but even if we have bugs, you’ve still got to ship.

So, I attached an XR Slider. It's mostly copy-paste, with some internal calls hacked out and a single function modified.

That function is UpdateDrag:

        void UpdateDrag(PointerEventData eventData, Camera cam)
        {
            RectTransform clickRect = m_HandleContainerRect ?? m_FillContainerRect;
            if (clickRect != null && clickRect.rect.size[(int)axis] > 0)
            {
                Vector2 position = Vector2.zero;
                if (eventData is TrackedDeviceEventData trackedDeviceEventData)
                {
                    // Build a plane from the slider's world-space corners, so we can
                    // intersect the XR pointer ray against the slider's own surface.
                    clickRect.GetWorldCorners(s_Corners);
                    var plane = new Plane(s_Corners[0], s_Corners[1], s_Corners[2]);

                    // The pointer ray may consist of several segments (e.g. curved rays),
                    // so test each segment against the plane in turn.
                    var rayPoints = trackedDeviceEventData.rayPoints;
                    for (var i = 1; i < rayPoints.Count; i++)
                    {
                        var from = rayPoints[i - 1];
                        var to = rayPoints[i];

                        var rayDistance = Vector3.Distance(to, from);
                        var ray = new Ray(from, to - from);

                        if (plane.Raycast(ray, out var distance))
                        {
                            // Only accept intersections within this segment's length.
                            if (distance < rayDistance)
                            {
                                // Project the world-space hit back into screen space.
                                var worldPoint = ray.origin + ray.direction * distance;
                                position = cam.WorldToScreenPoint(worldPoint);

                                Debug.DrawLine(from, worldPoint);
                                break;
                            }
                        }
                    }
                }
                else
                {
                    // Mouse and touch already carry a screen-space position.
                    position = eventData.position;
                }

                Vector2 localCursor;
                if (!RectTransformUtility.ScreenPointToLocalPointInRectangle(clickRect, position, cam, out localCursor))
                    return;
                localCursor -= clickRect.rect.position;

                float val = Mathf.Clamp01((localCursor - m_Offset)[(int)axis] / clickRect.rect.size[(int)axis]);
                normalizedValue = reverseValue ? 1f - val : val;
            }
        }

I cut out multi-monitor support (it used internal calls), and everything inside of if (eventData is TrackedDeviceEventData trackedDeviceEventData) is custom. What I do is create a Plane struct that represents the slider, then figure out where the pointer ray intersects that virtual plane. I use that to determine my screen position, and then fall through to all the same logic as the normal slider. The scrolling area should be very similar: search for eventData.position, and replace that value with the raycast-against-plane code above.
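For the scroll rect, a sketch of that same substitution might look like the following. This is a hypothetical XRScrollRect of my own, not part of XRI; it reuses the plane-raycast idea from the slider code above and assumes ScrollRect.OnDrag is overridable in your Unity version:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;
using UnityEngine.XR.Interaction.Toolkit.UI;

public class XRScrollRect : ScrollRect
{
    static readonly Vector3[] s_Corners = new Vector3[4];

    public override void OnDrag(PointerEventData eventData)
    {
        if (eventData is TrackedDeviceEventData trackedEvent)
        {
            // Replace eventData.position with the screen position of the
            // ray/plane intersection, as in the slider's UpdateDrag above.
            viewRect.GetWorldCorners(s_Corners);
            var plane = new Plane(s_Corners[0], s_Corners[1], s_Corners[2]);

            var rayPoints = trackedEvent.rayPoints;
            for (var i = 1; i < rayPoints.Count; i++)
            {
                var from = rayPoints[i - 1];
                var to = rayPoints[i];
                var ray = new Ray(from, to - from);

                if (plane.Raycast(ray, out var distance) &&
                    distance <= Vector3.Distance(from, to))
                {
                    var cam = eventData.pressEventCamera;
                    eventData.position = cam.WorldToScreenPoint(ray.GetPoint(distance));
                    break;
                }
            }
        }
        base.OnDrag(eventData);
    }
}
```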

However, this only fixes half the issue. The positioning is now correct, but it doesn't behave like a mouse, which still calculates the position of the scroller even when it isn't pointing at anything. It just sort of stops whenever I point into open space.
I traced this down to the UIInputModule. There is some code in internal void ProcessTrackedDevice(ref TrackedDeviceModel deviceState, bool force = false) that is broken on drag. Take a look at this blob:

                Vector2 screenPosition;
                if (eventData.pointerCurrentRaycast.isValid)
                {
                    screenPosition = camera.WorldToScreenPoint(eventData.pointerCurrentRaycast.worldPosition);
                }
                else
                {
                    var endPosition = eventData.rayPoints.Count > 0 ? eventData.rayPoints[eventData.rayPoints.Count - 1] : Vector3.zero;
                    screenPosition = camera.WorldToScreenPoint(endPosition);
                    eventData.position = screenPosition; // <------- This line right here needs to be deleted
                }

                var thisFrameDelta = screenPosition - eventData.position;
                eventData.position = screenPosition;
                eventData.delta = thisFrameDelta;

I put a comment on the line that needs to be removed. It zeroes out the calculation of eventData.delta directly below it. Removing it brings back the ability to update the drag while pointing at nothing. This one is a little awkward for you to patch, as you'll also need to bring in a custom XRUIInputModule, and I believe a lot of XRI is internal. You may be able to clone the package and put it locally in your Packages folder if you just want to make custom modifications directly.

Please report both of these as bugs. I know these systems and want to help, but I'm working on other things, and it's doubtful I'll be the one tackling these issues directly.

Hope this helps!

Attached: XRSlider.cs (29.8 KB)

Big thanks @StayTalm_Unity . I spoke with some customers who were aware of this situation, and they are quite happy with the fix. I meant to circle back a while ago, but better late than never. Thanks so much!

After reading all of this thread, I noticed I had forgotten to set the correct Event Camera on the Canvas that the UI slider was attached to. I had it set to a disabled camera by accident.

I've also encountered the same problem while working with sliders, but I think I've found a quick fix. First, some context. If I understood @StayTalm_Unity correctly, I think the following is the issue (in my opinion; I'm not that experienced). If you are like me, you usually create a canvas in World Space, set the width and height on the canvas's RectTransform component, and then, when you add a slider to it, scale the slider down instead of working with the other RectTransform parameters.
So my fixes:

  • At the start, you could simply scale your canvas instead of setting its width and height. I noticed this is how Valem does it in Part 6 of his Introduction to VR in Unity series. But if you have progressed far beyond that point…
  • Create another World Space canvas and configure it as you please (or base it on the previous one). Most importantly, leave the width and height as they are, but change the scale. Then copy the slider GameObject from the old canvas to the new one while keeping its position in world space. You can disable or delete the old slider if you wish. The two canvases do not have to be in the same location, though.

I've implemented option 2 and it seems to work for me so far. I hope someone else finds this useful until a better fix is provided.
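As a rough sketch of option 2's canvas setup in code (the sizes and scale below are placeholder values, not anything prescribed by Unity):

```csharp
using UnityEngine;

public class WorldCanvasSetup : MonoBehaviour
{
    void Start()
    {
        // A World Space canvas that keeps a "normal" pixel-sized RectTransform
        // and is shrunk to scene scale via localScale instead of width/height.
        var canvasGO = new GameObject("WorldCanvas", typeof(Canvas));
        var canvas = canvasGO.GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        var rt = canvas.GetComponent<RectTransform>();
        rt.sizeDelta = new Vector2(1000f, 600f);  // leave width/height "as is"
        rt.localScale = Vector3.one * 0.001f;     // size the canvas via scale
    }
}
```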
