AR Foundation simulation

Hello everyone,

I’ve been working with AR Foundation for some time now, and as you might agree, building and running to a device every time was a real pain until I met XR Simulation, which is a really useful tool. Huge thanks for that, and for every effort that goes into it.
But here’s my problem: while I was working on some Input System code, I lost the ability to hover through the XR environment in Game mode, and I can’t get it working or find any clue anywhere on the internet. Did I unintentionally deactivate something, or is it some silly misunderstanding on my part? Because it seems to be touch and go…

So, to rephrase my question more precisely: how does this simulation tool receive my button inputs?

OK, so as the saying goes, sometimes you just have to say your question out loud…

I’m using UI Builder for my UI design, and it seems I mistakenly added a “Touch Simulation” component to its GameObject. I don’t know what it does, nor why it interferes with the simulation’s action button. So even though my last question still stands, tell me if I should delete this post.

Thanks Unity, hope you thrive.


We recommend using InputSystem.Pointer for touch input detection. Pointer resolves to a touch input on mobile and mouse input on desktop, so it’s ideal for input that works on both mobile and XR Simulation. Here’s our code for how we set this up in our sample app: arfoundation-samples/Assets/Scripts/Runtime/PressInputBase.cs at main · Unity-Technologies/arfoundation-samples · GitHub
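For anyone who doesn’t want to dig through the sample right away, here is a minimal sketch of the idea, assuming the Input System package is installed and its “active input handling” is enabled. The class name `SimplePressInput` and the `OnPress` method are illustrative only; this is not the actual `PressInputBase` implementation from the samples repo.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Detects a press via InputSystem's Pointer device, which resolves to
// touch on mobile and mouse on desktop (including XR Simulation in
// the Editor), so one code path covers both.
public class SimplePressInput : MonoBehaviour
{
    void Update()
    {
        // The currently active pointer device, if any.
        var pointer = Pointer.current;
        if (pointer == null)
            return;

        // press is the primary button: a touch contact or left mouse button.
        if (pointer.press.wasPressedThisFrame)
        {
            // Screen-space position of the press; feed this into a
            // raycast or AR raycast to interact with the scene.
            Vector2 position = pointer.position.ReadValue();
            OnPress(position);
        }
    }

    void OnPress(Vector2 screenPosition)
    {
        Debug.Log($"Press at {screenPosition}");
    }
}
```

The sample’s `PressInputBase` wires this up through input actions rather than polling in `Update`, which is the more scalable approach; the polling version above is just the shortest way to see `Pointer` working in both the Editor and on device.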

For more advanced interactions we recommend XR Interaction Toolkit (XRI) 2.5 or newer: AR Interaction Overview | XR Interaction Toolkit | 2.5.4

The XRI team recently released some new input components in 2.5 that also work well in XR Simulation.

Hey bro,
Thanks for your accurate answer. I’ll surely check the links you mentioned, and thanks for your excellent samples.
cheers
