In-game controls in the visionOS simulator

We’re currently converting an Oculus game to the visionOS headset. We are stuck on how to give the simulator input to control the character. The keyboard doesn’t seem to be picked up, even though we toggle the option in the simulator’s I/O menu.

We simply cannot move the in-game character with anything. We searched the forums but didn’t find any solution. Does anybody here know how to send keyboard input to the character in visionOS?


Have you looked at the samples in com.unity.xr.polyspatial? There is a CharacterWalker sample scene that shows how to use the pinch/gaze input to direct a character to walk around the scene. It’s pretty basic, but it should be a good starting point. This example is set up for Mixed Reality, but the concept should still work in Virtual Reality.

As for why keyboard input isn’t coming through in the simulator, it sounds like you’re trying to use the keyboard on your laptop? I’m not sure why that’s not coming in through Unity. It might be a bug, but it depends on how you have implemented keyboard controls in your app. If you would like to submit a bug report with a project that can reproduce the issue, we’ll take a look.

What is the expected behavior for users on the Vision Pro device? Do you want them to use the virtual keyboard or connect a physical Bluetooth keyboard for input? We fixed some issues with the virtual keyboard in Unity 2022.3.18f1 and the latest PolySpatial packages (version 1.0.3). Requiring a physical keyboard might be an unexpected constraint, and you could run into trouble getting the app approved for the App Store. For this reason, I would recommend using gaze and pinch for input.

Thank you for replying. This is a racing game, in which you drive a car via some controls. On Oculus we use the device’s controllers. We’re trying to adapt this app for Vision Pro; right now we’re testing with a Bluetooth keyboard, but we’ll try a game controller soon. The pinch teleportation would not work in our case. We’ll take a deeper look into our input system and come back to you as soon as possible if we don’t find anything wrong with it. In the editor everything works, but in the simulator it’s as if the keyboard doesn’t exist.

Yeah, it’s odd that the keyboard isn’t working. I can look into it, but it might also be worth submitting a bug to us so we can track it. It could be an Apple bug, but it’s certainly possible it’s on our side. I’ve had some luck using a Bluetooth keyboard with the device, but I haven’t tested this kind of setup in a Unity app.

Thanks for the context. Indeed a racing game would work very differently from our Character Walker example. :slight_smile: Gamepad controls seem perfect for this use case, and I think someone on our team has tested a Playstation controller in the simulator and on device. With that said, keyboard should still work, so it’s worth looking into on our side.

It might sound kind of silly, but you may still want to try using hand tracking for your game. Something like “hold your hands at 10 and 2,” pretending you’re holding a steering wheel, could be fun, and maybe just tilting the hand from side to side could control steering? Even if you’re using bounded mode, you can use the deviceRotation control on SpatialPointerDevice to get hand rotation, which may be expressive enough for a casual karting game.
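The tilt-to-steer idea above might look roughly like this. This is an untested sketch, not code from the thread: the exact namespace and control path for deviceRotation should be checked against the PolySpatial package docs, and ApplySteering is a hypothetical game-side method.

```csharp
// Sketch only: steer by tilting the hand, reading deviceRotation from
// the PolySpatial spatial pointer device (names assumed, verify against
// the com.unity.xr.polyspatial docs).
using UnityEngine;
using Unity.PolySpatial.InputDevices;

public class HandTiltSteering : MonoBehaviour
{
    void Update()
    {
        var device = SpatialPointerDevice.current;
        if (device == null)
            return;

        // Use the hand's roll (rotation about z) as a steering value.
        Quaternion rot = device.primarySpatialPointer.deviceRotation.ReadValue();
        float roll = Mathf.DeltaAngle(0f, rot.eulerAngles.z);
        float steer = Mathf.Clamp(roll / 45f, -1f, 1f); // full lock at ±45°

        // ApplySteering(steer); // hypothetical hook into the car controller
    }
}
```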


We already implemented the steering-wheel holding on Oculus. That should be the easy part to convert here, and we’ll do it later. The hardest part is still the controls.
Another question: we managed to get some controller input from a PS5 controller, but it is not working with the new Input System. It only works by getting a reference to the Gamepad class in a script and reading the values in Update. It’s worth mentioning that I chose the right button map, because in the editor the new Input System works. Is the new Input System still under development?
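For reference, the direct-polling workaround described above (reading the Gamepad class in Update, bypassing bound actions) looks roughly like this. A minimal sketch; the specific controls read here are illustrative, not from the thread:

```csharp
// Workaround described above: poll Gamepad.current directly in Update
// instead of going through Input System action bindings.
using UnityEngine;
using UnityEngine.InputSystem;

public class DirectGamepadRead : MonoBehaviour
{
    void Update()
    {
        var pad = Gamepad.current;
        if (pad == null)
            return; // no gamepad connected

        Vector2 stick = pad.leftStick.ReadValue();      // steering
        float throttle = pad.rightTrigger.ReadValue();  // accelerate
        bool brake = pad.buttonSouth.isPressed;         // brake

        // Feed these values into the car controller here.
    }
}
```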

Huh… that’s odd that it isn’t coming through the new input system. I think @DanMillerU3D is the one who tried this on our side. Dan, did you use the input system package when you tested gamepad support?

Can you tell me what type of experience you are building (windowed, VR, MR)? I was testing windowed mode using the latest URP simulator, and I am able to get gamepad input through the Input System package (new Input System) using bound gamepad references.

You mentioned the simulator setting, just confirming that you selected this option too.

We are converting an Oculus racing game. It is fully immersive (VR). Yes, I did everything you mentioned. I also just opened the sample scenes from the latest visionOS package and added the Input System there. I thought that maybe something in our game was interfering with the input system, but it didn’t work in the sample scene either. The controller is connected via Bluetooth.
I also checked the system preferences on the Mac to see whether Xcode can take input from external devices, and it can, so I’m not sure why the input isn’t recognized.

We got the input from Gamepad class in the simulator so things are kind of working, but couldn’t get the input with the new input system. We’ll continue on that path if the input system doesn’t work for us.

OK, coming back to you. I saw in your example that you used [Gamepad] for the inputs. In the new Input System we used everything but that one. I don’t know what I was thinking; it was obvious, since even in the code we used the Gamepad class. It was working in the editor, but not in the simulator, with [PlayStation Controller] or [DualSense Controller], so I assumed something was wrong. Binding to the generic [Gamepad] device worked perfectly.
Thanks guys for help!
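The fix described above can be sketched as follows: bind actions to the generic `<Gamepad>` layout rather than a controller-specific layout such as a DualSense one, so the binding resolves on whatever gamepad the simulator exposes. The action name and usage are illustrative:

```csharp
// Sketch of the fix above: use the generic <Gamepad> layout in the
// binding path instead of a controller-specific layout.
using UnityEngine;
using UnityEngine.InputSystem;

public class GenericGamepadActions : MonoBehaviour
{
    InputAction steer;

    void OnEnable()
    {
        // Generic layout: matches PS5, Xbox, etc., including in the simulator.
        steer = new InputAction("Steer", binding: "<Gamepad>/leftStick");
        steer.Enable();
    }

    void OnDisable() => steer.Disable();

    void Update()
    {
        Vector2 value = steer.ReadValue<Vector2>();
        // Use value.x for steering.
    }
}
```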


Hi, just want to follow up on this topic.

Our use case is simply the line below, but it didn’t work:
if (Input.GetKeyDown("a")) { /* do something */ }

such that we can trigger some functions and take snapshots from the visionOS Simulator for the app listing. (We could probably do the same in the Unity Editor, but the visionOS Simulator looks better :grin: )
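If the legacy Input manager isn’t receiving simulator key events, one thing worth trying is the Input System package’s keyboard device, which may be fed events through a different path. An untested sketch:

```csharp
// Untested alternative to Input.GetKeyDown: poll the Input System
// package's Keyboard device for the debug trigger key.
using UnityEngine;
using UnityEngine.InputSystem;

public class DebugKeyTrigger : MonoBehaviour
{
    void Update()
    {
        var kb = Keyboard.current;
        if (kb != null && kb.aKey.wasPressedThisFrame)
        {
            // Trigger the snapshot / fake-ARKit setup here.
            Debug.Log("A pressed");
        }
    }
}
```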

The reason we need the trigger is that in actual play the app requires ARKit data to proceed, but the visionOS Simulator doesn’t pass ARKit data, so we try to fake it. (And capturing snapshots from the actual device is even harder; I think you need to mirror it to a MacBook and take a screenshot there …)

So I’m just wondering: what would be the easiest way to trigger a specific function in the visionOS Simulator by pressing a key?

We’d also like to request that Input.GetKey or an equivalent be made to work on the Vision Pro, so that an attached Bluetooth keyboard could provide various test functions. We wouldn’t ship with those, but it would be very helpful during development to make sure things work before refitting everything to gaze, pinch, and more UI.