Audio only playing from right ear speaker in Vision Pro build

this is a bounded app

audio is only playing from the right ear, any idea why? I pretty much have default settings on the audio sources, the audio goes through the mixer, and I haven't adjusted any pan settings

P.S. I feel like I’m playtesting your broken tools for you. You should make PolySpatial part of free Unity until you fix all the brokenness; it’s one broken thing after another, costing me days if not weeks of effort just doing builds over and over to figure out what isn’t working on the Unity side


+1, it’s pretty not awesome to be charging us $2,000+ a year for a Pro license to build on AVP when things are still meaningfully busted


It looks like you’re hitting the same issue as others in this thread. I haven’t been able to replicate the issue on my end but we’re trying to find the root cause. Thank you for your patience as we work out these issues. We’re doing the best we can.

on further testing, it seems like if I put my app on the left side of my head, audio only comes out of the right speaker, and if I put it on the right side of my head, it only comes out of the left speaker

it seems like the spatial audio is swapping which ear it should come from

This is sounding more and more like a platform bug. The spatial audio in bounded apps isn’t under our control, and will always use the center of the volume (or maybe the middle of the bottom face?) as the source for audio. The unity spatial audio might be interfering with what visionOS is trying to do, but if it’s responding to the position of the window, that sounds unlikely. I’m importing your project from your bug report and I’ll see if I can replicate these issues.

on more testing, I’m not sure it actually correlates with app position. It seems totally random; I’m not sure when or why it switches, but it does sometimes switch

should I be turning spatial audio on, or anything like that? Right now my audio sources just use the default settings: 3D sound with logarithmic falloff

are there any system or platform settings I should change?
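For reference, here is roughly what the setup described above looks like if configured from script rather than the Inspector (an illustrative sketch with assumed names, not the actual project code):

```csharp
using UnityEngine;

// Illustrative sketch of the AudioSource settings described above:
// defaults everywhere, except fully 3D spatial blend with logarithmic rolloff.
[RequireComponent(typeof(AudioSource))]
public class DefaultSpatialAudio : MonoBehaviour
{
    void Awake()
    {
        var source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;                          // fully 3D
        source.rolloffMode = AudioRolloffMode.Logarithmic; // default rolloff curve
        source.panStereo = 0f;                             // stereo pan left at default
    }
}
```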

I posted a screenshot to the other thread showing an accessibility menu that lets you pan all system audio left and right. I doubt that’s responsible for what you’re seeing/hearing, but it’s the only thing I found that could be at fault. Are you able to try playing music/sound from Safari or other apps to see if they exhibit the same behavior? In that thread, I shared a file that contains an Xcode project for a simple Swift app that plays audio. Can you see if you hear any issues with that one? I can probably tweak it to use a volume if you want to try that.

:crossed_fingers: I hope I can repro the issue with your bug project. That will be the clearest path to figuring out a solution here.

Out of curiosity, do you have airpods paired to your vision pro? Anything else audio-related that’s not just “I took the device out of the box and put it on my head?”

no changes to audio settings, and audio coming from apps like Safari etc… is placed correctly in both ears

I just ran a build of your app and the audio sounds correct to me. I hear the background music come from the volume in the ear closest to it, and it sounds like it’s coming from the right direction as I move the volume around or turn my head. I’m on a device running visionOS 1.0.3. Can you let me know the specific OS version you’re using?

This can be found in Settings > General > About > visionOS Version. I need the build code next to the semantic version number, since multiple actual builds of the same 1.x version are displayed identically in most places.

Can you try building and running this app? If you hear the issues with this basic Swift app, it’s definitely a platform bug or a settings issue.

visionOS 1.0.3 (21N333)

Man, this is weird. I’m on the exact same OS version, running a build right from the Xcode project included in your bug report. I just noticed that folks in this thread are reporting that downgrading to com.unity.polyspatial.visionos 1.0.3 fixed their lack of audio. I wonder if it will fix your spatial audio issues, too? For the life of me, I can’t think of anything in package code that could change how audio works; it’s all internal engine code.
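For anyone trying the downgrade: pinning a package version is done in `Packages/manifest.json`. A minimal sketch of the relevant entry (your file will have many other dependencies alongside it):

```json
{
  "dependencies": {
    "com.unity.polyspatial.visionos": "1.0.3"
  }
}
```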

I’ll keep digging on this, but any more information you can provide will be super helpful. Please let me know if you’re able to try the Swift app I posted above, and what results you get.

I just tried your Swift build; it works fine

OK good to know. I’ll see if I can learn anything more from your Unity project. I don’t have much hope of fixing the issue if I can’t reproduce it, though. Something I noticed is that the test scene is using a different volume camera window configuration from the default. This is something we need to make more clear, but the default volume camera output configuration determines the “startup” configuration for your app. So when the default configuration in settings doesn’t match the one in the startup scene, we end up opening and closing a volume with the default configuration in quick succession when the app is opened. You don’t ever see this because the volumes fade in and out, but I wonder if that first volume could be “stealing” the audio and causing issues.

Can you try setting the default volume configuration in Project Settings > PolySpatial > Default Volume Camera Window config to the same asset that you use in the volume camera for the startup scene?

Also, unrelated to this audio issue, I noticed that your scene has an ARSession and you’ve set a hand tracking usage description. ARKit is not actually required for the gaze/pinch gesture to work, and furthermore it won’t work when you don’t have an ImmersiveSpace open (enabled in Unity by using an Unbounded volume camera). I don’t think this would affect the audio issue, but you should be able to remove the AR Session from your scene, disable Initialize Hand Tracking On Startup, and remove the usage description. In fact, you can even deactivate the Apple visionOS AR plugin/loader if you don’t intend to use ARKit features.

I think I finally have a repro! I still get sound out of both ears (if I cup my right ear, I do hear the music) but I’m pretty sure it’s louder in my left ear, and it’s not spatializing properly as I turn my head or reposition the volume. I’ll see if my suggestion above (sync default volume configuration with start-up scene) fixes it.

I think that worked? I’ll try running it again a few times, but the audio starts right up and seems to spatialize correctly.

I can confirm strange audio behaviour. I have an AudioSource on a GameObject somewhere in the scene with Spatial Blend on “3D”, but from both the panning and volume, it appears that the audio is emitted from another position in the scene than the GameObject’s position. (Or that the audio listener is not at the camera position… also possible of course).
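One quick way to check the “listener isn’t at the camera” hypothesis is to log the offset between the source and the active listener. A hypothetical debug helper (names are mine, not from any project in this thread), attached to the GameObject with the AudioSource:

```csharp
using UnityEngine;

// Hypothetical debug helper: logs this AudioSource's offset from the active
// AudioListener each frame, to check whether the listener is actually
// tracking the camera/head position.
public class ListenerOffsetDebug : MonoBehaviour
{
    AudioListener listener;

    void Start()
    {
        listener = FindObjectOfType<AudioListener>();
    }

    void Update()
    {
        if (listener == null) return;
        Vector3 offset = transform.position - listener.transform.position;
        Debug.Log($"AudioSource offset from listener: {offset}, distance: {offset.magnitude:F2}");
    }
}
```

If the logged distance looks right but the panning is still wrong, the problem is more likely in how the platform spatializes the volume rather than in the listener’s position.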

In my case, it solved the audio issue in the main menu scene, which is a bounded volume. But when I load the level scene (unbounded), the audio still gets muted.

I’m gonna try this on the level scene:

This solution works in every case in my game: