Hi, I'm currently working on a project that has taught me how to build and deploy a working Oculus app. It's pretty much just a theatre application.
I've been handed a TBE file, which to my understanding is an Ambisonic file type for spatial sound, and I've been asked to deploy it in my environment so the spatial audio syncs up with the video. I was told the audio was recorded at the same time as the video using a 3D spatial microphone.
I've been told that if I can correctly interface the audio file with Unity, the TBE file should be plug and play, with audio cues like a person walking by rendered in the correct position.
Any breadcrumbs on how to implement this? The people who built the TBE file aren't far away, so if the suggestion is to change the file format, that's an option.
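In case it helps frame the question, here's roughly what I imagine the integration script would look like, based on my reading that .tbe files are decoded by the Facebook 360 (Audio360) Rendering SDK's Unity integration. The class and method names here (TBE.SpatDecoderFile, open, setSyncMode, setExternalClockInMs, the StreamingAssets path) are my guesses from the docs rather than verified API, so please treat this as a sketch and correct me if the real package works differently:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch: keep a .tbe spatial audio mix in sync with a VideoPlayer.
// Assumes the FB360/Audio360 Unity package is imported and exposes a
// TBE.SpatDecoderFile component -- names below are my guesses from the
// Audio360 docs, not verified against the actual API.
public class TBEVideoSync : MonoBehaviour
{
    public VideoPlayer videoPlayer;                 // the 360 video in the theatre scene
    public TBE.SpatDecoderFile spatDecoder;         // component from the Audio360 prefab (assumed)
    public string tbeFileName = "theatre_mix.tbe";  // placed in StreamingAssets (assumed layout)

    void Start()
    {
        // Open the spatial audio mix (assumed API).
        spatDecoder.open(System.IO.Path.Combine(Application.streamingAssetsPath, tbeFileName));

        // Let an external clock (the video) drive the audio timeline (assumed API).
        spatDecoder.setSyncMode(TBE.SyncMode.EXTERNAL);

        videoPlayer.Play();
        spatDecoder.play();
    }

    void Update()
    {
        // Feed the video time to the decoder each frame so audio follows video (assumed API).
        spatDecoder.setExternalClockInMs(videoPlayer.time * 1000.0);
    }
}
```

Is this the right general shape, or is there a more standard way to bind the TBE playback to the video clock?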