I am building an open-source head-rotation tracker with a Teensy board and a GY-85 IMU (accelerometer, gyroscope, and magnetometer). I am using Max/MSP to send the angles, either as Euler angles or as a quaternion. What I would like to do is use my head rotation to control the rotation of the player's camera, so I can look around as in VR but on a 2D display (screen/monitors).
To be honest, what I really need is to program the audio listener (or the appropriate script) to stay in sync with the head tracker. The aim of the project is to let you move your head to localize sounds in 3D more accurately, as we do in real life. Based on this idea, I expect a certain reduction in front-back confusion and improved accuracy for elevated sounds. At the moment it works with the camera; now I need to target the "ears" of the player so that they are controlled by the head tracker. I hope this makes sense and gives you an idea of what I am trying to achieve.
I did manage to drive the camera rotation with the head-tracker angles; however, I need to move only the "ears", i.e. rotate just the audio listener and not the camera, otherwise it feels very awkward.
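In case it helps to make the question concrete, here is a minimal sketch of what I mean, assuming a Unity setup (the component name `AudioListener`, the method name `OnHeadRotation`, and how the quaternion arrives from Max/MSP, e.g. over OSC, are all my assumptions, not a working implementation):

```csharp
// Sketch, assuming Unity: put the AudioListener on its own GameObject
// (a child of the player, separate from the camera) and drive ONLY its
// rotation from the head tracker, leaving the camera untouched.
using UnityEngine;

public class HeadTrackedListener : MonoBehaviour
{
    // Assign the Transform that carries the AudioListener component.
    // It must NOT be the camera, so the camera keeps its own rotation.
    public Transform listener;

    // Call this whenever a new quaternion arrives from Max/MSP
    // (the transport, e.g. an OSC receiver, is a separate concern).
    public void OnHeadRotation(Quaternion headRotation)
    {
        // Rotate only the "ears"; Unity's spatializer uses the
        // AudioListener's transform orientation for 3D panning.
        listener.rotation = headRotation;
    }
}
```

The idea would be to remove the AudioListener component from the Main Camera (Unity allows only one active AudioListener per scene) and attach it to this dedicated listener object instead.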
Thanks in advance,
Regards.