I am making a psychology experiment game using the Oculus Rift, and I need a GUI to control events in the game.
Besides creating two clients (the game and a control panel) and connecting them over LAN, is there a simpler way to achieve this?
Ideally I would have liked to render the game camera to the Oculus and render a second camera with the GUI to my laptop screen, but I have read that this is not possible.
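(For scale, the two-client LAN approach mentioned above can be very lightweight: the control panel only needs to send short command strings, and the game only needs to drain them once per frame. A minimal sketch in Python over UDP; the port number and the command name are arbitrary assumptions, and in a real setup the game side would live inside the engine's update loop:)

```python
import socket

PORT = 9999  # arbitrary choice of control port

# Game side: a non-blocking receiver, polled once per frame.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", PORT))
recv_sock.settimeout(0.0)  # don't block the render loop

def poll_commands():
    """Drain any pending control-panel commands; call each frame."""
    commands = []
    while True:
        try:
            data, _addr = recv_sock.recvfrom(1024)
        except (BlockingIOError, socket.timeout):
            break  # no more pending packets this frame
        commands.append(data.decode("utf-8"))
    return commands

# Control-panel side: fire a command at the game.
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(b"show_stimulus", ("127.0.0.1", PORT))
```

The control panel can be any program that can open a socket, so it never has to share screen space (or even a machine) with the game.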
If you intend to do it all in a single client, you would have to use a portion of the screen space for your control screen. I don't think there is any provision for sending only a portion of the screen to the Rift, so this probably wouldn't work. There may also be an issue with how the projection matrix size is determined, i.e. whether it is literally just 50% of the screen per camera or whether it adapts to how the screen is divided between your cameras. You'll have to experiment with the position/size options of the two cameras and see how the Rift reacts.
You could use the dark edges of the screen, which are almost entirely invisible to the wearer of the HMD, or simply script keyboard hotkeys to trigger specific events in the scene, so you wouldn't need any screen space at all.
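The hotkey route is engine-agnostic: at its core it is just a key-to-event dispatch table that you wire into whatever key-down callback your engine provides. A minimal sketch in Python, where the handler names (`start_trial`, `show_stimulus`) are hypothetical placeholders for real scene actions:

```python
# Record what happened so the experimenter's actions can be logged.
event_log = []

# Hypothetical experiment events; replace with real scene actions.
def start_trial():
    event_log.append("start_trial")

def show_stimulus():
    event_log.append("show_stimulus")

# Map keyboard keys to experiment events.
HOTKEYS = {
    "1": start_trial,
    "2": show_stimulus,
}

def on_key_down(key):
    """Wire this into the engine's key-down callback."""
    handler = HOTKEYS.get(key)
    if handler is not None:
        handler()

on_key_down("1")  # triggers start_trial
on_key_down("x")  # unmapped keys are silently ignored
```

A side benefit for a psychology experiment is that the same dispatch table gives you a timestamped event log for free, which is usually needed for the analysis anyway.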