3D Menus In-Game

I’m trying to figure out the logic behind a mechanic I’m interested in toying with; perhaps someone is familiar with it. Quake 4 had a minor mechanic where the player could interact with consoles throughout the game world without breaking out of FPS control: the crosshair would change to a mouse cursor, and the player could select options on the computer model’s GUI to do general things like open doors, turn things on, etc.

Anyone have any ideas on how this might work?

I believe the EZGui library (I think that’s what it’s called) offers this functionality out of the box. If you want to do it yourself, though, it shouldn’t be too hard.

I can think of at least a couple of approaches you could take to this:

  1. Make the interactive in-game controls individual game objects with colliders, and use physics raycasting to determine which of them (if any) the player has clicked (see the first sketch after this list).

  2. Place the in-game controls on a polygon or other surface. Raycast against this surface and perform an inverse transform to yield a 2D local-space pick point. Test the pick point against the local-space bounds of the controls on the surface (which you would have to define), and take the appropriate action based on which control is clicked (see the second sketch below).
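
Here’s a rough sketch of how approach 1 might look in Unity C#. The ConsoleButton component, its Activate() method, and the maxUseDistance value are placeholder names I’m using for illustration, not part of any library; the only real API calls are Camera.ScreenPointToRay, Physics.Raycast, and GetComponent.

```csharp
using UnityEngine;

// Placeholder component attached to each interactive control object.
public class ConsoleButton : MonoBehaviour
{
    public string action = "OpenDoor";   // placeholder action name

    public void Activate()
    {
        Debug.Log("Console button activated: " + action);
        // ...trigger the door, switch, etc. here
    }
}

// Attach this to the player/camera; it picks buttons with a physics raycast.
public class ConsolePicker : MonoBehaviour
{
    public float maxUseDistance = 3f;    // how far the player can "reach"

    void Update()
    {
        if (!Input.GetMouseButtonDown(0))
            return;

        // Cast a ray from the camera through the crosshair/cursor position.
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, maxUseDistance))
        {
            // If the ray hit one of the interactive controls, activate it.
            ConsoleButton button = hit.collider.GetComponent<ConsoleButton>();
            if (button != null)
                button.Activate();
        }
    }
}
```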
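
And a rough sketch of approach 2, assuming the console face is a single quad whose collider lies in its local XY plane. The ControlRegion class and its fields are placeholders for however you choose to define the control bounds.

```csharp
using UnityEngine;

// Attach this to the console surface object (the one with the collider).
public class ConsoleSurface : MonoBehaviour
{
    // Placeholder control definition: a name plus a local-space rectangle.
    [System.Serializable]
    public class ControlRegion
    {
        public string name;
        public Rect localBounds;   // bounds in the surface's local XY plane
    }

    public ControlRegion[] controls = new ControlRegion[0];
    public float maxUseDistance = 3f;

    void Update()
    {
        if (!Input.GetMouseButtonDown(0))
            return;

        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit;

        // Only proceed if the ray hit this surface's collider.
        if (!Physics.Raycast(ray, out hit, maxUseDistance) ||
            hit.collider.transform != transform)
            return;

        // Inverse-transform the world-space hit point into local space;
        // for a quad in its local XY plane, x and y form the 2D pick point.
        Vector3 local = transform.InverseTransformPoint(hit.point);
        Vector2 pick = new Vector2(local.x, local.y);

        // Test the pick point against each control's local-space bounds.
        foreach (ControlRegion control in controls)
        {
            if (control.localBounds.Contains(pick))
            {
                Debug.Log("Clicked control: " + control.name);
                // ...take the appropriate action for this control
            }
        }
    }
}
```

If the surface uses a MeshCollider, hit.textureCoord would give you UV coordinates directly, which is another way to get a 2D pick point without the inverse transform.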

Thanks, Jesse.

I bought the library, and I’ll give it a whirl.

EDIT:
I also stumbled on this thread, which covers the same concept.