I wasn’t sure this belonged here, so if this isn’t the place for this question, feel free to move it; I apologize.
I’m working on a project where the lead developer wants something like a virtual desktop within a Unity scene. To expand on that, it would be a real computer within the game scene, like the person could access their email or open a web browser or a command prompt or something like that. Has anyone tried something like that? I’m just looking for some initial guidance since I’m not sure where to begin. Thanks!
If you’re going to manage the computer’s interface in Unity, the easiest solution is to use a 3D GUI package like NGUI, Daikon Forge GUI, or EZGUI. Just position that package’s interactive GUI on the computer model’s screen.
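Each of those packages has its own setup workflow, but the shared idea is a GUI hierarchy rendered in world space and aligned with the monitor mesh. As a rough, library-agnostic sketch of that alignment step (the `guiRoot` and `screenQuad` references here are my own assumptions, not any package’s API):

```csharp
using UnityEngine;

// Aligns a world-space GUI root with the front face of a monitor mesh.
// guiRoot would be the root of your NGUI/Daikon Forge/EZGUI hierarchy;
// screenQuad is a quad modeled flush with the monitor's glass.
public class AlignGuiToScreen : MonoBehaviour
{
    public Transform guiRoot;    // root of the 3D GUI hierarchy (assumption)
    public Transform screenQuad; // quad marking the monitor's screen surface

    void Start()
    {
        // Snap the GUI onto the screen surface, facing the same direction,
        // nudged slightly forward to avoid z-fighting with the screen mesh.
        guiRoot.position = screenQuad.position + screenQuad.forward * 0.001f;
        guiRoot.rotation = screenQuad.rotation;
        guiRoot.localScale = screenQuad.lossyScale;
    }
}
```

From there the package’s own event system (e.g. NGUI’s UICamera) handles clicks on the GUI widgets.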
You can do the same thing, with significantly more effort, by essentially recreating what these GUI systems do. Set up an orthographic camera that renders a specific layer. Outside of your regular scene, add quads, colliders, and the like, within this camera’s view area.
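A minimal sketch of that DIY approach might look like the following: an orthographic camera restricted to a dedicated layer, with clicks raycast against colliders on that layer. The layer name `"VirtualScreen"` and the `OnVirtualClick` message are assumptions for illustration.

```csharp
using UnityEngine;

// DIY version of a 3D GUI system: an orthographic camera renders only the
// "VirtualScreen" layer, and clicks are raycast against colliders there.
public class VirtualScreenCamera : MonoBehaviour
{
    Camera cam;

    void Start()
    {
        cam = GetComponent<Camera>();
        cam.orthographic = true;
        cam.orthographicSize = 5f;
        // Render only objects on the VirtualScreen layer (quads, labels, buttons).
        cam.cullingMask = 1 << LayerMask.NameToLayer("VirtualScreen");
        // Draw on top of the main camera's output.
        cam.clearFlags = CameraClearFlags.Depth;
        cam.depth = 1;
    }

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = cam.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
            {
                // Let the clicked widget decide what to do (hypothetical handler name).
                hit.collider.SendMessage("OnVirtualClick",
                    SendMessageOptions.DontRequireReceiver);
            }
        }
    }
}
```

This is essentially what the GUI packages above do for you, plus a lot of widget plumbing, which is why they’re the easier route.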
If you want to render an actual remote computer onto the in-scene computer model, you can use Unity Pro’s Render Textures. In this case, you’ll update a Unity texture with the contents of the remote computer’s screen. Input, on the other hand, depends on what kind of interface the remote computer expects. If it’s a text terminal, you could send all keyboard input to the remote computer. If it’s mouse input, you could translate Unity gameplay screen coordinates to coordinates on the computer model’s screen, and then send those to the remote computer.
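Here is one way both halves could be sketched: copy each remote frame into a `Texture2D` on the monitor mesh, and use a raycast hit’s UV coordinate to map a gameplay click onto the remote desktop. Note `RaycastHit.textureCoord` only works with a MeshCollider, and `FetchRemoteFrame`/`SendRemoteClick` are placeholders for whatever remote-desktop protocol (VNC, RDP, custom socket) you end up using.

```csharp
using UnityEngine;

// Displays a remote desktop on a monitor model and forwards clicks to it.
// FetchRemoteFrame and SendRemoteClick are stubs, not a real API.
public class RemoteScreen : MonoBehaviour
{
    public int remoteWidth = 1024, remoteHeight = 768;
    public Camera gameplayCamera;
    Texture2D screenTex;

    void Start()
    {
        screenTex = new Texture2D(remoteWidth, remoteHeight,
                                  TextureFormat.RGBA32, false);
        GetComponent<Renderer>().material.mainTexture = screenTex;
    }

    void Update()
    {
        // Copy the latest remote frame into the texture, if one arrived.
        Color32[] pixels = FetchRemoteFrame();
        if (pixels != null)
        {
            screenTex.SetPixels32(pixels);
            screenTex.Apply();
        }

        // Translate a gameplay click into remote-screen pixel coordinates.
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = gameplayCamera.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit) &&
                hit.collider.gameObject == gameObject)
            {
                // textureCoord requires a MeshCollider on the screen mesh.
                int x = (int)(hit.textureCoord.x * remoteWidth);
                int y = (int)((1f - hit.textureCoord.y) * remoteHeight); // remote origin is top-left
                SendRemoteClick(x, y);
            }
        }
    }

    Color32[] FetchRemoteFrame() { return null; } // stub: fill from your protocol client
    void SendRemoteClick(int x, int y) { }        // stub: forward to the remote machine
}
```

The V-flip on `y` reflects that Unity UVs put (0,0) at the bottom-left while most desktop protocols put (0,0) at the top-left; adjust to match whatever protocol you use.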