I am in the process of building a game that has quite a few GUI elements as part of the gameplay. The game screen has the following regions:
- A toolbox to drag objects from. These objects are animated sprites that are dragged and dropped during gameplay.
- A “play area” where the objects are dropped, manipulated, and animated before moving on to the next level.
- A configuration area for manipulating the properties of the objects dropped in the “play area”.
The main problems I foresee are:
- Animated sprites are created at runtime and added to the play area DURING gameplay, and these objects then become part of further gameplay.
- Support dragging and dropping objects (animated sprites) from one place on a scene to another place on the same scene, as well as moving, resizing, and changing their appearance.
- Runtime creation and placement (on the scene) of many complex GUI objects, such as: check boxes, edit boxes, combo boxes with dynamic contents, labels, image buttons, radio buttons, list boxes, HTML link buttons, multiline edit boxes, and menus with submenus.
- When an event happens to any of these runtime-created GUI elements, or to a dropped object, an event should be fired that can be handled in the game code. For example, when someone selects a check box, an event should be fired with all the checkbox-relevant info so that the game code can respond to that click (see the sketch after this list).
- Change the appearance of an animated sprite to make it look “inactive”, “active”, “selected”, etc. (a rough sketch of what I mean also follows the list).
- Zoom and pan the scene using a zoom button or a zoom-area-selection tool.
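
To illustrate the kind of event handling I have in mind for the runtime-created widgets, here is a rough sketch in plain C++ (not tied to any engine or library; the names `CheckBox`, `CheckBoxEvent`, and `onToggle` are made up purely for illustration):

```cpp
#include <functional>
#include <string>
#include <utility>
#include <vector>

// Hypothetical event payload carrying the checkbox-relevant info.
struct CheckBoxEvent {
    std::string id;   // which checkbox was clicked
    bool checked;     // its new state
};

// Hypothetical widget created at runtime that fires callbacks when clicked.
class CheckBox {
public:
    explicit CheckBox(std::string id) : id_(std::move(id)) {}

    // Game code subscribes a handler; several handlers can be attached.
    void onToggle(std::function<void(const CheckBoxEvent&)> handler) {
        handlers_.push_back(std::move(handler));
    }

    // Called by the GUI layer when the user clicks the box.
    void click() {
        checked_ = !checked_;
        CheckBoxEvent event{id_, checked_};
        for (const auto& handler : handlers_) {
            handler(event);  // fire the event into the game code
        }
    }

private:
    std::string id_;
    bool checked_ = false;
    std::vector<std::function<void(const CheckBoxEvent&)>> handlers_;
};
```

The idea is that when the game code creates the widget at runtime it would also attach a handler, e.g. `box.onToggle([](const CheckBoxEvent& e) { /* react to e.checked */ });`, and the same pattern would apply to the other widget types.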
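For the sprite appearance changes, something along these lines is what I am imagining (again only a sketch; `SpriteState` and `GameSprite` are hypothetical names, and the actual tinting/animation calls would come from whatever engine I end up using):

```cpp
// Hypothetical visual states for a sprite dropped into the play area.
enum class SpriteState { Inactive, Active, Selected };

class GameSprite {
public:
    void setState(SpriteState state) {
        state_ = state;
        // Swap the tint / animation depending on the new state.
        switch (state_) {
            case SpriteState::Inactive: /* e.g. grey tint, pause the animation */ break;
            case SpriteState::Active:   /* e.g. normal colours, run the animation */ break;
            case SpriteState::Selected: /* e.g. draw a highlight outline */ break;
        }
    }

    SpriteState state() const { return state_; }

private:
    SpriteState state_ = SpriteState::Inactive;
};
```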
Any ideas on the best approach/design/architecture for this game? Where should I do most of the coding/programming? Please ask for more information if needed.
Thanks a lot.