What would you like in a `Unity 2D` game system?

I’m working on a set of 2D APIs and editors for Unity to make the creation of 2D games much easier. This system will eventually be available for purchase on the Unity Asset Store.

I would like to know what features you would like to see, especially more advanced/modern features.

So far the obvious areas included will be:
a) sprites
b) vector graphics
c) particle systems
d) tilemaps/scrolling
e) playfields/layers
f) drawing

It will also include some drawing tools/image editing tools.

Submit your ideas/wishes now so I can integrate them into the platform. :slight_smile:

g) box physics/collision system :wink:

So basically physics in 2D, right? Anything more specific physics-wise?

2D pathfinding and AI

isometric tiles
a real tilemap system with layers, not “stack up tilemaps and hope” (to allow optimizations such as early-out on hidden tiles, batching, etc.)
integrated tile atlas packaging based on tilemaps

Yes, pathfinding and AI tools would be useful. Some kind of easy-to-put-together, plug-and-play modular AI system.

Dreamora, can you clarify a bit about what you mean? Like reducing overdraw? Or a specific isometric scenario where you have some kind of vertical volumetric tile system?

The tile/sprite atlas thing is no problem.

How about shaders for 2D?

What kinda shaders would 2d need?

Potentially the same amount as a full 3D game… 2D doesn’t mean unlit cartoony necessarily :wink:

We’re developing our own 2D framework here, but I’ll tell you what shader I’d like to have: a shader with a diffuse map, normal map, and Z-depth map at 24- or 32-bit precision, enabling dynamic lighting of a whole prerendered scene mapped onto a plane. The shader would have to transform the relative position of the light into the texture’s “3D space”, apply normal mapping, and handle distance attenuation in that same texture space. Who said 2D has to be simple? :smile:
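To make the idea concrete, here is a minimal CPU-side sketch of the per-texel math such a shader would run; the `PrerenderedLighting` class and `Shade` signature are hypothetical, and a real implementation would of course live in a fragment shader sampling the three maps:

```csharp
using UnityEngine;

public static class PrerenderedLighting
{
    // albedo    : sample from the diffuse map
    // normalTS  : sample from the normal map, already unpacked to [-1, 1]
    // depth01   : sample from the 24/32-bit depth map (0 = near, 1 = far)
    // lightPosTS: light position transformed into the texture's "3D space"
    public static Color Shade(Color albedo, Vector3 normalTS, Vector2 uv,
                              float depth01, Vector3 lightPosTS,
                              float lightRange, Color lightColor)
    {
        // Reconstruct the texel's position in texture space:
        // x/y from the UV, z from the prerendered depth map.
        Vector3 texelPos = new Vector3(uv.x, uv.y, depth01);

        // Vector from the texel to the light, in that same space.
        Vector3 toLight = lightPosTS - texelPos;
        float dist = Mathf.Max(toLight.magnitude, 1e-5f);

        // Standard N·L diffuse term plus a simple linear distance attenuation.
        float ndotl = Mathf.Max(0f, Vector3.Dot(normalTS.normalized, toLight / dist));
        float atten = Mathf.Clamp01(1f - dist / lightRange);

        return albedo * lightColor * ndotl * atten;
    }
}
```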

I thought Unity would finally integrate a 2D engine for V3, which would definitely be one of the more reasonable features. Does it make sense to come up with your own version, then?

It really depends on what they do with it. Given the direction I want to take, I’m fairly sure Unity is not going to do the same thing. They told me they are working on much-improved 2D tools, but I’m not holding my breath for a truly next-generation implementation.

I think there are some interesting things you can do with shaders and 2D. Some of my ideas I’m keeping secret for now.

When you spoke of a shader with diffuse/normal/depth, do you mean pretending that you’re looking at a static view of a 3D scene and being able to move the lights around?

Considering how many games are still 2D, the importance of 2D, and how much time it took them to recognize/care about this, I would be almost baffled if they came up with some crippleware. I mean, the GUI alone needs a performance boost. Obviously the success of your module relies on what they come up with, and when.

Some shader stuff like post-processing or glow would be nice, but before that I would like to see the basics covered in an elegant way: drawing primitives (lines, rects, polys, …), setting object colors, drawing textures/vectors/text(fields), blend modes, collision tests (sprites, vectors), fast 2D physics like Box2D, playfields, sprite ordering/swapping, animations, tilemaps, fast pixmap support, pathfinding & steering behaviors (nice to have), …

Yup, that’s what I mean: prerendered CG with dynamic lighting. Though I’m quite sure it doesn’t register as a priority for the vast majority of people interested in a 2D framework ^^

By reducing overdraw I mean that if the tile system detects an opaque tile on a layer drawn above, the tile below is not generated in the mesh data at all, so it is never sent to the 3D API driver. It’s a basic optimization that has been around longer than most game devs who call themselves such nowadays :slight_smile: (X-COM / UFO: Enemy Unknown and Transport Tycoon did it back in the Intel 386 DOS days).
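In code, that early-out might look something like this minimal sketch; the `Tile` struct and `emitQuad` callback are hypothetical, just to show the idea:

```csharp
using System;

public struct Tile
{
    public int id;
    public bool empty;
    public bool opaque;   // true if the tile's texture has no transparent pixels
}

public static class TileCulling
{
    // tiles[layer, x, y]; layer 0 is the bottom layer, higher indices draw on top.
    public static void BuildMeshData(Tile[,,] tiles, Action<int, int, int> emitQuad)
    {
        int layers = tiles.GetLength(0);
        int w = tiles.GetLength(1);
        int h = tiles.GetLength(2);

        for (int layer = 0; layer < layers; layer++)
        for (int x = 0; x < w; x++)
        for (int y = 0; y < h; y++)
        {
            if (tiles[layer, x, y].empty)
                continue;

            // Early out: if any higher layer holds an opaque tile in this cell,
            // this tile can never be visible, so no geometry is emitted for it
            // and nothing is sent to the 3D API driver.
            bool covered = false;
            for (int above = layer + 1; above < layers && !covered; above++)
                covered = !tiles[above, x, y].empty && tiles[above, x, y].opaque;

            if (!covered)
                emitQuad(layer, x, y);   // add this tile's quad to the batched mesh
        }
    }
}
```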

As for shaders: basic ones should be present. By basic I mean normal mapping and “visible volume” lights (fake volumetric light and shadows through penumbra-extrusion shadows, as described in the corresponding Gamasutra article from 2002, or similar).

It would be nice if we could drag/drop UI elements for the game HUD. Something like what GameSalad is doing now.

These are all good ideas.

Even the idea of a pseudo-fixed 3D scene with depth and lighting etc. is something I’ve considered. There are definitely some interesting possibilities with shaders, not just the obvious normal mapping: for example, GPU-driven particle systems.

I think editors are a key part of making 2D development much easier. I’m not sure I can really visualize how Unity would implement significant changes/additions to the current editor to provide a useful and focused workflow for 2D games, and how to blend that in with the 3D features. That’s why I’m aiming for more of a focused, 2D-centric solution.

  • Resize/drag/pixel-fit of 2D elements on screen, in the editor, would be good.
  • Also a way to have layers.
  • Alignment based on an anchor, but also relative to other 2D elements (for example, box A is stuck to the right of box B).
  • Absolute size (a 2D box covers 1/2 of the screen) or pixel-perfect size (a 2D box covers 250 pixels). See the sketch after this list.

That way you can build an interface that adapts itself to any resolution and can still be pixel perfect.

  • Of course the basic list/button/menu/grid etc. containers.
  • And everything offered by the excellent iTween, directly included in the engine (much faster than C#).
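A rough sketch of the relative anchoring and the two sizing modes mentioned in the list above; `Box2DElement` and its methods are made-up names, purely to illustrate the behaviour:

```csharp
using UnityEngine;

public class Box2DElement
{
    public Rect pixelRect;

    // Absolute (fractional) size: the box covers a fraction of the current screen,
    // e.g. FractionOfScreen(0f, 0f, 0.5f, 1f) covers the left half at any resolution.
    public static Rect FractionOfScreen(float xFrac, float yFrac, float wFrac, float hFrac)
    {
        return new Rect(xFrac * Screen.width, yFrac * Screen.height,
                        wFrac * Screen.width, hFrac * Screen.height);
    }

    // Pixel-perfect size: the box covers an exact pixel count regardless of resolution.
    public static Rect Pixels(float x, float y, float widthPixels, float heightPixels)
    {
        return new Rect(x, y, widthPixels, heightPixels);
    }

    // "Box A is stuck to the right of box B": anchor this element to another's right edge.
    public void StickToRightOf(Box2DElement other, float gapPixels)
    {
        pixelRect.x = other.pixelRect.xMax + gapPixels;
        pixelRect.y = other.pixelRect.y;
    }
}
```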

This sounds very interesting. What’s your general approach? Simply set a resolution in code and call DrawSprite(x,y,…), DrawText(x,y,…) etc.? Or some sort of 2D in true 3D with visual editors? Personally, I would prefer to be able to set an inner resolution and work in my code with fixed pixel coordinates. The user can of course select his own resolution; a projection matrix takes care of everything behind the scenes. Plus the user can choose whether the result on screen is stretched or letter/pillar-boxed.
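For the stretched vs. letter/pillar-boxed choice, one possible sketch (assuming a fixed inner resolution of 1024×768 as an example; the component name is made up) adjusts the camera viewport rect so the inner aspect ratio is preserved:

```csharp
using UnityEngine;

// Attach to the camera that renders the 2D scene.
public class VirtualResolution : MonoBehaviour
{
    public float virtualWidth = 1024f;
    public float virtualHeight = 768f;
    public bool stretch = false;   // true = fill the window, false = letter/pillar-box

    void Start()
    {
        Camera cam = GetComponent<Camera>();

        if (stretch)
        {
            cam.rect = new Rect(0f, 0f, 1f, 1f);   // just stretch to the window
            return;
        }

        float targetAspect = virtualWidth / virtualHeight;
        float windowAspect = (float)Screen.width / Screen.height;
        float scale = windowAspect / targetAspect;

        if (scale < 1f)
        {
            // Window is narrower than the inner aspect: letterbox top and bottom.
            cam.rect = new Rect(0f, (1f - scale) / 2f, 1f, scale);
        }
        else
        {
            // Window is wider than the inner aspect: pillarbox left and right.
            float inv = 1f / scale;
            cam.rect = new Rect((1f - inv) / 2f, 0f, inv, 1f);
        }
    }
}
```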

Text formatting (using different letter sizes and colors in one string, e.g. in tooltips) is very important for me. Bitmap font support would be nice, too.

+1 for the advanced shaders simulating pre-rendered CG with dynamic lighting

Yes, there are two different ways to do coordinates. One is to treat the screen the way current 3D Unity does, where you specify a range of coordinates that fits into the vertical (or horizontal) dimension of the screen. Then, no matter what resolution you run at, objects will always be proportionately the same distance from the edges of the screen, which supports scalable GUI elements and true resolution independence. One issue with this is that textures will stretch and objects will not always be positioned at exact pixel coordinates. One possible enhancement is a hybrid between resolution independence and resolution dependence, where you position pixel-dimension objects at resolution-independent coordinates, perhaps with a few sets of graphics at different resolutions, so that you can get a proper 1:1 pixel mapping with some sense of scalability.

The other way is to throw resolution independence out the window and use pixel coordinates, which makes your graphics shrink at higher resolutions.

I would also like some kind of monitor-size calibration: you say, in real-world terms, “I want my button to be 2 inches wide”, the user inputs the size of their monitor, and the system calculates, based on the resolution, how many pixels are needed to make the button 2 inches wide.
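The calculation itself is just a ratio. A minimal sketch, assuming the user has typed in their monitor’s physical width (the class and method names are made up):

```csharp
using UnityEngine;

public static class PhysicalSize
{
    // monitorWidthInches comes from user input (measured across the screen).
    public static float InchesToPixels(float inches, float monitorWidthInches)
    {
        // Pixels per inch across the display, derived from the current
        // horizontal resolution and the physical width the user entered.
        float pixelsPerInch = Screen.width / monitorWidthInches;
        return inches * pixelsPerInch;
    }
}

// Example: on a display 1920 pixels and 20 inches wide,
// InchesToPixels(2f, 20f) = 2 * (1920 / 20) = 192 pixels for a 2-inch button.
```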

In terms of implementation I’m looking mainly at a 3-pronged approach.

  1. External visual editor application which can produce standalone runtimes to run on OSX/Windows and in the webplayer.
  2. In-Unity scripting API which also includes in-game runtime editors/tools, where Unity then builds the final application.
  3. Unity Editor enhancements.

Thank you for the details.

Considering coordinates and resolution independence, the way Unity handles the GUI code is good enough for me: you choose an ideal pixel resolution (a virtual resolution), work in pixels to position the GUI.Labels within that ideal resolution, and then set up GUI.matrix to scale everything to fit the user’s resolution. Instead of scaling the individual elements, scaling the final screen via a projection matrix can be more desirable. What I wanted to underline is the working-in-pixel-space part (no working in 0–1 screen space, real 3D world coordinates, etc.). If you can come up with clever tricks for using multiple sets of graphics, I see the usefulness (e.g. fonts: it’s easy to generate hi-res sets, and they often lack sharpness when upscaled).
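For reference, the GUI.matrix approach described here is essentially one line per frame; a minimal sketch against an assumed 1024×768 virtual resolution:

```csharp
using UnityEngine;

// Lays out GUI against a 1024x768 virtual resolution and scales it
// to whatever resolution the player actually runs at.
public class VirtualGui : MonoBehaviour
{
    const float VirtualWidth = 1024f;
    const float VirtualHeight = 768f;

    void OnGUI()
    {
        // Scale every subsequent GUI call from virtual pixels to real pixels.
        GUI.matrix = Matrix4x4.TRS(
            Vector3.zero,
            Quaternion.identity,
            new Vector3(Screen.width / VirtualWidth, Screen.height / VirtualHeight, 1f));

        // Positioned in virtual pixels; lands in the same relative spot at any resolution.
        GUI.Label(new Rect(462f, 20f, 100f, 30f), "Score: 0");
    }
}
```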

Implementation point 1) is a little confusing: why would I need an external editor to build the game? And if the external editor is useful and has some key game-building functionality, then lacking the ability to build for all devices (iOS, etc.) is a downside. But maybe I just got it wrong and you can do the same things with 2), build from Unity to iOS, and use 1) just to be more effective when building for desktop/web.