A few general shader questions for iPhone development

  • I’m curious about what kind of shader features the iPhone supports. Can we do things like:

  • Dynamic lighting (per vertex)? If so, is there a limit on the number of dynamic lights on screen?

  • Vertex animation on meshes? E.g. swaying flags, banners, cloth, etc. Is this done as mesh morphing or through the shader?

  • Alpha test transparency for materials (1-bit alpha)? Or do we have to use old-school palettized, indexed-color-0 transparency?

  • I plan to use environment mapping to fake reflections (I was thinking old-school spherical environment mapping; does anyone happen to have such a shader?)

And finally…

  • What would be a good average number of vertices on the screen?

Any response would be greatly appreciated :slight_smile:

Yes.

I actually don’t know if there is a hard limit; I don’t think so, but your framerate will plummet to uselessness before you’d hit it, if it exists.

You can’t do it in the shader (pre-3GS is fixed function only, and OpenGL 2.0 capability for the 3GS has not yet been enabled in Unity), or with morph targets. You have to use bones. Which is fine, because morph targets would surely kill an iPhone game.

There is no option for 1-bit alpha. Alpha testing is a little more dynamic than that:
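Something like this, in fixed-function ShaderLab (just a minimal sketch; the shader name and cutoff property are my own):

```
Shader "Sketch/FixedFunctionCutout" {
    Properties {
        _MainTex ("Base (RGB) Trans (A)", 2D) = "white" {}
        // Adjustable threshold, rather than a hardwired 1-bit cutoff
        _Cutoff ("Alpha cutoff", Range(0, 1)) = 0.5
    }
    SubShader {
        Tags { "Queue" = "AlphaTest" }
        Pass {
            // Discard fragments whose alpha doesn't exceed the cutoff
            AlphaTest Greater [_Cutoff]
            SetTexture [_MainTex] { combine texture * primary }
        }
    }
}
```

You get a tweakable slider instead of a fixed on/off threshold, so it’s more flexible than old-school 1-bit transparency.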

However, alpha testing is actually particularly slow on the iPhone; it’s a quirk of the hardware. Alpha blending is faster. Use it whenever possible instead.
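The blended equivalent is about as simple (again, a sketch with assumed names):

```
Shader "Sketch/FixedFunctionAlphaBlend" {
    Properties {
        _MainTex ("Base (RGB) Trans (A)", 2D) = "white" {}
    }
    SubShader {
        Tags { "Queue" = "Transparent" }
        Pass {
            // Standard alpha blending; the iPhone's hardware handles
            // this much better than alpha testing
            Blend SrcAlpha OneMinusSrcAlpha
            ZWrite Off
            SetTexture [_MainTex] { combine texture * primary }
        }
    }
}
```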

I don’t even know what that means, so I doubt Unity even does anything like that.

There’s one in the iPhone Standard Assets, “iPhone Lightmap Reflective”, that may be up your alley. With it, you can use the alpha channel of either a texture or the vertex colors to control reflectivity, and it uses spherical environment mapping.
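If you’d rather roll your own, the fixed-function version is short; roughly this (a sketch, with made-up property names):

```
Shader "Sketch/SphereMapReflective" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
        // TexGen SphereMap makes the hardware generate the UVs
        // from the view-space normal, old-school style
        _ReflectTex ("Reflection (sphere map)", 2D) = "gray" { TexGen SphereMap }
    }
    SubShader {
        Pass {
            SetTexture [_MainTex] { combine texture * primary }
            // Add the fake reflection on top of the lit base texture
            SetTexture [_ReflectTex] { combine texture + previous }
        }
    }
}
```

The built-in one also scales the reflection by an alpha channel; this stripped-down version just adds it at full strength.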

I don’t even bother with this. Some people will toss in their numbers, which I’ve seen range from 7,000 to 20,000 or so. But I don’t think it’s worth considering (much). The general rule of thumb is: use as few as possible, because with the stuff you’re talking about, you’re going to run out of horsepower quickly.

Awesome, thanks for the reply Jessy. That more or less cleared up everything I was pondering about. :slight_smile:

Interesting to know that full alpha blending actually is faster than alpha testing on the iPhone. :slight_smile:

Overall I’m happy to hear that it seems better than I expected.

Oh actually, here are a couple more… :slight_smile:

  • How much texture/graphics memory is available?

I’m doing an arena-based game. My plan is to use multiple tileable textures, so that I can re-use them between arenas.

  • Average texture size? 128x128 a good size?

  • Or… would it be smarter to unwrap the whole arena into fewer but larger texture pages/materials (fewer draw calls)?

The Unity manual seems, to me, to suggest however much of 128/256 MB is left, after resources have been allocated for everything else, but this thread seems to place the first-gen devices at 24 MB:

http://forum.unity3d.com/viewtopic.php?t=29884

In actuality, 24 MB might be more than is really available anyway. Apple isn’t too good about clearing out the memory on these things, and they’ve disallowed anyone else from making apps that do it instead. (Fortunately, I grabbed one of those apps before that happened. :P)

Pre-3GS allows up to 1024x1024, 3GS allows for 2048x2048. Personally, I haven’t dropped below 256 on anything, because I can’t stand the extreme ugliness, but you can also get really close to the objects in my current game. If your texture is low-contrast, you can get away with using PVRTC 2bpp compression. The dithering on the alpha channel tends to be rather unusable for RGBA, though, in my experience.

I’d probably have to see a mockup of what you’re talking about to make a good recommendation. Here’s a tip, though: if you can use the same texture and shader on different meshes, do it. (In other words, one material for multiple objects, to make batching happen.) If you have iPhone Advanced, Static Batching will work awesome with that. Dynamic Batching will also help, but not to the same degree.

I don’t think I’d go about designing the game around this, per se, but it’s nice to have in the back of your mind: “I’m going to focus on having a color scheme, not only because it makes for good art, but also because it means I’ll be able to take better advantage of batching.” So make the models, texture them (roughly) at high res, and then, when you figure out how much resolution you need after a little play testing, combine your like-colored textures into a big sheet, if you don’t need the full 1024. The Photoshop (or GIMP, etc.) file will be bigger than necessary, but Unity automatically samples it down, as you choose. Put the details in at a level that actually makes sense for what will be used in-game.

Pre-3GS, you can use two textures per draw call. However, if you’re blending colors in a 1-bit alpha, blocky type of way, you might as well just use two meshes. I haven’t seen any evidence that it’s worth combining things just to save a draw call, if the shader has to do the blending work that could be avoided. The 3GS can blend up to 8 textures, though, in a single pass, which is pretty cool, if impractical. It is enough of a difference that it can be worth writing a SubShader for each class of device, though:

http://www.unifycommunity.com/wiki/index.php?title=Blend_2_Textures_by_Lightmap_Alpha

http://answers.unity3d.com/questions/1773/whats-faster-two-draw-calls-or-a-black-white-alpha-blend
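The structure is just multiple SubShaders in one file; Unity runs the first one the hardware supports. A rough sketch of the idea from that first link (made-up property names; see the wiki page for the real thing):

```
Shader "Sketch/PerDeviceTextureBlend" {
    Properties {
        _TexA ("Texture A", 2D) = "white" {}
        _TexB ("Texture B", 2D) = "white" {}
        _Mask ("Blend mask (A)", 2D) = "white" {}
    }
    // Tried first: all three textures in one pass. Only hardware with
    // enough texture units (e.g. the 3GS) can run this SubShader.
    SubShader {
        Pass {
            SetTexture [_Mask] { combine texture }
            // RGB from Texture A, alpha carried over from the mask
            SetTexture [_TexA] { combine texture, previous }
            // Lerp between A and B by the mask's alpha
            SetTexture [_TexB] { combine texture lerp (previous) previous }
        }
    }
    // Fallback for pre-3GS (two texture units): same result in two
    // passes, i.e. effectively an extra draw call's worth of work.
    SubShader {
        Pass {
            SetTexture [_TexA] { combine texture }
        }
        Pass {
            Blend SrcAlpha OneMinusSrcAlpha
            SetTexture [_Mask] { combine texture }
            // RGB from Texture B; the mask's alpha drives the blend
            SetTexture [_TexB] { combine texture, previous }
        }
    }
}
```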

Each new vertex light affecting the object basically gives you a new pass, so the answer is: you’d better stop at 1.

Bone-skinned animation, and it’s done through the VFP (ARM’s vector floating-point unit).

What Jessy mentions is only partially true. I’ve been toying with the softbody script from the wiki, for example, on my 1st-generation iPod touch, and it’s still reasonably fluid in the falling cloth scene. It’s clear you can’t use it to add fancy eye candy just for fun, but depending on what you need, it might still be an option.

Cutout shaders are the magic thing here.
But be aware that the iPhone is much faster at alpha blending than at alpha testing, as the hardware is optimized for early Z tests to reduce overdraw, and alpha testing just doesn’t fit into that.

10k to 14k vertices/triangles.
What you normally hit far earlier is the draw-call budget, though, which is <= 32 on pre-3GS.

And yeah, the pre-3GS devices (including the 2nd-generation iPod touch) have 24 MB of VRAM, so about 22 MB is available after the screen buffers; the 3GS can basically use the whole RAM as VRAM if needed.

What do you mean? That it’s just slow to add multiple light calculations?

(Unlike with pixel lights, you don’t get a new draw call when you add another light.)

A test with a bunch of objects and 4 lights shows only a minor difference compared to 1 light. As Jessy said, we’re not talking pixel lights here.

–Eric

Sorry, I should have mentioned: that’s if you don’t enforce vertex lighting on the light sources.

And the impact of vertex lights commonly still isn’t zero, but with low-poly objects it’s not obvious. (To make vertex lighting not look ugly, you need enough polygons to get acceptable light distribution and interpolation.)

There isn’t much choice on the iPhone…it’s vertex lighting or nothing. You only need to care about having lots of polygons if you want to make something like spotlights or point lights; otherwise standard low-poly models are fine with directional lights, which are fastest anyway.

–Eric

Obviously I’m making sure the topology is built in a way that it works well with vertex lighting. The topology will be “grid based” to make sure the vertex light distribution is even.

When it comes to dynamic lights, I plan to use two lights at most: one directional or point light for the environment, and another point light reserved for visual effects.