Scaling your game features depending on Android model

Hello,

We have a game that adapts its features (number of objects on screen, particle systems, etc.) depending on how powerful the device is.

Doing this on iOS is easy because there’s the iPhone.generation flag.

But on Android there's no such flag. What's the best approach, then?

  • Everything under the SystemInfo structure seems too coarse to be useful.
  • SystemInfo.graphicsPixelFillrate could be a good alternative, but it always returns -1.
  • Should we implement a pixel fillrate + CPU MHz calculator ourselves? I've tried to find something already done (on the wiki and in the Asset Store), but I can't seem to find anything. If well implemented, this could be a good product, right?
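For what it's worth, a rough device tier can be cobbled together from the SystemInfo fields that do report real values on Android (unlike graphicsPixelFillrate). This is only a sketch: the scoring and the thresholds are made up and would need tuning per title.

```csharp
using UnityEngine;

public static class DeviceTier
{
    public enum Tier { Low, Medium, High }

    // Rough heuristic built from SystemInfo; the thresholds below are
    // invented examples, not recommended values.
    public static Tier Estimate()
    {
        int score = 0;
        if (SystemInfo.processorCount >= 4) score++;
        if (SystemInfo.systemMemorySize >= 1024) score++;   // MB of RAM
        if (SystemInfo.graphicsMemorySize >= 256) score++;  // MB of VRAM
        if (SystemInfo.graphicsShaderLevel >= 30) score++;  // shader model 3.0+

        if (score >= 3) return Tier.High;
        if (score >= 2) return Tier.Medium;
        return Tier.Low;
    }
}
```

It's still coarse, as noted above, but it gives you a stable bucket to hang feature toggles on without any runtime measurement.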

fffsssssssssuuuuuuu… rolling tumbleweed…

Is anyone interested in the fillrate + CPU calculator at least?

Honestly, the Android board is much slower and less active than the other boards, such as the scripting board. Just an FYI: if you really want others' opinions, you can post there as well.

Something that might be a reasonably quick approach is to let the device run a set of scenes the first time the game runs, tracking the FPS, and then make a judgement from that. At least that would give you a start on performance measurement, as I know of no other reliable way to get this info from within Unity. But I'm not a really experienced Android dev either :slight_smile:
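The first-run benchmark above could be sketched roughly like this (the class name and the PlayerPrefs key are made up for illustration, and the 3-second sample window is arbitrary):

```csharp
using System.Collections;
using UnityEngine;

// Sketch of the idea: on first run, measure average FPS over a few seconds
// of a representative scene and cache the result for later runs.
public class StartupBenchmark : MonoBehaviour
{
    const string Key = "benchmarkFps"; // hypothetical PlayerPrefs key

    IEnumerator Start()
    {
        if (PlayerPrefs.HasKey(Key))
            yield break; // already measured on a previous run

        int frames = 0;
        float elapsed = 0f;
        while (elapsed < 3f) // sample for roughly 3 seconds
        {
            frames++;
            elapsed += Time.deltaTime;
            yield return null; // wait one rendered frame
        }

        float avgFps = frames / elapsed;
        PlayerPrefs.SetFloat(Key, avgFps);
        PlayerPrefs.Save();
    }
}
```

Attach it to an object in the scene you want to measure; on later launches the cached value can be read back with PlayerPrefs.GetFloat and mapped to whatever quality buckets you like.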

MDragon, thanks :wink:

Annihlator, yeah, that sounds like a good idea. We might show a company or game logo for 2 or 3 seconds, only the first time you play, designed to be slow on slow devices, and measure the average FPS from it.

Or make it user-definable, which is what PC games do. Have option sliders for detail, special FX, whatever. It's the easiest way, and it lets the user choose eye candy vs. smoothness. My interest is in how to swap in better shaders on better hardware.
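Both halves of that (a user-facing quality option plus shader swapping) map onto existing Unity calls: QualitySettings.SetQualityLevel for the global preset, and assigning a different Shader to the materials for the eye-candy swap. A minimal sketch, where the fast/fancy split and the field names are hypothetical:

```csharp
using UnityEngine;

// Sketch: apply a user-chosen quality level and swap materials between a
// cheap and an expensive shader. Which shaders those are is up to you.
public class QualitySwitcher : MonoBehaviour
{
    public Renderer[] renderers;  // assign in the Inspector
    public Shader fastShader;     // e.g. a vertex-lit variant
    public Shader fancyShader;    // e.g. a per-pixel / bumped variant

    // Call this from your options UI (slider or toggle).
    public void ApplyQuality(int level, bool useFancyShaders)
    {
        QualitySettings.SetQualityLevel(level);

        Shader s = useFancyShaders ? fancyShader : fastShader;
        foreach (Renderer r in renderers)
            foreach (Material m in r.materials)
                m.shader = s;
    }
}
```

Note that writing to renderer.materials instantiates per-object material copies, which is fine for a handful of objects; for many objects you'd want to swap shared materials or maintain two material sets instead.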