How to detect hardware to set quality?

Hello!

We’re making a Windows Store App game, coming from an iOS background. Since there are only about a dozen iOS devices, it was easy to set our quality level based on which device was being used, e.g. iPad 1 vs. iPhone 5s.

However, we can’t do this across the huge range of Windows 8 hardware. For example, on my development machine our game runs smooth as butter, but inside a VM with 1 GB of RAM and no hardware graphics acceleration it drops to under 10 frames per second.

How can I detect things like:

  • The installed graphics card.
  • The available video memory.
  • The amount of installed RAM.

Thanks!

Take a look at Unity3D’s built-in SystemInfo class. It exposes the graphics device name, video memory, and system memory, among other things.
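
Here’s a minimal sketch of how that could look at startup. The AutoQuality class name and the thresholds (512 MB VRAM / 2 GB RAM) are just placeholders for illustration; tune them against your own content:

```csharp
using UnityEngine;

// Hypothetical startup script: picks a quality level based on SystemInfo.
public class AutoQuality : MonoBehaviour
{
    void Awake()
    {
        // The three things asked about above:
        Debug.Log("GPU:  " + SystemInfo.graphicsDeviceName);          // installed graphics card
        Debug.Log("VRAM: " + SystemInfo.graphicsMemorySize + " MB");  // video memory
        Debug.Log("RAM:  " + SystemInfo.systemMemorySize + " MB");    // system memory

        // Placeholder heuristic: treat anything under 512 MB of VRAM or
        // 2 GB of RAM (like the 1 GB VM mentioned above) as low-end.
        if (SystemInfo.graphicsMemorySize < 512 || SystemInfo.systemMemorySize < 2048)
        {
            // 0 is the lowest level defined under Edit > Project Settings > Quality.
            QualitySettings.SetQualityLevel(0, applyExpensiveChanges: true);
        }
        else
        {
            QualitySettings.SetQualityLevel(QualitySettings.names.Length - 1, true);
        }
    }
}
```

Note that SystemInfo.graphicsMemorySize is only an approximation on some platforms, so it’s safer to use it to pick a coarse bucket (low/medium/high) than to rely on it as an exact number.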

That’s exactly what I was looking for! Thanks!