Newbie here
I’m trying to understand what determines the system requirements for a game made in Unity.
And once my game is done, how do I know exactly what the system requirements to run it are?
Does Unity tell me that after I compile the game, or do I run benchmarks manually, or what?
You profile your game. With experience you can estimate performance, but for the most part you test on your target audience’s hardware.
Either you test yourself on the weakest (or target) device you’re going to support, or you run playtests with testers to find out what you can support.
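If it helps, here’s a minimal sketch of what collecting that playtest data could look like. The class and file names are just my own for illustration, not anything Unity generates for you:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Linq;
using UnityEngine;

// Hypothetical example: drop this on a GameObject in a playtest build to
// record frame times, then compare the written report across test machines.
public class PlaytestFrameLogger : MonoBehaviour
{
    readonly List<float> frameTimesMs = new List<float>();

    void Update()
    {
        // unscaledDeltaTime ignores Time.timeScale (pauses, slow motion).
        frameTimesMs.Add(Time.unscaledDeltaTime * 1000f);
    }

    void OnApplicationQuit()
    {
        if (frameTimesMs.Count == 0) return;

        float average = frameTimesMs.Average();
        float worst = frameTimesMs.Max();
        string report = $"{SystemInfo.processorType} / {SystemInfo.graphicsDeviceName}\n" +
                        $"Frames: {frameTimesMs.Count}, avg: {average:F2} ms, worst: {worst:F2} ms";

        // persistentDataPath is a writable location on standalone platforms.
        File.WriteAllText(Path.Combine(Application.persistentDataPath, "frame_report.txt"), report);
    }
}
```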
4 Likes
Yup, keep looking at the profiler while developing. Or, at the minimum, keep the Game window Stats popup open to see how many ms are spent on CPU and GPU. That gives you an indicator of whether you’re GPU (rendering, post-processing) or CPU (scripts, game objects) bound.
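For builds running outside the editor, where the Stats popup isn’t available, you can get a similar CPU/GPU split from FrameTimingManager. A rough sketch; note that on some platforms you need to enable Frame Timing Stats in Player Settings, and GPU timings may report 0 where unsupported:

```csharp
using UnityEngine;

// Sketch: sample CPU vs GPU frame time at runtime via FrameTimingManager.
public class CpuGpuTimingSampler : MonoBehaviour
{
    readonly FrameTiming[] timings = new FrameTiming[1];

    void Update()
    {
        FrameTimingManager.CaptureFrameTimings();
        if (FrameTimingManager.GetLatestTimings(1, timings) > 0)
        {
            double cpuMs = timings[0].cpuFrameTime;
            double gpuMs = timings[0].gpuFrameTime;
            // If gpuMs dominates, you're GPU bound (rendering, postprocessing);
            // if cpuMs dominates, look at scripts and game object counts.
            Debug.Log($"CPU: {cpuMs:F2} ms, GPU: {gpuMs:F2} ms");
        }
    }
}
```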
2 Likes
This. Under ideal circumstances you don’t just test afterwards, you pick a target early on and aim for it.
The target has two parts:
- Target hardware specifications. E.g. my game will run on an i3-5157U, 4GB RAM, Intel HD5500 graphics on Win 10. (Clearly not aiming very high with that example!)
- Target performance requirements. E.g. the game will run at 30fps >95% of the time, and never drop below 10fps except during loading.
For the target hardware, I’d pick something that’s easily available to me, so I can build it into my regular testing. Performance is a feature, and you need to know when it’s broken. Picking a target early has a couple of benefits:
- It’s far easier than making a game and then trying to figure out what it needs to run on afterwards.
- You’re likely to better optimise your game, because you have a clear target to hit, and can tell when you’re missing.
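To make that concrete, here’s a rough sketch of how you could check recorded frame times against the example target above (30fps for >95% of frames, never below 10fps). The class name and thresholds are just for illustration; swap in whatever target you actually pick:

```csharp
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

// Sketch: turn the example performance target into an automated check.
// 33.3 ms ~ 30fps, 100 ms ~ 10fps.
public static class PerformanceTargetCheck
{
    public static void Report(List<float> frameTimesMs)
    {
        var sorted = frameTimesMs.OrderBy(t => t).ToList();

        // Frame time at the 95th percentile: 95% of frames were at least this fast.
        float p95 = sorted[Mathf.Min(sorted.Count - 1, Mathf.FloorToInt(sorted.Count * 0.95f))];
        float worst = sorted[sorted.Count - 1];

        bool hits30FpsTarget = p95 <= 1000f / 30f;   // 30fps for >95% of frames
        bool neverBelow10Fps = worst <= 1000f / 10f; // no frame slower than 10fps

        Debug.Log($"95th percentile: {p95:F1} ms (target met: {hits30FpsTarget}), " +
                  $"worst: {worst:F1} ms (target met: {neverBelow10Fps})");
    }
}
```

You could feed it the list recorded by something like the playtest logger further up, and run it on your target machine whenever you want to know if performance has broken.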
2 Likes