PC games determining graphics settings upon initial run

When a player runs most PC games for the first time, the graphics settings are chosen automatically, based on a rough estimate of what the computer can handle. So if the computer is high end and the game is relatively simple, the graphics are maxed out on the initial run. That said, I notice games are sometimes quite far off in their estimates.

So my question is: how do developers determine this? Are they actually checking the hardware, or running some sort of benchmark in the background? Does anybody have any experience or ideas?

The Island demo detects which graphics card the computer uses and adjusts settings based on that. I would guess that is how most games do it, since querying the hardware is probably the fastest way to gauge the player's machine's capabilities.

Island Demo
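To illustrate the approach the answer describes, here is a minimal sketch in Python. It assumes the renderer string has already been obtained from the platform API (for example the OpenGL `GL_RENDERER` string or a DXGI adapter description) and maps known GPU families to a preset via a lookup table. The `GPU_TIERS` table and `default_preset` function are hypothetical, and the tier assignments are illustrative, not a real benchmark database.

```python
# Illustrative mapping from GPU family substrings to a quality preset.
# Real games ship much larger databases and often combine this with
# VRAM, CPU, and driver checks.
GPU_TIERS = {
    "GeForce RTX": "ultra",
    "GeForce GTX": "high",
    "Radeon RX": "high",
    "Intel Iris": "medium",
    "Intel HD": "low",
}

def default_preset(renderer_string: str) -> str:
    """Pick a settings preset for the reported GPU, defaulting to 'medium'."""
    for family, preset in GPU_TIERS.items():
        if family in renderer_string:
            return preset
    # Unknown hardware: fall back to a safe middle ground rather than
    # guessing high and risking an unplayable first launch.
    return "medium"

print(default_preset("NVIDIA GeForce RTX 3070"))  # ultra
print(default_preset("Some Unknown GPU"))         # medium
```

This table-lookup style matches the "check the hardware" strategy from the question; the alternative, running a timed benchmark scene in the background, is slower on first launch but adapts to hardware the table has never seen.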