Hi Forums!
Complete this sentence:
A good frame-rate is __________ and anything higher.
30 can be enough, depending on the game.
Edit: Oh, a friend of mine mentioned that he used to target 110-120 fps when playing Counter-Strike back in the day. It was supposed to make aiming easier, although I cannot confirm that it worked. For all practical purposes, 60 should be the target.
30 fps is about 6 fps higher than the 24 fps that feature-length films run at. For games that don’t feature a lot of fast motion, it is quite reasonable.
60 fps is very close to the perception limit for most human beings. There are a few individuals with very sharp eyes who might be able to detect differences in framerate higher than 60, especially when their adrenaline is pumping. But the vast majority of human beings top out around 60. For them, anything running at 60 or above is going to look buttery-smooth.
I personally prefer to shoot for 60. I like my animations to look nice and smooth. It’s a peculiar affectation of mine. Capping a game at 60 is usually reasonable. It’s possible to have your game crank out 120+ just to show off, but it’s kind of a waste of system resources. While I can appreciate wanting to go for those kinds of framerates, I just wouldn’t want to push the hardware that much. 60 is a good target.
We aim for 30 on unsupported hardware and 60 on hardware that is still being sold. Alternatively, we aim for 30 on games that are very heavy on visuals and slower-paced. This is across all platforms.
30 is the hard limit, 60 is preferred.
Regarding old-school FPS games and 80/120 fps: this is mostly because you were fighting a CRT display and its refresh rate, which is entirely different from the displays of today.
No, it’s not even close. Pretty much anyone can detect far more than that. 60 only seems smooth if you don’t have anything higher to compare it to. I find 120 a nice minimum, though it’s not always possible (e.g., can’t go over 60 on iOS hardware).
–Eric
30 is acceptable, 60 is preferable. If you are developing a game that will be played on a TV (Xbox, PS3, Wii, etc.), you will want to try to get a higher frame rate if possible. 60, however, is pretty much the standard.
A minimal frame-rate is 30 and anything higher.
A good frame-rate is 60 and anything higher.
An optimal frame-rate is 120 and anything higher.
Completely depends on the type of game itself. Some games are fine at 10 fps, while others need 40-60 for decent play. What you really want is balanced performance; if you have areas much more demanding than others, you’re going to notice the fps dipping much more.
It’s always preferable to make everything smooth if possible, which really means 60 fps. If it’s a non-scrolling game with no 3D perspective traversal, like a card game or something, then I think 30 is okay. But most likely those are the kinds of games that will run faster than that anyway.
A lot of people are saying 60, which is definitely preferable if you can get it, but pretty much all AAA console games in the current generation use 30 as the target. If you think AAA console games look good, then 30 fps is probably fine. I’d say some more intensive effects are definitely worth the trade-off in framerate (as long as you don’t go below 30), but if you want a really smooth game with no hitches then aim for 60.
I’m targeting 60 fps for iPad2, iPhone 4S and above. 30 FPS for the rest. The difference is like day and night for me.
This!!
As far as human eye recognition goes, people can identify random objects flashed at them for 1/220th of a second, so perception operates at a very fast rate, and it’s completely different from the traditional frame rates we see on cinema screens and computers.
So it depends a lot on the game and what is happening in it. Anything lower than 30 can get choppy if you have steadily moving objects, but it’s fine if your game involves staring at a brick wall; since nothing is moving, 3 frames of that per second may be enough. 60 is a good target if you have a lot of fast-moving action. Your eye can easily notice a difference above 60, but it will likely not be worth the sacrifices you make in visual quality just to raise the fps.
Aim high if you’re designing on good hardware, so that systems across a wider range of specs can play it fine.
Well, aiming for 60, as others have said, makes the game look very smooth. It also gives room for the game to potentially slow down: if the frame-rate suddenly halved, it would still be 30. Fifteen frames per second is roughly what a human can perceive as fluid motion, so aiming at 30 would also allow for considerable slowdown, but it would not be ideal, especially in intensive parts of the game. 120 would be nice to have, but is definitely not necessary. I would say aim for between 30 and 60; anywhere in that range and you’re safe.
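The headroom arithmetic above is easy to sketch. This is a minimal illustration assuming a vsynced 60 Hz display, where a missed frame deadline drops you to the next integer divisor of the refresh rate (60 → 30 → 20 → 15 …); the function names are made up for this example:

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to simulate and render one frame."""
    return 1000.0 / fps

def effective_fps(frame_time_ms: float, refresh_hz: int = 60) -> float:
    """With vsync, a frame that misses its deadline waits for the next
    refresh, so the displayed rate falls to refresh_hz / n for the
    smallest n whose budget fits the actual frame time."""
    divisor = 1
    while frame_budget_ms(refresh_hz / divisor) < frame_time_ms:
        divisor += 1
    return refresh_hz / divisor

# At 60 fps you have ~16.7 ms per frame; a 20 ms frame drops you
# straight to 30 fps, and a 40 ms frame to 20 fps.
```

This is why targeting 60 leaves graceful degradation room: a spike that doubles your frame time still lands on a playable 30.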
That kind of test doesn’t show what you seem to think: a camera with a long exposure time can also capture flashes of light of almost any duration, but that doesn’t mean it will represent them as distinct. Human perception of moving pictures is very complicated, and in games, 30 is often fine, 60 is preferred, and 30 with good motion blur is generally better than 60.

The brain’s ability to recognize ‘smoothness’ of motion is influenced by a number of factors, but distinct rendered frames that do not represent motion are perceived as such, and without motion blur that’s what games produce: distinct snapshots in time. In the ‘real world’, and in film, images are viewed with motion blur, but game images generally are not, which is why we often target much higher frame rates than television and movies seem comfortable with. In short, we’re creating motion blur by drawing more frames, and that’s what people respond to. It does not mean viewers actually perceive the difference between these frames; it means they recognize the lack of motion in the frames they do see, and drawing more frames reduces that lack of motion.
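The accumulation idea above can be sketched in a few lines. This is a hypothetical toy illustration (not how any real engine implements motion blur): averaging the last few rendered frames smears a moving object across pixels, which is roughly what a film camera's exposure time does for free.

```python
def motion_blur(frames):
    """Blend equal-length frames (lists of pixel intensities)
    into one averaged frame, smearing anything that moved."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

# A bright dot moving right across a 5-pixel row over three frames:
frames = [
    [255, 0, 0, 0, 0],
    [0, 255, 0, 0, 0],
    [0, 0, 255, 0, 0],
]
blurred = motion_blur(frames)
# The dot smears across the three pixels it passed through:
# [85.0, 85.0, 85.0, 0.0, 0.0]
```

Without this blending, each displayed frame is a sharp snapshot, and the eye reads the gaps between snapshots as judder, which is the argument for either blurring or drawing more frames.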
I don’t know what any of you are talking about… If your game is running at 30 to 60, you are golden. Anything faster than that and you are burning up your users’ graphics cards. The human eye can only perceive up to 60 FPS.
READ ARTICLE HERE
http://www.cameratechnica.com/2011/11/21/what-is-the-highest-frame-rate-the-human-eye-can-perceive/
I don’t think that is legit, man. Maybe I am superhuman or a mutant, but I can easily tell the difference between 60 fps and 120 fps, especially with motion blur on. 120 fps is much more fluid and clear compared to 60 fps when stuff is moving fast.
Also, FPS caps at 60 on iOS anyway ^__^.
That’s completely untrue.
No, that just means you’re a normal person.
–Eric
The ideal framerate debate has been going on for decades, by the way… 60 fps was considered the realm of the elite developers back in the ’80s/’90s, while 30 or 15 was considered slower. There are definitely some games that are pretty cool at 30 fps where you don’t really care, but if there’s a lot of movement going on, I think most people would prefer the crispness of 60 fps if possible. That said, I agree with an earlier comment that dropping to 30 fps gives you double the processing power per frame, which can be very attractive.