Jaggy Redraw

I’ve created a simple web player with a building, a ground plane and the FPS controller. The objects use baked textures.

http://homepages.wmich.edu/~abbott/tour/Tower.html

When I navigate around, especially when I turn with the mouse to look left or right, I get serious jaggies along the edges, and something that almost looks like a “scan effect” where even more pronounced jagginess seems to pass vertically across the view. I’ve tried different computers, browsers and quality settings.

Can anyone suggest how to get a nicer redraw than this?

thanks

Kevin Abbott

What “Quality” setting are you using in the webplayer? Do you have anti-aliasing on when using that setting? These settings are adjusted here: Edit => Project Settings => Quality
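
(If you’d rather check or force this from script than dig through the editor, here’s a minimal sketch, assuming a Unity version that exposes MSAA as QualitySettings.antiAliasing:)

```csharp
// Sketch: reading/forcing MSAA from script, assuming a Unity version
// that exposes it as QualitySettings.antiAliasing (0 = off, 2/4/8 = samples).
using UnityEngine;

public class CheckAntiAliasing : MonoBehaviour
{
    void Start()
    {
        Debug.Log("Current AA samples: " + QualitySettings.antiAliasing);
        QualitySettings.antiAliasing = 4;   // force 4x MSAA for comparison
    }
}
```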

Or are you talking about the “shearing” that takes place when moving the camera?

Looks like either an anti-aliasing issue (did I spell that wrong, or get the name wrong… I feel like I did), or your object is really small (and thus really thin). One possible easy solution is to scale everything to make it larger.

I think you’re referring to tearing; try turning on “Sync to VBL” in your quality settings.
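
(For reference, later Unity versions expose the same switch to scripts; a minimal sketch, assuming a version where “Sync to VBL” is exposed as QualitySettings.vSyncCount:)

```csharp
// Sketch: toggling vsync from script. "Sync to VBL" is the old editor
// label; later Unity versions expose it as QualitySettings.vSyncCount
// (0 = off, 1 = sync to every vertical blank, 2 = every second one).
using UnityEngine;

public class VSyncToggle : MonoBehaviour
{
    void Start()
    {
        QualitySettings.vSyncCount = 1; // wait for every vertical blank
    }
}
```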

Sync to VBL seemed to do the trick. The new version can be seen at

http://homepages.wmich.edu/~abbott/tour/TowerV2.html

What exactly does that setting do?

Thanks for your help

Kevin

small bit of info here:

http://unity3d.com/Documentation/Components/class-QualitySettings.html

; )

It waits until the screen refreshes before drawing the next frame. The problem is that without this, drawing frames and refreshing the screen happen at different times. Which is to say, you normally pump out frames as fast as you can, but the screen redraws at a set rate. So it can happen that you’re in the middle of drawing a frame when the screen refreshes, and you see the top part of the current frame and the bottom part of the previous frame. This is particularly noticeable when there are large differences between frames (turning quickly in an FPS game, for example). If you sync to the display, then this problem disappears.

So why wouldn’t you just leave it on by default? The drawback is that if the framerate drops below the refresh rate, each frame has to wait for the next screen refresh, which halves the framerate or worse. That is, if the refresh rate is 60Hz and you’re getting 60fps or higher, all is fine; but if you drop below 60fps even a little bit, vsync forces the framerate down to 30fps (and lower still if you go below 30fps at all). So you probably only want to use vsync if you’re pretty sure you can consistently maintain a high framerate, or you don’t mind the possibility of large framerate jumps.

Plus keep in mind that different monitors have different refresh rates… if you’re using an LCD screen with a 60Hz refresh and normally get 65fps, you’re fine with vsync on, but somebody using a CRT with a 90Hz refresh rate and getting 65fps would be bumped down to 45fps instead.
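
To put numbers on that, here’s a quick sketch of the arithmetic (plain C#, nothing Unity-specific; it assumes the driver simply holds each frame for whole refresh intervals, ignoring tricks like triple buffering):

```csharp
// Sketch of the arithmetic above: under vsync, each finished frame is
// held until the next vertical blank, so the effective rate is the
// refresh rate divided by the number of refresh intervals a frame needs.
using System;

class VsyncMath
{
    static double EffectiveFps(double refreshHz, double renderFps)
    {
        // Frame time in refresh intervals, rounded up to the next vblank.
        double intervals = Math.Ceiling(refreshHz / renderFps);
        return refreshHz / intervals;
    }

    static void Main()
    {
        Console.WriteLine(EffectiveFps(60, 65));  // 60 (above refresh: fine)
        Console.WriteLine(EffectiveFps(60, 55));  // 30 (just under 60 -> halved)
        Console.WriteLine(EffectiveFps(60, 25));  // 20 (below 30 -> lower still)
        Console.WriteLine(EffectiveFps(90, 65));  // 45 (the CRT example above)
    }
}
```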

–Eric

Thanks kwabbot for bringing this up, and drjones / Eric5h5 for the link and explanation as to why this occurs. It’s something I thought I had to live with.

I’ve got another question that may sound silly: why is 60fps needed at all? Film/TV use a far lower frame rate. I guess it’s got something to do with the real-time rendering of each frame? Maybe motion blur would solve this anti-aliasing problem… is there such a thing as real-time motion blur?

Cheers,

JW.

They’re not interactive, though, so they can get away with having a low framerate by using techniques like motion blur. Actually 60fps is kind of on the low side; I prefer 90+fps if possible (one of several reasons why I prefer CRTs over LCDs). I gather motion blur is expensive to calculate, so it’s better just to have higher fps, and from an interactive point of view, the higher the fps the better anyway. Motion blur just makes motion look smooth…in games, you want it to be smooth.

–Eric

Try watching some 60 fps video some time. It’s amazing. Without high frame rates, when you turn the camera (especially in a video game), everything breaks apart and you miss a lot of visual material.

Yes, motion blur is exactly why 24 or so FPS is “enough” for TV/film. For each frame shot, the camera averages any motion that happens during the exposure time, whereas in games each frame is rendered at a single instant in time.

Yes, motion blur does solve that, but to get robust, high-quality motion blur you’d need to average lots of frames (dozens or more) for each displayed frame. That would mean rendering at 300-1000 FPS or so and averaging down to 30 FPS.
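
As a toy illustration of that brute-force averaging (the “scene” and renderer here are hypothetical stand-ins, just enough to show the sub-frame average):

```csharp
using System;

class ShutterAverage
{
    // Hypothetical "renderer": brightness of a light blinking at 20 Hz.
    static double RenderSubFrame(double t)
    {
        return Math.Sin(2 * Math.PI * 20 * t) > 0 ? 1.0 : 0.0;
    }

    // Average many sub-frames across one displayed frame's time span,
    // approximating what a film camera's open shutter would record.
    static double DisplayedFrame(double frameStart, double frameTime, int subFrames)
    {
        double sum = 0.0;
        for (int i = 0; i < subFrames; i++)
            sum += RenderSubFrame(frameStart + frameTime * i / subFrames);
        return sum / subFrames; // the averaged ("motion-blurred") value
    }

    static void Main()
    {
        // 30 displayed fps with 10 sub-frames each = 300 renders per second,
        // averaged down to 30 fps as described above.
        Console.WriteLine(DisplayedFrame(0.0, 1.0 / 30.0, 10));
    }
}
```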

There is a Motion Blur image effect in Unity Pro, which does a simple version of motion blur: it just blends in a bit of the previous frames. This is quite fast.
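
For the curious, that kind of effect boils down to blending each new frame over an accumulation buffer. A very rough sketch of the pattern as a camera image effect follows; the blend shader itself is omitted, and blendMaterial is assumed to alpha-blend the new frame over the old contents:

```csharp
// Rough sketch of frame-blend motion blur as a camera image effect.
// Each frame, the new image is blended over an accumulation buffer,
// so a fading trail of previous frames remains. "blendMaterial" is a
// hypothetical material whose shader draws the source texture with
// alpha = (1 - blurAmount) over the existing accumulation contents.
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class FrameBlendSketch : MonoBehaviour
{
    public Material blendMaterial;               // hypothetical alpha-blend shader
    [Range(0f, 0.92f)] public float blurAmount = 0.8f;
    private RenderTexture accum;                 // running blend of past frames

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        if (accum == null)
        {
            accum = new RenderTexture(source.width, source.height, 0);
            Graphics.Blit(source, accum);        // seed with the first frame
        }
        blendMaterial.SetFloat("_BlurAmount", blurAmount);
        accum.MarkRestoreExpected();             // we intentionally blend over old contents
        Graphics.Blit(source, accum, blendMaterial);
        Graphics.Blit(accum, destination);       // show the blended result
    }

    void OnDisable()
    {
        if (accum != null) accum.Release();
    }
}
```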