Hi everyone.
I’ve imported a model from Blender, with texture baked on.
When I’m 5 meters away from the wall, the painting looks blurry, but up close it looks great.
Any suggestions about what I might need to research here?
Any setting suggestions?
Thanks!
p.s.
See image 1 (blurry, at a distance), then image 2 (up close, looking good).
To expand on @BattleAngelAlita and @peopleUnity's answers, it looks like you have an image that's been resized to a square texture and is then stretched vertically across the geometry to get back to its original aspect ratio. This causes some issues for GPUs when picking the appropriate mip level.
First, a quick explanation of mip maps (or pyramid maps, as they're sometimes called outside of real-time rendering). When an image is displayed on screen smaller than its original size, each texel of the original image becomes smaller than an on-screen pixel. Each on-screen pixel then samples disconnected texels of the original image, causing the image to alias, or flicker, with some details disappearing completely only to pop back in as the object moves across the screen. To avoid this you would need to average all of the original texture's texel colors within the area the on-screen pixel covers. That's expensive, so instead we use mip maps: successive half-sized images, each an average of the previous level's texels. Basically, what you get when you scale an image down by 50% in Photoshop or GIMP. Instead of having to sample the original full-size texture many times, the GPU just chooses the closest pre-scaled image and samples that.
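As a concrete example, a 1024x1024 texture carries a chain of 11 mip levels: 1024, 512, 256, and so on down to 1x1. Here's a minimal plain-C# sketch (the class and names are mine, just for illustration) that prints that chain:

```csharp
using System;

class MipChainDemo
{
    static void Main()
    {
        // Example size: a 1024x1024 texture has 11 mip levels.
        int width = 1024, height = 1024;
        int mipCount = 1 + (int)Math.Floor(Math.Log(Math.Max(width, height), 2));

        for (int level = 0; level < mipCount; level++)
        {
            // Each mip level is half the size of the previous one (minimum 1 texel).
            int w = Math.Max(1, width >> level);
            int h = Math.Max(1, height >> level);
            Console.WriteLine($"mip {level}: {w}x{h}");
        }
    }
}
```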
GPUs choose which mip level to use by calculating how many texels per pixel the image would be displayed at along both the screen's horizontal and vertical axes, then taking the larger of those two values to pick the mip level closest to 1 texel per pixel. When an image is displayed stretched along only one axis, one of those directions has a much higher texel-to-pixel ratio than the other, which causes the GPU to drop to a lower-resolution mip sooner than the un-stretched axis would need.
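In code terms, the selection behaves roughly like the sketch below. This is a simplified version of the standard log2-of-worst-case formula; real GPUs approximate it per 2x2 pixel quad using UV derivatives, so treat it as an illustration rather than any particular GPU's exact implementation:

```csharp
using System;

static class MipSelectionSketch
{
    // Simplified mip selection: log2 of the worst-case (largest)
    // texel-to-pixel ratio across the two screen axes.
    static float MipLevel(float texelsPerPixelX, float texelsPerPixelY)
    {
        float worstCase = Math.Max(texelsPerPixelX, texelsPerPixelY);
        return Math.Max(0f, (float)Math.Log(worstCase, 2));
    }

    static void Main()
    {
        // Displayed 1:1 on both axes: full-resolution mip 0 is used.
        Console.WriteLine(MipLevel(1f, 1f)); // 0

        // Squashed 4:1 vertically (like a square texture stretched back
        // out on a wall): mip 2 is used, blurring the horizontal axis
        // even though that axis only needed mip 0.
        Console.WriteLine(MipLevel(1f, 4f)); // 2
    }
}
```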
A couple of options to improve this (there's a short editor-script sketch after this list showing how to set these values from code):
Disable mipmaps on the texture asset… is the advice you’ll find all over the place if you search online. Do not actually do this. It will reduce the blurring, yes, but at the cost of heavy aliasing. Try it for yourself if you want to see the horror.
Make sure the original image is being displayed at a 1:1 aspect ratio. If you're manually scaling the images into square textures, don't do that. If you're not, then Unity might be doing it for you: by default Unity will try to scale images to the closest power-of-two size (a restriction that exists because of mip maps), and some texture formats (specifically those used by iOS devices) can only be square power-of-two sizes. If this project is for desktop only, you can tell Unity to skip scaling textures to power-of-two dimensions, as modern GPUs can handle non-power-of-two textures. Otherwise you may want to manually increase the canvas size of your textures so they aren't being scaled down, even if that means wasted empty space in the texture, or use an atlas and pack multiple textures into a single larger power-of-two texture. Keeping your textures at power-of-two sizes, and at 1:1 aspect ratios relative to how you plan on displaying them, is a good idea in real-time rendering in general, so keep that in mind.
Increase the texture's mip bias. This one is a little annoying because there's nothing built into the Unity editor's interface to let you do this. You can write custom editor code to change the value, but personally I find it easier to use the Debug inspector and modify it that way. Right click on the Inspector tab and select "Debug", then go down to Texture Settings and set the Mip Bias to a negative value in the range of -0.5 to -1.0. Like disabling mipmapping altogether, this can increase aliasing to some degree, but not as significantly, since mipmapping is still being used. You can swap back to the normal inspector by right clicking on the tab again and selecting "Normal".
If your textures are using Bilinear filtering, the change between mip levels is abrupt, which can make this look much worse. Trilinear filtering can improve this significantly, as it blends between mip levels, though there will still be some blurring. Anisotropic filtering, controlled either by the Quality settings or per texture using the Aniso Level slider, can help as well; specifically, it reduces the blur of textures that have been stretched like this. Try setting the Anisotropic Textures option in your Quality settings to Forced On. This also helps a lot with ground and wall textures when viewing them at oblique angles.
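For reference, here's a rough sketch of what the last three options look like when set from editor code instead of the inspector. The menu name and asset path are placeholders; drop it in an Editor folder. `TextureImporter.npotScale`, `mipMapBias`, `filterMode`, `anisoLevel`, and `QualitySettings.anisotropicFiltering` are the settings involved:

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only sketch: applies the texture settings discussed above.
// Place in an Editor folder. The menu name and asset path are examples.
public static class PaintingTextureTweaks
{
    const string atlasPath = "Assets/Textures/PaintingAtlas.png"; // placeholder path

    [MenuItem("Tools/Apply Painting Texture Settings")]
    static void Apply()
    {
        var importer = AssetImporter.GetAtPath(atlasPath) as TextureImporter;
        if (importer == null)
        {
            Debug.LogWarning($"No texture importer found at {atlasPath}");
            return;
        }

        // Desktop only: skip the automatic scale-to-power-of-two.
        importer.npotScale = TextureImporterNPOTScale.None;

        // Negative bias keeps higher-resolution mips longer
        // (sharper, at the cost of some extra aliasing).
        importer.mipMapBias = -0.5f;

        // Trilinear blends between mip levels instead of snapping.
        importer.filterMode = FilterMode.Trilinear;

        // Per-texture anisotropic filtering level.
        importer.anisoLevel = 4;

        importer.SaveAndReimport();

        // Equivalent of Quality Settings > Anisotropic Textures > Forced On.
        QualitySettings.anisotropicFiltering = AnisotropicFiltering.ForceEnable;
    }
}
```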
@peopleUnity and @BattleAngelAlita thank you for your replies
For some reason I’m not getting email notifications (despite clicking the box) so I’m just seeing this now.
@bgolus - thank you SO SO much for the full and generous reply. That gives me a really good sense of the problem, and many of the potential causes and fixes. Seeing as you really know your stuff, let me give you a quick overview of my workflow, and maybe you can point me in the right direction regarding your earlier advice?
I’m working on a mobile VR thing, so the idea you mention about modern video cards on PC being OK is probably out of the question.
The texture / model you see there is a 'UV / texture atlas' baked out of Blender, for all three of the images and the frames too. In Blender I UV-unwrapped everything separately, and then the 'Bake Tool' add-on combined all the UVs into a single 1024x1024 texture, which I dragged onto each of the 6 objects (3 paintings and frames). Given this info, do you think the issue is on the Blender side of things / baking, or should I look at the settings you mention in Unity?
If you're already importing a 1024x1024 texture, then the issue isn't on Unity's side. I'm not a Blender user, but I wouldn't expect a texture auto-atlasing tool to be the problem. That said, it could be. Look at the original texture's aspect ratio and the aspect ratio of the area that texture covers in the auto-UV'd atlas. It's going to be squished in one of those two (or both) compared to how you're displaying it.
My totally off-the-cuff guess would be that you're using scaled quads for the pictures, and the baking tool's auto-UVing doesn't take the mesh scale into account when determining the layout. Try resetting the scale on your meshes and baking a new atlas.
Maybe? It depends on what mobile VR platform/device range you're aiming for, and what else you're doing in your scene. Ideally every VR application out there should be using 4x MSAA and at least 2x or 4x anisotropic filtering, but it does come with a bit of a performance cost. However, it should be noted that the Wikipedia page's comments about performance date from 2007, and things have changed a lot since then. For one, modern GPUs (desktop or mobile) do not implement anisotropic filtering exactly like the reference implementation, which would be very expensive, especially on mobile. Everyone uses some form of approximation, though exactly what they do is considered a trade secret. The result is much higher quality texture filtering at much lower cost, especially at the higher anisotropic levels.
My suggestion when it comes to using anisotropic filtering in mobile VR is to try it, even if you aren’t having the above issue.