What's the name of the effect that has been applied to these textures?

Hello. So I was snooping around in the assets of a game called Voxel Turf and found that its textures were stored in an atlas as I had thought, but there appears to be some strange padding around each texture, and I’m wondering what exactly they have done to achieve this effect.

Looks like they took each texture and stretched the outermost edge pixels to fill the space. I'm not entirely sure why they chose that particular solution for the padding; using the dilation filter from xNormal’s Photoshop plugin would have been easier and looked better. Better yet would have been repeating the texture.

@bgolus

I just assumed that they had found some clever way to do it. So, since I’m dealing with pixel-art textures, would it be easiest to just take a minute and make the padding myself?

The first part of the picture is a 16x16 brick texture that I made, the second part is the padding, and the third part is the texture plus the padding, which is 20x20 in size. As you can see, I just took all the edge pixels on every side and extended them out by two pixels.
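The edge-extend padding described above can be sketched in a few lines. This is a hypothetical Python version using plain nested lists (the function name is mine, not from the thread; numpy's `pad` with `mode="edge"` does the same job):

```python
def pad_edge(tile, pad):
    """Repeat each border pixel of a 2D tile outward by `pad` pixels."""
    h = len(tile)
    w = len(tile[0])
    out = []
    for y in range(h + 2 * pad):
        sy = min(max(y - pad, 0), h - 1)  # clamp source row into the tile
        row = []
        for x in range(w + 2 * pad):
            sx = min(max(x - pad, 0), w - 1)  # clamp source column into the tile
            row.append(tile[sy][sx])
        out.append(row)
    return out
```

Padding a 16x16 tile with `pad=2` gives the 20x20 result described above.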

Yep, that’s more akin to what your usual dilation filter will do. It’s better than what they did for handling texture filtering on the edges. As I said before, the best option is to repeat the texture. If the intent is for the texture to look like it’s tiling, then your best option is for it to actually be tiling in the atlas’ padding too. Make a 2x2 grid of your texture, then offset it by 1/2 the tile size, done.
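The 2x2-grid-and-offset trick amounts to letting the padding wrap around to the opposite edge of the tile. A rough Python sketch of that idea (names are mine, not from the thread):

```python
def pad_tiled(tile, pad):
    """Pad a 2D tile so the padding ring wraps around to the opposite edge.

    This is exactly what you get by laying out a 2x2 grid of the texture,
    offsetting by half a tile, and cropping out the padded tile.
    """
    h, w = len(tile), len(tile[0])
    return [[tile[(y - pad) % h][(x - pad) % w]  # modulo = wraparound sampling
             for x in range(w + 2 * pad)]
            for y in range(h + 2 * pad)]
```

With this, bilinear filtering at the tile edge blends with the texels it would actually see when the texture tiles, so no seam appears.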

I also wonder if Photoshop’s content aware fill would work here…

Oh man yeah, I can see how great repeating would be for brick and grass and stuff, but what about textures for a furnace or workbench or anything that needs to look a specific way?

@bgolus

I’ve followed your advice and tiled each texture, insetting it in my atlas by eight pixels on every side, but the padding doesn’t seem to be working at long range: one texture bleeds into the texture beside it, and even into the two textures beyond that. I simply don’t understand how something like this can be happening.

Here’s the thing. The padding does two things. Mainly it helps prevent obvious seams at the edge of the texture, especially when using bilinear or anisotropic filtering, and it helps delay, but not prevent, issues stemming from mip maps. The easiest solution is to disable mip maps for your texture, but that’s really ugly.

In custom engines (e.g. Minecraft) the atlas texture’s mip chain is clamped so that the smallest mip still has at least 1 pixel per tile, maybe even 4 pixels per tile if they’re using texture compression, rather than the usual full chain where the smallest mip is a 1x1 texture for the entire atlas. Unity doesn’t offer any control over the number of mips in a texture and always builds a full chain, so clamping the mip level has to be solved within the shader.
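As an illustration of the clamping described above, here is a sketch of the arithmetic, assuming a square power-of-two atlas (the function name is mine):

```python
import math

def clamped_mip_count(atlas_size, tiles_across, min_tile_pixels=1):
    """How many mip levels to keep so each tile spans >= min_tile_pixels texels.

    A full chain has log2(atlas_size) + 1 levels, down to 1x1 for the whole
    atlas. Use min_tile_pixels=4 for block-compressed formats.
    """
    tile_size = atlas_size // tiles_across
    return int(math.log2(tile_size // min_tile_pixels)) + 1
```

For example, a 256x256 atlas with 16 tiles across has 16-pixel tiles, so only 5 of the 9 full-chain levels survive (tiles of 16, 8, 4, 2, and 1 pixels); requiring 4 pixels per tile leaves 3.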

There are several threads on the topic of proper atlas handling here on the forums; searching for tex2Dlod and/or tex2Dgrad should help you find them.

That’s disappointing. I wonder why Unity doesn’t offer support for controlling the number of mipmaps in a texture…

Because some platforms (and potentially texture formats?) don’t support it, and will refuse to display the texture without a full chain.

Refuse to display? Doesn’t that mean that even if I try to do it my own way with a custom shader, it still won’t work?

Doing it in a shader is fine. That’s how you have to do it in Unity if you’re going to use an atlas. That or ignore it and don’t do anything, which some people do too.

If the texture itself has mip maps enabled but doesn’t have a full mip chain, some platforms will simply not allow the texture to be loaded. The result would be an error and either a corrupted texture displayed, or possibly your application not rendering at all. For a long time Unity took the stance that if a feature didn’t work on all platforms, it wouldn’t offer it at all. That’s been relaxed pretty heavily, but using a partial mip map chain is a fairly uncommon thing. I also don’t know if any modern systems that Unity supports have this limitation anymore: OpenGL has had support for it since OpenGL 2.0 (including ES), and Direct3D has had support since 10. Direct3D 9 didn’t support it, and they only just killed support for that at the end of last year, in 2017.3. But it’s likely not been a high priority to add, and I doubt many people have asked for it, if anyone has.

Again, fixing it in the shader works just fine, even if it’s a little extra work.

Because shaders run on GPUs, and GPUs don’t run C#, they run shader code which is written in HLSL or some other shader language.

[quote=“bgolus, post:11, topic: 713473, username:bgolus”]
Again, fixing it in the shader works just fine, even if it’s a little extra work.
[/quote]

This post says that the only way to have custom mipmaps is to use a prebuilt set in the form of a DDS file.

But you don’t want custom mipmaps. The mipmaps Unity generates already have the correct colors for what you want. You just want to clamp the max mip level, which you have to do in a custom shader.

Custom mipmaps are for when you want the contents of the mipmaps to be different than that of the full resolution texture. Like if you want each mip to be a different color (often used for mipmap visualization) or otherwise change their content for some kind of effect. Super Mario Sunshine’s water is a prime example of creative use of custom mipmaps, but one only relevant in a world of fixed screen resolutions.

Also, as far as I know, importing a DDS with a partial chain of precomputed mipmaps just results in the mip levels beyond what’s in the DDS file being black. That’s not what you want either.

What exactly is it that I’m clamping? I searched for a node that has mip or mipmap in its name, and nothing came up.
EDIT: I’m starting to think that Sample Texture 2D LOD might be what I’m looking for.

The mip level, or mip LOD. (It’s the same thing, just different people refer to it differently.) If you’re using the new Shader Graph to make shaders for the LW or HD pipelines, you’ll have to use the Sample Texture 2D LOD node. It’s fairly easy to calculate the appropriate LOD for a texture in HLSL or GLSL, but Shader Graph hides some of the values that make it easy so you have to bodge around that.


That’s a recreation of the mip map level calculation that GPUs use as detailed here, along with calculating the smallest mip from the texture resolution and the atlas tile dimensions (how many tiles across, not the individual tile size). Make special note of the AtlasTexture_TexelSize property and its reference name. That’s something Unity sets automatically when the texture is used on a material, but which the Shader Graph preview doesn’t handle properly. The XY values are 1 / texture resolution, and ZW are the texture resolution.
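In case it helps to see the math outside the graph, here is a rough Python sketch of the same idea: the GPU-style LOD computed from UV derivatives scaled into texel space (the _TexelSize ZW values), then clamped to a max level derived from the tile count. All the names here are mine, and a square texture is assumed:

```python
import math

def clamped_mip_lod(duv_dx, duv_dy, tex_size, tiles_across):
    """Approximate GPU mip LOD selection, clamped for an atlas."""
    # Scale per-pixel UV derivatives into texel space (UV * texture size).
    dx = (duv_dx[0] * tex_size, duv_dx[1] * tex_size)
    dy = (duv_dy[0] * tex_size, duv_dy[1] * tex_size)
    # GPU-style LOD: 0.5 * log2 of the larger squared derivative length.
    d = max(dx[0] * dx[0] + dx[1] * dx[1], dy[0] * dy[0] + dy[1] * dy[1])
    lod = 0.5 * math.log2(max(d, 1.0))  # never below mip 0
    # Smallest allowed mip: the level where each tile is 1 texel across.
    max_lod = math.log2(tex_size / tiles_across)
    return min(lod, max_lod)
```

At one texel per screen pixel the LOD comes out as 0, at two texels per pixel it's 1, and so on, until the clamp stops it from shrinking tiles below a single texel.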

I copied all of that into my shader graph, but it doesn’t even seem to care what I set any of the input values to, and it still doesn’t use Mipmaps.
EDIT: I deleted the UV and LOD connections to the Sample Texture2D LOD node, and it doesn’t make a difference.

What do you mean by “doesn’t use Mipmaps”? Did you turn off mipmaps on your atlas texture asset?

No, Mipmapping is enabled on the texture. Mipmaps work fine with Standard LWP shader, but they’ve been completely disabled with any custom graph shader I’ve tried to use.

If you use a Sample Texture 2D node it should look exactly like using the built-in shader. If you use a Sample Texture 2D LOD node with nothing connected to the LOD input, it will indeed disable mip mapping, as that node requires you to supply the mip level yourself. The graph I posted should have done that, along with clamping the mip level to prevent it from getting too small.

It’s possible I messed up somewhere in my clamped mip level calculation.

If you take the maximum of the two dot products and pipe that directly into the log node, it should go back to behaving almost exactly like the normal Sample Texture 2D node.