Sure. Mipmapping isn’t intended to keep textures sharp; that’s not part of the intent behind mipmapping at all. Mipmapping exists only to reduce aliasing, i.e. to prevent textures from looking jagged or “sparkly” when displayed with fewer pixels than the texture has texels.
For example, here’s what a thin grid texture looks like with no mip maps:

And here’s what it looks like using trilinear filtering:

As you can see, it’s a lot blurrier, especially the ground plane; the scaling quad doesn’t look as bad, but it’s still blurry. However, it no longer aliases, and aliasing is all mipmapping is designed to prevent.
For comparison, this is probably closer to what you want:

Those are all gifs from this article, which goes deep into all of this:
https://bgolus.medium.com/sharper-mipmapping-using-shader-based-supersampling-ed7aadb47bec
And someone posted Sprite and UI shaders with the technique shown above implemented here:
https://discussions.unity.com/t/737314/8
Unity isn’t doing anything special here. What you’re seeing is the default mipmapping behavior of literally every GPU. And it’s not that the mip is one step smaller than it should be; it’s that GPUs transition to the next mip before the previous one starts to alias. Once a texture is displayed small enough that more than about 1.5 texels map to each pixel, it will start to alias, so the GPU swaps or fades to the next mip before that point.
Biasing can help sharpness, since it delays the mip transition, but it will increase aliasing artifacts. Some people do prefer a sharp appearance with some aliasing over the blur, though.
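To make the selection and biasing concrete, here’s a rough sketch of the math, assuming the standard log2(texels per pixel) level-of-detail formula that GPUs approximate. The `mip_level` helper is made up for illustration, not a Unity or GPU API:

```python
import math

def mip_level(texels_per_pixel: float, bias: float = 0.0) -> float:
    """Approximate the mip level for a given texel-to-pixel ratio.

    This is the textbook log2 formula plus an optional bias,
    not any one GPU's exact implementation.
    """
    return max(0.0, math.log2(texels_per_pixel) + bias)

# At 1 texel per pixel, the full-res mip 0 is used.
print(mip_level(1.0))                        # 0.0

# At 1.5 texels per pixel -- roughly where aliasing would begin --
# trilinear filtering is already most of the way into mip 1.
print(round(mip_level(1.5), 2))              # 0.58

# A negative bias delays the transition (sharper, but more aliasing).
print(round(mip_level(1.5, bias=-0.5), 2))   # 0.08
```

The takeaway is that by the time the texture would start to alias, trilinear filtering has already blended well into the next, blurrier mip, which is exactly why a negative bias reads as sharper.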
One important thing to be aware of if you’re planning on releasing on iOS platforms: the mipMapBias texture setting does not work on any Apple device. You must bias the mip level in the shader instead, which is what the Sample Texture 2D LOD node does. The shader linked above also does its biasing in the shader, so it’s more easily configurable.