I have two identical game objects, each with a SpriteRenderer and the same texture, placed next to each other. Each texture is 32 pixels, and I'm using an orthographic camera set up to display a 32-pixel sprite as 32 screen pixels (the camera sets orthographicSize based on screen height and only moves in whole-pixel increments; AA is off, filtering is set to Point, and wrap is set to Clamp). I'm not using mipmaps.
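For reference, here's a minimal sketch of how my camera is set up (not my exact code; the `PixelsPerUnit` constant and component name are just illustrative, and 32 matches the sprite import setting):

```csharp
using UnityEngine;

// Sketch of a pixel-perfect orthographic camera: one texture pixel
// maps to one screen pixel at 1x zoom, and the camera snaps to pixels.
[RequireComponent(typeof(Camera))]
public class PixelPerfectCamera : MonoBehaviour
{
    const float PixelsPerUnit = 32f; // assumed sprite import setting

    Camera cam;

    void Awake()
    {
        cam = GetComponent<Camera>();
        // orthographicSize is half the vertical view height in world
        // units, so this makes 1 texture pixel == 1 screen pixel.
        cam.orthographicSize = Screen.height / (2f * PixelsPerUnit);
    }

    void LateUpdate()
    {
        // Only move the camera in whole-pixel increments.
        float unitsPerPixel = 1f / PixelsPerUnit;
        Vector3 p = transform.position;
        p.x = Mathf.Round(p.x / unitsPerPixel) * unitsPerPixel;
        p.y = Mathf.Round(p.y / unitsPerPixel) * unitsPerPixel;
        transform.position = p;
    }
}
```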
If I double orthographicSize to zoom out to 0.5x, so that each 32-pixel sprite renders as 16 pixels on screen, the two sprites look different from each other, despite being identical.
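To be concrete, the zoom step is nothing more than something like this (`cam` being the camera from the sketch above):

```csharp
// Zoom out to 0.5x: doubling orthographicSize halves the on-screen
// size, so each 32 px sprite now covers 16 screen pixels.
cam.orthographicSize *= 2f;
```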
I get that scaling a 32 px image down to 16 px causes artifacts; what I don't get is why each sprite shows different artifacts.
How does Unity handle downscaling textures when displaying them smaller than they actually are? Does it render the scene and scale the whole screen down, or does it sample each texture down independently?