Hi,
On my first game, Barnyard Bounce (fixed-function/3G), I ran a series of tests (this was my own engine, written in C++ for iOS) and noticed early on that the larger the texture, the slower it ran, provided I was drawing the full UV extents of the texture (i.e. a 1024x1024 texture on a 512x512 quad, in 2D).
However, mikamobile explained to me that this really isn’t the case, and never has been. What was I doing wrong in my old engine for this to occur? And do I ever need to worry about it in Unity?
If I don’t need to worry, why the hell would anyone use blurry or low-res textures in games? I see it so much.
Just a thought, but… Because they want to keep their binary size small? Or they only made their app for non-Retina iPhones? Or perhaps they just suck.
Hippo, have you tried testing out an extreme example of this in Unity? Like, a few dozen cubes, each with a different texture… try them all at 2k vs. 32x32 and see if there’s a significant difference in frametime via the internal profiler? It makes sense that there SHOULD be some difference, but I’ve never really pushed it that far. It’s just been my experience that reducing texture sizes a step or two in Unity (like 2k down to 1k, or even 512) has no appreciable effect compared to so many other things, though my own games have admittedly been less than worst-case.