I’ve got one user with the strangest set of symptoms in Mini Micro on his machine. Basically, the pixel displays (each a PixelSurface — a set of Quad tiles using the Sprites/Default material) are simply invisible for him. He is my only user on a Windows 7 machine, so that is a suspected factor. But the collection of symptoms is so strange:
Any pixel display simply does not draw (it appears transparent).
This is true even if I set all 8 layers to pixel mode.
However, the data is there under the hood; if I draw and then check .pixel, it returns the correct colors.
The display also returns the correct mode.
Sprites do show up.
No errors are found in the output_log.txt.
I really thought I’d find some errors in the log, but nope, it’s clean. The tiles are in the Default layer. I will add some code to verify that they are active in the hierarchy, but it’s hard to see how a logic flaw like that could affect only the one user on Windows 7.
I would think it’s some shader incompatibility with his older PC graphics hardware, but the tiles use the Sprites/Default shader, and other sprites show up fine.
The way PixelSurface works, it uses a texture if one is needed… but when a tile is all one color, it sets the texture to null and just sets the material color accordingly. I’ve tried both conditions, and it’s invisible either way. So it doesn’t look like a texture RAM issue. Besides, when he accesses Mini Micro on the web, the pixel surface does show up! So his hardware is capable of doing it, but it doesn’t work in the Windows standalone player.
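For reference, that tile logic amounts to something like the following sketch (the method and parameter names here are mine for illustration, not the actual PixelSurface API):

```csharp
// Hedged sketch of the tile behavior described above; names are
// illustrative, not the real PixelSurface code.
void ApplyTileAppearance(Material mat, Texture2D tileTex, bool solid, Color solidColor) {
    if (solid) {
        mat.mainTexture = null;   // tile is all one color: no texture needed
        mat.color = solidColor;   // the material color carries the whole tile
    } else {
        mat.mainTexture = tileTex;
        mat.color = Color.white;  // let the texture supply the colors
    }
}
```

Both branches of that logic are invisible on his machine, which is what rules out a texture RAM problem.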
I’m running low on theories here… any ideas what else I should be checking for? What could cause an object to simply not render on some machines?
I added some code to create a sphere object in the middle of each tile, with the material set to the standard default material, all default settings except for the color (which I set to yellow):
var ball = GameObject.CreatePrimitive(PrimitiveType.Sphere);
...
ball.GetComponent<Renderer>().material.color = Color.yellow;
And I hid all the other display layers, so there would be no issues with overdraw or occlusion or anything mucking with the depth buffer. On my machine, the result looks like this:
But on the Windows 7 machine, it looks like:
That looks like the Hot Pink of Broken Shader to me. And again, that’s using Unity’s standard default material. In the Inspector, it looks like this:
There are still no errors in the output_log.txt, but here is system info from the top of the file:
Would you expect the standard shader to work on this machine? Would you expect some error message if it does not?
I know Windows 7 is very old, but this user is important to me, and I really want to get my game working for him! Any thoughts will be very appreciated.
Not yet (the build-test cycle is rather slow, since the problem only occurs on the user’s machine), but that’s next on my list to try. Something really simple, like a mobile shader or maybe something custom. I know that some shaders work on his machine because he sees text, sprites, and “solid color” (a giant untextured quad) just fine…
It’s possible this is a red herring, since the PixelSurface quads I care about aren’t showing up as pink; they’re not drawing at all, as far as I can tell (as in you just see whatever’s behind them). But maybe that’s just a different kind of shader failure.
I don’t think Windows 7 by itself is the issue; I suspect …
… that’s the issue.
That’s an integrated graphics chipset that is at best 12 years old, and potentially older depending on which version of the 965 chipset is in use. Some of the GMA chipsets from that family can run Direct3D 10.0, but most are limited to Direct3D 9.0, and some don’t even support that in hardware, only emulating it in software. Unity dropped support for Direct3D 9.0 in Unity 2017.3, and in doing so also dropped Direct3D 10.0, which was never officially supported on its own (only Direct3D 11 was). As a result, a lot of shaders probably don’t include appropriate fallbacks in standalone Windows builds.
WebGL shouldn’t have this problem, since those shaders are built for what is effectively OpenGL ES 2.0-level hardware. I’m curious whether the web player works for him. Unfortunately there’s not really a solution for standalone, since you can only build to OpenGLCore for standalone builds, which a chipset that old will almost certainly not be able to run.
However, in general if you stick to 2D, mobile, or “legacy” shaders, I would expect them to run on his hardware.
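As a quick experiment along those lines, something like this hypothetical helper (not part of the project) could force every renderer onto one of Unity’s simplest built-in shaders at runtime. Note that `Shader.Find` only succeeds if the shader is actually included in the build (e.g. referenced by a material, or listed under Project Settings > Graphics > Always Included Shaders):

```csharp
using UnityEngine;

// Illustrative sketch: swap every renderer in the scene onto a simple
// built-in shader, to test what the old chipset can actually run.
public class FallbackShaderSwap : MonoBehaviour {
    void Start() {
        Shader simple = Shader.Find("Unlit/Texture");  // or "Mobile/Diffuse"
        if (simple == null) return;  // shader wasn't included in the build
        foreach (Renderer r in FindObjectsOfType<Renderer>()) {
            r.material.shader = simple;
        }
    }
}
```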
Hey thanks, that’s good info. Curiously, the WebGL build fails differently — the PixelSurface shader I’ve been tearing my hair out over works fine, but the sprites (which use a custom shader) do not. I guess this all fits with the chipset being old and the shaders not having the right fallbacks.
I think this is something I can work with, though, as my shader needs in this project are very simple: I’m going for a flat, retro look, all 2D, no lighting, no post-processing, no anti-aliasing, etc. Just shove some color onto the screen, please, exactly as it is in the texture. So I will try those mobile and legacy shaders, or write some dead-simple shaders myself, and if it works fine for this guy then I bet it will work fine on newer/better systems too. (And the wider the reach of this project, the happier I am.)
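For the record, a dead-simple shader along those lines might look like the sketch below — a generic minimal unlit transparent shader (with a built-in fallback), not anything I’ve tested on that chipset yet:

```shaderlab
Shader "Custom/FlatUnlit" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
        _Color ("Tint", Color) = (1,1,1,1)
    }
    SubShader {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        Blend SrcAlpha OneMinusSrcAlpha
        ZWrite Off
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;
            fixed4 _Color;

            struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
            struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

            v2f vert (appdata v) {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                return o;
            }

            // No lighting, no fog: just the texture color times the tint.
            fixed4 frag (v2f i) : SV_Target {
                return tex2D(_MainTex, i.uv) * _Color;
            }
            ENDCG
        }
    }
    Fallback "Unlit/Texture"
}
```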