So, I've done some sniffing around, and no one else seems to have this issue.
A shader I'm using (REPostProcessor/DreamBlurEffect) to blur the screen works perfectly fine in the Editor, I see it, I have access to it, etc.
But when I make a full build and play it externally, the blur effect doesn't work. Is it possible for a shader to work ONLY in the Editor, and not at all in a fully compiled game? The shader is there, and I still have access to everything, so it's still tangible. Its effects just aren't visible, is all.
I had a lot of problems with some screen shaders not working, but I managed to fix it after working through a combination of problems. My shaders would work in the editor, but when I built the standalone game they didn't appear at first, and then once I'd "fixed" that problem my screen was black.
Firstly, if your shader doesn’t appear at all when you build the game, it might be because Unity can’t find it. This will happen if you use Shader.Find() to locate your shaders. Put your shaders in the Assets/Resources folder to force them to be included. Alternatively modify your script with a reference to the shader itself instead of getting it with Shader.Find().
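For example, here's a minimal sketch of the direct-reference approach; the class and field names are just illustrative, not from the original poster's project:

```csharp
using UnityEngine;

public class DreamBlur : MonoBehaviour
{
    // Assign the shader in the Inspector instead of relying on Shader.Find(),
    // so Unity treats it as a scene dependency and includes it in the build.
    public Shader blurShader;

    private Material blurMaterial;

    void Start()
    {
        // Fallback: Shader.Find() only works in a build if the shader is
        // actually included (e.g. it lives under Assets/Resources).
        if (blurShader == null)
            blurShader = Shader.Find("REPostProcessor/DreamBlurEffect");

        blurMaterial = new Material(blurShader);
    }
}
```

Dragging the shader asset onto the public field in the Inspector is usually the most reliable option, since the build pipeline can see the reference.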
As a workaround, you simply need to render your effect to an intermediate buffer and then Blit to the destination without a material. It only takes a couple of extra lines.
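A rough sketch of what that workaround can look like in an `OnRenderImage` image effect (the material field is an assumption; wire it up to your own blur material):

```csharp
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class DreamBlurEffect : MonoBehaviour
{
    public Material blurMaterial; // material that uses the blur shader

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // Render the effect into a temporary intermediate buffer first...
        RenderTexture temp = RenderTexture.GetTemporary(source.width, source.height);
        Graphics.Blit(source, temp, blurMaterial);

        // ...then Blit to the destination WITHOUT a material.
        Graphics.Blit(temp, destination);

        RenderTexture.ReleaseTemporary(temp);
    }
}
```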
I hope this helps anyone looking at this problem in the future!
Actually, I’d try turning off the anti-aliasing. Someone where I work discovered that the shader I was having trouble with works if you lower the quality settings on the build. I went through the settings one by one, changing them from their defaults on ‘Beautiful’ quality to those on ‘Good’ quality, and the one that made the shader work was turning off AA. There are still issues, as I’m getting tearing on the edges of the polygons that use the shader, but it’s something to start with, at least.
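If you want to test this without editing every quality level by hand, AA can also be toggled from script (just a sketch for quick experimentation):

```csharp
// Disable MSAA at runtime on the current quality level.
// 0 turns anti-aliasing off; 2, 4, or 8 enable that many MSAA samples.
QualitySettings.antiAliasing = 0;
```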
Original Message:
I’m also having this problem with Unity 3. Shader works fine in the editor, but not on build. No crashes… it just doesn’t render. Render settings on build are identical to the editor, etc., and I can’t find any reason for it breaking. I wish someone had a real answer.
Just found an answer to this! I had the same problem where an unreleased Voxelform shader would work in the editor, but not when running from the player.
The offending line is: _NoiseVal1 ("Noise Value 1", Float) = 1257787.0
Strangely, changing 1257787.0 to a smaller value such as 125778.0 fixes the problem.
It also doesn’t matter whether this property is even used, so the cause wasn’t somewhere else in the code. I wonder what’s going on here? I’m guessing this won’t fix everyone’s problem, but at least we know there are some quirky differences between the editor and player.
I have a very similar problem: I cannot use ANY custom shader, not even the default one you get when creating a new shader. Everything renders black when playing the built game.
I tried all your suggestions but no luck.
Unfortunately, I don’t know what’s up with your shader… I did figure out what was happening on mine, though I entirely forgot I posted here with the problem so I never updated the post.
In my case, turns out that the anti-aliasing was what was breaking things. It wasn’t causing issues in the editor because the editor seems to refuse to use anti-aliasing. Specifically, I had screwed up the render queue which, when confronted with anti-aliasing, completely failed to render. Transparency in the wrong queue will certainly make things go funny.
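For reference, the render queue is set in the shader's `SubShader` tags; a transparent effect normally needs something along these lines (a generic sketch, not the exact shader from this thread):

```shaderlab
SubShader {
    // Transparent geometry should draw after all opaque objects;
    // putting it in the wrong queue can break rendering entirely,
    // especially once anti-aliasing is involved.
    Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }
    Blend SrcAlpha OneMinusSrcAlpha
    Pass {
        // shader passes go here as usual
    }
}
```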
I ended up upgrading my project to Unity 3, and the shader I was using turned out not to be compatible with Unity 3, so I just abandoned that shader altogether and created a new one (with a node-based shader editor).
Never did figure it out, but when you can't figure things out, sometimes it's best to move ahead with something else.