I’m working on a project that is meant to be distributed on the web, and recently posted a build for my family to kinda idiot-test. One of the responses was:
I’m still trying to track down more information from him, like what video card he’s got, but in the meantime: is there some weird setting that I’ve got wrong?
I’ve run it on my Macs and PCs here without a problem… are there known problems with the Unity Web Player and certain video cards?
Do you force all of your lights to be pixel lights? I know my old GeForce FX 5200 will only run well with two pixel lights in a scene at most, and perhaps their card is just choking on them in general.
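A minimal sketch of capping the pixel light count from script, assuming current Unity scripting; QualitySettings.pixelLightCount is the real property behind the Pixel Light Count quality setting, and the script name LowEndLightCap is made up for illustration:

using UnityEngine;

// Hypothetical helper: cap per-pixel lights for weaker cards.
// QualitySettings.pixelLightCount mirrors the Pixel Light Count value
// in the Quality settings; lights beyond the cap render per-vertex.
public class LowEndLightCap : MonoBehaviour
{
    void Awake()
    {
        if (QualitySettings.pixelLightCount > 2)
            QualitySettings.pixelLightCount = 2;
    }
}

Attach it to any object in the first scene; it only ever lowers the count, so players on better hardware keep their quality setting.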
Like Targos said, if you are using any “advanced” shaders on the buildings, try with just a basic diffuse.
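A quick way to run that test without editing every material by hand, sketched under the assumption that the buildings’ materials can all be swapped at runtime; the script name ForcePlainDiffuse is made up, and “Diffuse” is the built-in shader name (newer Unity versions expose it as “Legacy Shaders/Diffuse”):

using UnityEngine;

// Hypothetical debug script: drop every material in the scene to the
// basic Diffuse shader so fancier shaders can be ruled out as the cause.
public class ForcePlainDiffuse : MonoBehaviour
{
    void Start()
    {
        Shader plain = Shader.Find("Diffuse");
        if (plain == null)
            plain = Shader.Find("Legacy Shaders/Diffuse");
        if (plain == null)
            return; // neither name present; leave materials alone

        foreach (Renderer r in FindObjectsOfType<Renderer>())
            foreach (Material m in r.materials)
                m.shader = plain;
    }
}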
Actually, now that I think of it, I saw a similar issue once while testing something on a GeForce MX card; IIRC I was using the Bumped Specular shader, for what it’s worth.
Unity will fall back to the right shader in that case. You can preview shader fallbacks using the graphics emulation menu, under Edit → Graphics Emulation.
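To see the same thing from the player side rather than in the editor, here is a sketch of a runtime check; Shader.isSupported is the real Unity property, while the script name ShaderSupportReport is made up for illustration:

using UnityEngine;

// Hypothetical diagnostic: log any material whose shader the current
// video card can't run. Unity substitutes the shader's own Fallback
// automatically; this script just reports where that happened.
public class ShaderSupportReport : MonoBehaviour
{
    void Start()
    {
        foreach (Renderer r in FindObjectsOfType<Renderer>())
            foreach (Material m in r.sharedMaterials)
                if (m != null && !m.shader.isSupported)
                    Debug.LogWarning(r.name + ": shader '" + m.shader.name +
                                     "' is unsupported here; Unity will use its fallback.");
    }
}

A warning in the player log on a tester’s machine would point straight at the shader that is misbehaving on their card.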
Hmmm. OK… thanks for the tips, everyone. I had baked all the lighting in C4D, so all my materials were Self Illuminated - Diffuse, but I had turned the Ambient Light completely off.
I’ve turned it back on, and I’ll have them give it a try.
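For the record, the ambient light can also be forced from script as a sanity check; RenderSettings.ambientLight is the real Unity property behind the Render Settings ambient colour, and the script name AmbientSanityCheck is made up:

using UnityEngine;

// Hypothetical sanity check: if the scene shipped with ambient light
// at pure black, raise it to a dim grey so baked, self-illuminated
// materials still get a floor of visibility on every card.
public class AmbientSanityCheck : MonoBehaviour
{
    void Awake()
    {
        if (RenderSettings.ambientLight == Color.black)
            RenderSettings.ambientLight = new Color(0.2f, 0.2f, 0.2f);
    }
}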