There is one project published in two versions: web player and standalone player. Our PC users and I see two different quality results on various graphics cards.
I haven't changed any particular settings on the graphics cards.
I got poor results on quite old graphics cards (Radeon X800, GeForce 7900 GS), but I didn't use any particular shader or effect; only lightmap shadows baked in Unity 3.
On an 8800 GTS the shadows look good, but performance is really bad. I turned the quality down to Good and still only got 7 FPS (no speed difference at all between Fantastic and Good).
Something is seriously unoptimized, because I'm getting around 10 fps on my 5870. From what I can see it should run at 50x that speed. There are also some strange visual artifacts, as if the camera were using Don't Clear for its clear flags or something.
I saw no problems and the demo ran at max fps no matter what I did. I also didn't notice a change of quality in any of the quality levels except Fastest; Simple, Good, and Fantastic all looked the same.
I'm on Windows 7 with an 8800 GT, 8 GB RAM, and a 2.4 GHz quad core.
However, when I turned on the triangle count or vertex count display, the demo slowed to abysmal framerates. I'm not sure how you managed to make two numbers displayed on screen slow the demo down by a factor of 10 or more.
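(One plausible explanation, since the menu script itself isn't posted here: if the overlay recounts every mesh in the scene each frame, the counting, not the two GUI labels, is what hurts. The script below is a hypothetical sketch of that pattern, not the actual wiki menu code.)

```csharp
// Hypothetical sketch of a stats overlay that rebuilds its counts every frame.
// Scanning the scene per frame is slow, and mesh.triangles returns a fresh
// copy of the whole index array on every access, which is the real cost here.
using UnityEngine;

public class SceneStatsOverlay : MonoBehaviour
{
    int triangles;
    int vertices;

    void Update()
    {
        triangles = 0;
        vertices = 0;
        foreach (MeshFilter mf in FindObjectsOfType<MeshFilter>())
        {
            Mesh m = mf.sharedMesh;
            if (m == null) continue;
            triangles += m.triangles.Length / 3; // allocates a copy of all indices
            vertices += m.vertexCount;
        }
    }

    void OnGUI()
    {
        // The two labels themselves are cheap; the per-frame counting is not.
        GUI.Label(new Rect(10, 10, 250, 40),
                  "Tris: " + triangles + "\nVerts: " + vertices);
    }
}
```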
Yes (for Eric5h5), the model is still unoptimized… I stopped when I saw the bad shadow quality (not on my work PC).
Very different performance! (for Elmar Moelzer) On my very old home PC (1 GB RAM, Athlon XP 2400, Radeon X800) I get the full 60 fps frame rate (on the Fantastic quality setting… strange!).
But it's true (for TwiiK): with the triangle and vertex counts on, the fps goes down… lol.
I'll investigate the script code, but it's the same menu posted on the Unity wiki, with my simple modifications.
My first problem is that the lightmap shadows are too compressed, or have too little blur, on old graphics cards.
Sjm Tech, I now get tons of FPS too (as I mentioned earlier); it was the statistics display that slowed the whole thing down. I love statistics, so I had it turned on the first time. Now things run smoothly at the Fantastic setting as well.
So no worries there!
Either the graphics drivers are screwed on that second screenshot, or it’s running at 16-bit colour depth or something crazy like that. The graphics cards you mention are not what I’d call “quite old” at all, and should chew through a scene like this without any framerate or quality issues. This is assuming your geometry or scene settings aren’t mad, of course. If these are lightmap shadows, then take a look at the lightmap texture itself and see what’s going on. That banding suggests that the texture isn’t loading correctly on the other card.
It doesn’t explain the problems you’re having, but shadow quality is going to be directly related to topology and triangles on the mesh, too. If you have gigantic stretched or uneven triangles and problems like flipped normals and coplanar polygons/z-fighting, you’re going to get awful shadows. It looks to me like some of those walls are made of two triangles, yet they’re about 15 feet tall. You won’t get nice shadows on things like that.
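(To check that theory on the affected machine, a small sketch along these lines could dump what the baked lightmap textures actually look like after loading. LightmapSettings and Texture2D are standard Unity API, but the lightmapColor property name is from later Unity versions; the Unity 3-era property was named differently, so treat that detail as an assumption.)

```csharp
// Sketch: log basic info about the baked lightmap textures so the
// "good" and "bad" machines can be compared directly.
using UnityEngine;

public class LightmapInspector : MonoBehaviour
{
    void Start()
    {
        LightmapData[] maps = LightmapSettings.lightmaps;
        for (int i = 0; i < maps.Length; i++)
        {
            // lightmapColor is the modern property name; older Unity
            // versions expose the same texture under a different name.
            Texture2D tex = maps[i].lightmapColor;
            if (tex == null) continue;
            Debug.Log("Lightmap " + i + ": " + tex.width + "x" + tex.height +
                      " format=" + tex.format + " filter=" + tex.filterMode);
        }
    }
}
```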
Thanks Elmar for the update!
Hi xomg, my 3D model is not well optimized, and it's too big to get a good shadow bake. The shadows (1024) are stretched over a big surface, so I expected a poor result. My big disappointment is finding different results on different PCs. I'd rather get bad results everywhere (and then optimize my model in depth) than a mix of good and bad results!
AcidArrow, dreamora… yes, pixel shader 3.0 is fully supported (GeForce 7 series - Wikipedia).
Do you think it could be a pixel shader issue? My work PC with the 8800 GTS has 4.0… hmm… I'll do more tests.
Thanks
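(For those extra tests: Unity can report the shader model it detects at runtime, which makes it easy to correlate each machine with its result. SystemInfo is standard Unity API; wiring it up as a logger like this is just a suggestion.)

```csharp
// Logs the GPU name and supported shader model so results from
// different test machines can be compared.
using UnityEngine;

public class GpuInfoLogger : MonoBehaviour
{
    void Start()
    {
        // graphicsShaderLevel reports e.g. 30 for shader model 3.0, 40 for 4.0.
        Debug.Log("GPU: " + SystemInfo.graphicsDeviceName +
                  " | Shader level: " + SystemInfo.graphicsShaderLevel);
    }
}
```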
New test! Same surface extent, but only Unity GameObjects: 1 plane, 1 cylinder, 1 sphere, and 8 cubes.
Same bad result on the 7900 GS and Radeon X800!
Good result on the 8600 GTS and 8800 GTS!
I did many tests and my conclusion is that the problem is a bug in the lightmap display engine: the baked shadows are displayed well only on recent graphics cards with pixel shader 4!
I tried with a small, simple model: a default plane, a cube, a sphere, and a cylinder.
New update! I found the cause!
The cause of the issue is the texture filtering mode (bilinear, trilinear, or point) of the lightmap .exr file.
The Unity player disables bilinear and trilinear filtering of the lightmap on shader model 3 graphics cards (for now) and uses point filtering instead.
In my last test I set the filtering mode of the lightmap .exr to point, and I got the same (poor) result on every graphics card I tested (shader model 4: 8600 GTS, 8800 GTS; shader model 3: 7900 GS, 6800; shader model 2: ATI X800).
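(If anyone wants to reproduce this test from script rather than by editing the .exr import settings, something along these lines should work: force a filter mode on the baked lightmaps at runtime and compare cards. Again, lightmapColor/lightmapDir are the current property names and differ on older Unity versions, so this is a sketch, not the exact setup used above.)

```csharp
// Sketch: force a filter mode on all baked lightmaps at runtime to test
// whether filtering is what differs between cards.
using UnityEngine;

public class LightmapFilterTest : MonoBehaviour
{
    public FilterMode mode = FilterMode.Point; // try Point vs Bilinear

    void Start()
    {
        foreach (LightmapData data in LightmapSettings.lightmaps)
        {
            // Property names are the modern ones (Unity 3.x used different names).
            if (data.lightmapColor != null) data.lightmapColor.filterMode = mode;
            if (data.lightmapDir != null) data.lightmapDir.filterMode = mode;
        }
    }
}
```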
I hope that someone from the Unity team sees my trials and resolves (or teaches me how to resolve) the lightmap visualization problem.