Unity3D and a GeForce GTX 295

Hi,

Does anyone know if Unity3D under Windows can take full advantage of a GeForce GTX 295 card?

Those babies are dual cards, so 2 GPUs onboard.
Does Unity3D interface nicely with these, or is that just a stupid question and “of course” it works because that’s the job of the driver?

Any idea how much better performance we might expect compared to a GeForce 6800 GT or an NVIDIA Quadro FX 5500?

Regards,

Bart

This should be driver related.

As for a performance comparison, I would Google for a review.

The problem is that most sites only compare the best card (which for GeForce is the GTX 295) with at most two lower cards, like the GTX 285, 280, and maybe the 260, but almost never with cards that are several generations apart.

Any idea what the most important number for comparing performance would be? Fill rate?

I’d like to buy a GTX 295, but if the 285 is almost as fast, then that’s a much better buy nowadays.
The question is whether Unity would benefit from this dual-GPU setup.

I mean, just “out of the stomach” (I know that isn’t proper English :O), the GTX 295 will outperform the older cards by a large factor, and not just because of the two GPUs.

For a rough comparison you can just Google any game benchmark and compare the numbers. If you want specific numbers, you can also buy 3DMark, as that gives you access to their database, where you can compare your system with all the others whose owners submitted results.

Just in case you still want to buy the 285: I have read about many cases of those cards burning out due to very high operating temperatures, in a lot of cases around 90 degrees Celsius under normal working conditions. So if your system is not especially well cooled, I’d recommend buying the 295 instead. At least as far as I can see, the overheating problem should be solved with the 295.

The 295 is an impressive card.
Playing Far Cry 2 at 1920 with all the dials turned up is an experience to behold.

Unfortunately, Unity doesn’t work well with this card and requires the second GPU to be turned off; even then, performance is inferior to a cheaper card such as the 4850.

Woooops, so that’s a bug in Unity3D then?

I thought that if a program uses DirectX, it should work as long as your 3D card has good DirectX drivers?

I can imagine that one day dual-GPU cards like the GTX 295 will become quite standard.

So the advice is not to buy it yet if you want to develop Unity3D content on it?

What exactly does “not work” mean here, and how?

“Doesn’t work” = gives slower performance than a vastly less powerful card.

On all Unity content, or only your project?

We have all previous GeForce generations in our testing lab, but I think we don’t have a GTX series card. I guess we’ll have to get one to test.

Aras, if you want to include the card in the test lab (which I greatly encourage), then it would be good to have both the GTX 285 (the top-of-the-line single-GPU GeForce card) AND the GTX 295 (the top-of-the-line DUAL-GPU GeForce card), since they seem to work in somewhat different ways, and I think it’s mostly the dual-GPU aspect that may make or break performance.

Regards,

Bart

The GTX consistently underperforms even on Windows XP, so that should make tuning easy.
The Island demo is also slower on the GTX.

@Aras, did you test the HD 4800 series cards as well?
I was surprised to find that the 4890 performs equally to the 4850 (and it’s not due to CPU usage: when I set everything to Fast, I get a huge increase in FPS; terrain shadows, maybe?)

The GTX 295 is a single-board SLI solution;
that should be pointed out, as it makes a serious difference.

My GTX 280 is hard to get below 100 FPS; that’s what I even got in that “Crytek replication” web player demo with the nice lighting, etc.

The question still remains, however, of how well Unity3D web games or standalones run on a GTX 295.

I’m willing to invest in a good (super good) new 3D card for running Unity content on, but in that case it would be nice to know whether Unity really runs well on it.

As well as you were able to write your shaders to be SLI-ready.

The less SLI-ready they are, the worse the performance; at worst, slower than when you disable one of the two GPUs / SLI.

And what is your opinion, then, of the built-in shaders in Unity?

Are they good for SLI, or is that the biggest problem at the moment?

Regards,

Bart

If you set SLI to alternate frame rendering, the fullscreen shaders and the like will kill it.
That’s because most of them refer to the previous frame, which in the above SLI configuration simply means that they refer to data on the other card. So the SLI configuration can at best run as fast as a single GPU, but realistically even slower, since the data has to be transferred between the GPUs first.

You would need to make the shader refer to data from two frames back for two-card SLI, or n frames back for n-card SLI / CFX, or alternatively test it with other SLI modes if possible.
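
To make that concrete, here is a minimal C# sketch of an accumulation-style blur, roughly in the spirit of an effect like the built-in Motion Blur but not its actual code; blendMaterial and the “_BlurAmount” property are placeholders I made up. The accumTexture member is what carries last frame’s data into this frame, and under alternate frame rendering “last frame” lives on the other GPU:

using UnityEngine;

// Sketch only: an accumulation blur that deliberately keeps a buffer alive
// between frames. That buffer is the previous-frame dependency that hurts
// alternate-frame-rendering SLI, because the previous frame was rendered
// by the other GPU and has to be copied across first.
[RequireComponent(typeof(Camera))]
public class AccumulationBlurSketch : MonoBehaviour
{
    public Material blendMaterial;      // assumed: alpha-blends the new frame over the buffer
    public float blurAmount = 0.8f;     // 0 = no trail, closer to 1 = longer trail

    private RenderTexture accumTexture; // survives from one frame to the next

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        if (accumTexture == null ||
            accumTexture.width != source.width || accumTexture.height != source.height)
        {
            if (accumTexture != null)
                DestroyImmediate(accumTexture);
            accumTexture = new RenderTexture(source.width, source.height, 0);
            Graphics.Blit(source, accumTexture); // seed the buffer with the current frame
        }

        // Blend the current frame into the buffer that still holds LAST frame's result.
        blendMaterial.SetFloat("_BlurAmount", blurAmount); // hypothetical shader property
        Graphics.Blit(source, accumTexture, blendMaterial);

        // Present the accumulated result.
        Graphics.Blit(accumTexture, destination);
    }

    void OnDisable()
    {
        if (accumTexture != null)
            DestroyImmediate(accumTexture);
    }
}

A real effect would handle more edge cases, but the part that matters for SLI is simply that accumTexture outlives the frame.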

So for working with the basic shaders, you wouldn’t advise buying a GTX 295?

Does anyone at UT have comments on this? It’s maybe not a problem today, but in the near future it looks like SLI-on-one-card is going to become popular, and it wouldn’t be nice if shaders were the limiting factor there.

Probably Aras already has an answer ready for that…

Only shaders that depend on the previous frame (like Motion Blur or Constant Stretch). Other image effects are self-contained; they do not kill SLI performance.

If you plan SLI, avoid things that depend on the previous frame being present. That’s it.
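
For contrast, a self-contained effect looks roughly like this (again only a sketch; effectMaterial stands in for any one-pass fullscreen shader such as a colour correction). Everything it reads comes from the current frame’s source texture, so nothing needs to be shuffled between GPUs under alternate frame rendering:

using UnityEngine;

// Sketch of a self-contained image effect: no state is kept between frames,
// so alternate-frame-rendering SLI can keep both GPUs busy independently.
[RequireComponent(typeof(Camera))]
public class SelfContainedEffectSketch : MonoBehaviour
{
    public Material effectMaterial; // placeholder for any one-pass fullscreen material

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // One blit, reading only this frame's source: SLI-friendly.
        Graphics.Blit(source, destination, effectMaterial);
    }
}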