He’s not using Unity for it; he’s using his own custom-made Path Tracing engine written in CUDA, he’s just using Unity’s robot lab scene. Personally I don’t see the benefit for game development of having it implemented in Unity. It is amazing for path tracing and makes some incredible-looking still shots, but path tracing still isn’t at a point where it could be used for real-time games. That was being rendered on 2 (!!!) GeForce Titans and it still took a couple of seconds between the camera coming to a halt and the frame reaching full fidelity without any noise. If you add moving objects to the scene, the image would never fully converge, so there would be constant noise. If it were used in an FPS game (the kind of game such high-quality graphics would be ideal for), the camera movement is so constant that it would make it quite difficult to play through the noise.
The only use I can see it being suitable for would be architectural visualisation.
I really can’t wait for this kind of technology to come out! And there’s just no reason for devs not to dig into it… in fact there are some people already doing some nice stuff for Unity, with promising results…
Well, Unity’s primary use is for games, but this path tracing tech could be used in many applications that don’t need real-time rendering… in games, a nice application could be a switch for the player to take very high-res, crazy-quality screenshots, that would be really really nice! o//
@ Devilbox Games
2 Titans might be a lot now… but how much do you expect those cards to cost 5 years from now, or by the end of the next console generation? And how much better will the cards in general be by then… I mean, just compare the 9800 GTX (128 CUDA cores) from 5 years ago to the GTX 780 (2304 CUDA cores, close to the Titan’s 2688).
Personally I’m really excited for this tech because that is the direction things are going, and the sooner game developers can start messing around and getting familiar with it, the better. If it becomes available for Unity that’s fantastic because it’d make AAA graphics and lighting that much easier…
The point is that 2 Titans is not enough to handle realtime Path Tracing at an acceptable performance level for games. It still takes a second between the scene becoming idle with no movement and it being rendered at full quality without noise. That’s unacceptable for games, especially when you consider that scenes are never completely idle in games. To get it running at 30 fps without any noise or artifacts would require 30 times that power, or 60 times the power of a single Titan, and that’s only at around 720p. To get it to 1080p at 60 fps, which most games are expected to hit these days, would require twice the power again for the frame rate and 2.25 times more for the resolution (1080p has 2.25 times the pixels of 720p), roughly 270 times the power of a Titan. And that’s assuming linear scaling, which probably isn’t the case given how Path Tracing causes stalling on the GPU cores when thread divergence happens; rendering more pixels means more chance of thread divergence and more power needed to keep the performance up.
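For what it’s worth, here’s that back-of-the-envelope as a quick Python sketch; the ~1 second convergence time, the 2-Titan setup and the linear scaling with frame rate and pixel count are just the assumptions above, not measurements:

# rough estimate of the hardware needed for noise-free path tracing,
# assuming ~1 s to converge on 2 Titans at ~720p and linear scaling
titans_in_demo = 2
seconds_to_converge = 1.0
target_fps = 60
pixels_720p = 1280 * 720
pixels_1080p = 1920 * 1080

titans_needed = (titans_in_demo * seconds_to_converge * target_fps
                 * pixels_1080p / pixels_720p)
print(round(titans_needed))  # ~270 Titans' worth of power for 1080p at 60 fps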
Given that the Titan is about 1.5 times the performance of a 680, and using that as the annual increase in power of the top-end cards, a rough guesstimate suggests that the top-end card from Nvidia in 5 years’ time will be around 8 to 10 times the power of the Titan. New architectures could allow for more improvement, so to be safe let’s assume a card which is 20 times the power of the Titan in 5 years’ time. Two of those top-end cards would still not be enough to render it at 30 fps at 720p. In 10 years’ time we may just have a single card capable of doing 30 fps at 720p, by which point 4K and 8K displays will be pretty much standard. That doesn’t mean it would be worthwhile for a game developer to make a game with it at that point, even if that low resolution and frame rate were acceptable. That level of hardware needs to be mainstream before it’s worthwhile for developers to use it, say that year’s equivalent of a GTX 760. So add another 2 to 4 years on for that.
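Same sort of sketch for the projection; the ~1.5x-per-year growth rate is just the Titan vs 680 comparison above extrapolated, so take it with a grain of salt:

# project top-end GPU performance assuming ~1.5x growth per year
# (rate taken from the Titan vs GTX 680 comparison above)
titan_power = 1.0
growth_per_year = 1.5
for years in (5, 10):
    print(years, round(titan_power * growth_per_year ** years, 1))
# ~7.6x the Titan after 5 years, ~57.7x after 10 --
# still well short of the ~270x estimated above for noise-free 1080p at 60 fps

Which roughly lines up with the single-card-doing-720p-at-30fps-in-10-years guess, and is nowhere near 1080p at 60 fps.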
10 - 15 years is a very long time from now and if it takes that long for this quality of path tracing to reach 30fps for people with mainstream hardware, imagine how incredible rasterized graphics will be by then. Path tracing is great tech but the power requirements mean it simply won’t be able to catch up to rasterized rendering for a long time.
We had that discussion some days ago in another thread here.
Full realtime path tracing for games at an acceptable quality is not around the corner. Not even close, not even in the next decade. It’s more likely that rasterized, approximated rendering reaches the quality of path-traced renders within the next couple of years on consumer-level hardware. And if you look at some of the realtime tech these days, that’s already showing.
Realtime path tracing isn’t suitable for gaming at the moment and I’m not sure if it ever will be. There are, however, applications where this technology is useful and already applied (V-Ray RT). At the moment it can be seen as a bridge between realtime and offline rendering. An application for Unity would be having the Beast baking engine replaced by a realtime path-tracing engine. This could potentially accelerate the baking process by a huge amount.
There has been realtime raytracing done in other quarters, usually by making certain assumptions/limitations about the kind of geometry the scene can contain: fixed models, and perhaps limits on how much reflection/refraction is involved. Some of the demoscene demos do this. But making it really flexible and generic, able to handle any kind of game, is I think much more of a challenge. I think you can definitely do path tracing without reflections/refractions at a decent framerate; it’s much the same as a screen-space path-tracing renderer without any bounces. But the real benefits come from being able to do refractions, reflections and shadows/lighting much more accurately, and that takes a lot more effort.
It is well known that photon mapping gets you most if not all of the benefits of path tracing, but with less noise and less computational effort.
I think I’ve seen some SIGGRAPH papers on improvements to photon mapping that bring it closer to real time.
I’d say that real time photon mapping is coming sooner than real time path tracing, and it will look just as good.