Stunning demo scene: real-time raytracing with multi-bounce reflections and refraction

Hi,

I don’t follow the demo scene as much as I used to, but I came across this stunning demo from Fairlight the other day.
The real-time raytracing with multi-bounce reflections and refractions is quite breathtaking, though it really comes into its own just beyond the halfway point of the video, when they introduce shattering into the mix. The fluid/metaball simulation stuff is rather impressive too.

Recommend setting the resolution of the video to HD.

What is really nice is that the guy who makes these keeps a pretty informative blog (DirectToVideo) about his demos, explaining (albeit at a rather high level) how he went about creating the effects. Well worth a read if you are into real-time graphics.

That’s cool. Where’s the guy’s blog you speak of?

I like the depth of field effect. Nice-looking polygon-shaped bokeh.

Ah, well done, you spotted my deliberate mistake, you get a cookie :wink:

Direct To Video Blog

His posts on ‘a-thoroughly-modern-particle-system’ are a great read, and still pretty cutting edge (I think) despite being from 2009. I suspect most of it is now possible to achieve in Unity 4.0.

This will be a nice test for some Unity3D shader guru!

This demo employed a clever trick that left me more impressed by the end than I was at first.

At the start, all the geometry was very simple, just an extremely small number of polygons.
Raytracing such things is not computationally expensive at all, so with a good GPU raytracer it really isn’t that impressive that they could pull off some reflections.
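Just to put some rough shape on that cost argument (this is purely my own illustration, nothing from the demo): with no acceleration structure at all, a tracer is just every ray tested against every triangle, so the cost per bounce is roughly rays × triangles. Something like this in C++:

```cpp
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static float dot(Vec3 a, Vec3 b)  { return a.x*b.x + a.y*b.y + a.z*b.z; }

struct Triangle { Vec3 v0, v1, v2; };
struct Ray      { Vec3 origin, dir; };

// Möller–Trumbore ray/triangle test; writes the hit distance to *t on success.
bool intersect(const Ray& r, const Triangle& tri, float* t)
{
    const float eps = 1e-6f;
    Vec3 e1 = sub(tri.v1, tri.v0);
    Vec3 e2 = sub(tri.v2, tri.v0);
    Vec3 p  = cross(r.dir, e2);
    float det = dot(e1, p);
    if (det > -eps && det < eps) return false;   // ray parallel to triangle plane
    float inv = 1.0f / det;
    Vec3 s = sub(r.origin, tri.v0);
    float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return false;
    Vec3 q = cross(s, e1);
    float v = dot(r.dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return false;
    *t = dot(e2, q) * inv;
    return *t > eps;
}

// Brute force: every ray against every triangle, i.e. O(rays * triangles)
// intersection tests per bounce. Trivial for a few hundred triangles.
int countHits(const std::vector<Ray>& rays, const std::vector<Triangle>& tris)
{
    int hits = 0;
    for (const Ray& r : rays)
        for (const Triangle& tri : tris) {
            float t;
            if (intersect(r, tri, &t)) { ++hits; break; }
        }
    return hits;
}
```

With a small, simple scene that inner loop is nothing, even before you add any acceleration structure; it only becomes a problem when the geometry gets heavy or dynamic, which is exactly where the later part of the demo goes.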

However, towards the end they introduced complex, morphing implicit blobby objects, and even fracturing into many many small polygonal shards.
They must have used a sophisticated dynamic KD-tree or some such cleverness.

Good to see that such cool raytracing may soon be doable in games.

Demos like this traditionally employ very sophisticated, very hardcoded special-case code which only works within extremely tight imposed limits, a very specialised niche… which is how the demo scene has pretty much always managed to seem ahead of its time. Getting it flexible and generic enough to handle whole regular scenes, without lots of setup or special restrictions, will likely hurt the performance a great deal, so it might be a while yet before it really becomes that flexible.

If you read his blog that I posted, he goes into great detail about the methods he tried, including various tree-based spatial partitioning schemes. In the end, though, he found that for his purposes a simple two-stage grid worked best (i.e. it was a grid, with each cell split into a number of buckets). From memory, the driving force behind this decision was the requirement to handle dynamic geometry, so being able to update the spatial partitioning quickly and upload that data to the GPU was paramount.
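For anyone who wants a concrete picture of that kind of structure, here’s a rough sketch of how a two-stage grid could be laid out (my own guess at the shape of it, not his actual code; the resolution and bucket count are made-up numbers): flat arrays of per-cell counts and primitive indices, cheap to rebuild every frame and to push to the GPU as a couple of buffers.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Flat uniform grid with a fixed number of "bucket" slots per cell.
// Everything lives in plain arrays so a per-frame rebuild can be copied
// straight to the GPU as two buffers (counts + indices).
struct TwoStageGrid {
    static const int RES     = 64;  // cells per axis (assumed)
    static const int BUCKETS = 8;   // max primitives stored per cell (assumed)

    float minX, minY, minZ;         // scene bounds, refit every frame
    float cellSize;
    std::vector<uint32_t> counts;   // RES^3 entries: primitive count per cell
    std::vector<uint32_t> indices;  // RES^3 * BUCKETS entries: primitive ids

    void clear(float mnX, float mnY, float mnZ, float extent) {
        minX = mnX; minY = mnY; minZ = mnZ;
        cellSize = extent / RES;
        counts.assign(RES * RES * RES, 0);
        indices.assign(RES * RES * RES * BUCKETS, 0);
    }

    int cellIndex(float x, float y, float z) const {
        int cx = std::min(RES - 1, std::max(0, int((x - minX) / cellSize)));
        int cy = std::min(RES - 1, std::max(0, int((y - minY) / cellSize)));
        int cz = std::min(RES - 1, std::max(0, int((z - minZ) / cellSize)));
        return (cz * RES + cy) * RES + cx;
    }

    // Insert a primitive by its centroid; a real build would splat its
    // bounding box over every overlapped cell, this keeps the sketch short.
    void insert(uint32_t primId, float x, float y, float z) {
        int cell = cellIndex(x, y, z);
        uint32_t n = counts[cell];
        if (n < BUCKETS) {                             // overflow is silently dropped
            indices[cell * BUCKETS + n] = primId;      // here; a real system would
            counts[cell] = n + 1;                      // handle that properly
        }
    }
};
```

Traversal would then be a DDA walk along the ray through the cells, testing only the handful of primitives in each cell’s buckets; the win over a tree is that there are no pointers to patch up when the geometry changes every frame.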

Though the scene does start off simple, it’s still 50k+ polygons, which for raytracing with multiple bounces is quite a lot. However, as he was only interested in reflections/refractions he didn’t need to raytrace everything: he skipped the initial camera-to-object trace and just started with the first bounce.
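In other words (again just my reading of the blog, sketched the same way), the primary hit per pixel comes from the rasteriser’s G-buffer, and the ray tracer only ever sees secondary rays. Something along these lines, where GBufferSample is a hypothetical stand-in for whatever he actually stores per pixel:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

// What the rasteriser already produced for one pixel (hypothetical layout).
struct GBufferSample {
    Vec3 worldPos;   // position of the primary hit
    Vec3 normal;     // surface normal at that hit (assumed normalised)
};

struct Ray { Vec3 origin, dir; };

// Build the *first bounce* ray directly from the G-buffer, so the expensive
// camera-to-scene trace is done by rasterisation and the ray tracer only
// handles reflections/refractions from here on.
Ray firstReflectionRay(const GBufferSample& g, Vec3 cameraPos)
{
    Vec3 view = sub(g.worldPos, cameraPos);                     // camera -> surface
    float len = std::sqrt(dot(view, view));
    Vec3 v = scale(view, 1.0f / len);                           // normalise
    Vec3 r = sub(v, scale(g.normal, 2.0f * dot(v, g.normal)));  // reflect v about n
    Vec3 origin = {g.worldPos.x + r.x * 1e-3f,                  // small offset to
                   g.worldPos.y + r.y * 1e-3f,                  // avoid hitting the
                   g.worldPos.z + r.z * 1e-3f};                 // surface itself
    return {origin, r};
}
```

That alone saves a full trace per pixel, which matters a lot when you’re then doing multiple bounces on top of it.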

I do agree, however, that it’s really the last half of the video that is truly impressive; I loved his metaball/fluid morphs and the shattering effects. The start is good too, it just goes on for too long without much happening, and the visuals are the sort of thing we are used to seeing pre-rendered, so despite it all being real-time the start maybe doesn’t seem as groundbreaking as it really is.

Whilst that’s generally true, from reading his description of the method employed I feel it’s a lot more generalised than you might think. Though by the same token it’s also still quite restrictive: there is no texture mapping, for example.

Mind you, if you read his ‘thoroughly modern particle system’ posts from 2009, I’d say all of that is easily doable on today’s GPUs, and actually on ones several generations older. So it’s not always the case that this stuff takes ages to filter through.

However, ray-tracing is certainly developing at pace on the GPU. I wouldn’t be at all surprised to see AMD/NVIDIA moving to support this mixing of polygons and rays more in the future. It’s a nice compromise between the two approaches.