Screen Space Displacement Mapping (No Tessellation) - Progress

Edit: Now Available Here: http://oddityinteractive.com/SSDM/ssdm.html

or here directly if website ever goes down: https://gum.co/PpyqT

Hello everyone, just wanted to share the progress I have made on my path to the concept of Unlimited Detail in games. I had been messing with point clouds, trying to find a system that would render high detail in VR, and along the way this concept came about. So I implemented it, and the results are really great. I believe someone came up with this concept about 10 years ago; I was unaware of that at the time, but it is a great idea, and I came up with my own implementation.

I almost dismissed the idea because I thought using the screen as a mesh would cost too much, but the GPU came to the rescue; I love compute shaders so much))).

So basically it is just a shader and camera that generate screen maps to send to the compute shader for displacement, and the result is drawn with DrawProcedural. Unity just blows my mind with its ease of use and made this such a great task to explore.
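To give a rough idea of the flow in plain Unity terms (a simplified sketch, not the actual asset code; the class and kernel names here are just illustrative):

```csharp
using UnityEngine;

// Sketch of the flow described above: a second camera renders screen-sized
// maps, and a compute shader displaces one point per pixel into a buffer
// that is later drawn with DrawProcedural (see the sketch near the end of
// this thread for that part).
public class SsdmFlowSketch : MonoBehaviour
{
    public Camera displacementCamera;     // renders only displacement-mapped meshes
    public ComputeShader displaceCompute; // hypothetical kernel name "Displace"

    RenderTexture screenMaps;             // single map here for brevity; the asset
                                          // uses several MRTs, discussed below
    ComputeBuffer displacedPoints;        // xyz position + rgba color per pixel
    int kernel;

    void Start()
    {
        screenMaps = new RenderTexture(Screen.width, Screen.height, 24,
                                       RenderTextureFormat.ARGBFloat);
        screenMaps.Create();
        displacementCamera.targetTexture = screenMaps;

        // 7 floats per point: an illustrative layout, not the asset's actual one.
        displacedPoints = new ComputeBuffer(Screen.width * Screen.height,
                                            sizeof(float) * 7);
        kernel = displaceCompute.FindKernel("Displace");
        displaceCompute.SetTexture(kernel, "_ScreenMaps", screenMaps);
        displaceCompute.SetBuffer(kernel, "_Points", displacedPoints);
    }

    void LateUpdate()
    {
        // One thread per pixel: push each point out by its sampled height.
        displaceCompute.Dispatch(kernel, (Screen.width + 7) / 8,
                                 (Screen.height + 7) / 8, 1);
    }

    void OnDestroy()
    {
        displacedPoints?.Release();
        if (screenMaps != null) screenMaps.Release();
    }
}
```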

Anyway, I will be releasing it soon (hopefully right after the weekend) and just wanted to raise awareness for anyone who might be interested. I will be launching with a 75% launch discount, so I hope people will see its potential.

Take care and thanks for your interest!

I hope you can bear with my videos; I will get better at this))


Beautiful! Good luck with publishing this asset! Do you have other assets as well?
At what price are you going to sell it?


Thanks! Appreciate it! I did have some assets (Deep SSS and a few others), but they were written for 5.6 and are no longer compatible with 2018, so I pulled them. When I have time I will put them back after a revamp.

$25 is going to be the launch price (75% off); I just want to make it accessible to as many people as possible. It will contain full source, so I think it will be interesting to see what the community can build with it.


Looks great! Will this work in VR? Oculus Rift, and the soon-to-be-released Oculus Quest?


Hi, thanks! Yes, it works in VR, and I assume on Quest too, but I have no access to a Quest to know if anything could prevent use with it. Unity does a good job of handling all this for us, so I think it will be good! Any limitations I find, I will let everyone know).


Thank you!! I am thinking that besides ground, this would be great for buildings and such too. Looks very good. And fast… I am waiting to get a Quest too… I am guessing it might sell far better than the Rift. I imagine Unity will support the Quest fairly quickly too (I hope).

I am excited to see this in the store… I am retired, so I will need to get it when it comes out. Thanks so much.


Cool). Yes, you are right, I think the Quest will be very cool and open up many possibilities. Unity always surprises me with its great integration and ease of use with everything; I am sure they are already on it.

Oh yes, actually SSDM can be used on any mesh, even with animation. There are just some things to be aware of when creating content (displacement maps and the low-poly counterpart), but I will have a list of helpful tips for this.


Is this similar to what Euclideon does in their point cloud engine? They were initially going for gaming but totally blew it when they wanted to do everything with their engine (NIH syndrome). It was a real shame because it is perfect for things like rendering dense grass fields. So your hybrid approach sounds a lot better. Could it be used for grass?


Very interesting. Curious to see if this could be used with something like Megasplat for terrain. The Megasplat shader does have a system for custom addons.


Hey guys!

@Elecman , honestly I cannot speak for Euclideon or their point cloud renderer and search algorithm, but after all my experiments over the years, this is the closest I could think of to how Euclideon Island was done (not the point cloud renderer). It did blow me away, and I do not know if we will ever know for sure. So if they generate meshes from point clouds along with generated height maps, then, as in this, if all pixels on screen are points, a screen mesh of w*h triangles could theoretically be a similar approach (the rough sketch at the end of this post puts a number on that).
I have not actually tried grass, but I am sure something could be done with some modification: a second pass, distorted in its own final DrawProcedural shader. Plenty of things to experiment with for sure; even video or dynamic textures could produce some interesting effects, I think. Alpha is clipped, so only non-alpha position data is carried across.
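To put a rough number on that screen-mesh idea (a back-of-the-envelope sketch of my own, not the asset's code):

```csharp
using UnityEngine;

public static class ScreenGridSketch
{
    // Treat every pixel as a grid vertex and triangulate the grid:
    // 2 triangles per pixel quad, i.e. 2 * (w - 1) * (h - 1) triangles.
    // At 1080p that is roughly 4.1 million triangles, which is why doing
    // the displacement on the GPU matters so much.
    public static int[] BuildIndices(int w, int h)
    {
        var indices = new int[(w - 1) * (h - 1) * 6];
        int i = 0;
        for (int y = 0; y < h - 1; y++)
        for (int x = 0; x < w - 1; x++)
        {
            int v = y * w + x;
            indices[i++] = v;     indices[i++] = v + w; indices[i++] = v + 1;
            indices[i++] = v + 1; indices[i++] = v + w; indices[i++] = v + w + 1;
        }
        return indices;
        // Pixels whose alpha was clipped carry no position data, so their
        // triangles are effectively discarded, as described above.
    }
}
```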

@Bodyclock , it just comes down to the shader being able to export position, color + lighting, normals, and heightmap info to the MRTs, and the low-poly mesh or terrain being on the DisplacementMappingLayer. The rest is handled automatically in the compute shader from there. So if MegaSplat could be modded in such a way, I do not think it would be a problem.
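Roughly, the camera side of that looks like this (a simplified sketch; the texture formats and names here are just illustrative, not the actual asset code):

```csharp
using UnityEngine;

// The displacement camera culls to the DisplacementMappingLayer and binds
// four MRTs for position, color + lighting, normals, and heightmap, which
// the fragment shader fills via SV_Target0..3.
public class DisplacementMrtSetup : MonoBehaviour
{
    public Camera displacementCamera;
    RenderTexture positionRT, colorRT, normalRT, heightRT;

    void Start()
    {
        int w = Screen.width, h = Screen.height;
        positionRT = new RenderTexture(w, h, 24, RenderTextureFormat.ARGBFloat);
        colorRT    = new RenderTexture(w, h, 0,  RenderTextureFormat.ARGB32);
        normalRT   = new RenderTexture(w, h, 0,  RenderTextureFormat.ARGBHalf);
        heightRT   = new RenderTexture(w, h, 0,  RenderTextureFormat.RFloat);
        foreach (var rt in new[] { positionRT, colorRT, normalRT, heightRT })
            rt.Create();

        // Only meshes placed on the displacement layer get rendered here.
        displacementCamera.cullingMask = LayerMask.GetMask("DisplacementMappingLayer");

        // Bind all four color targets at once, sharing the position RT's depth.
        displacementCamera.SetTargetBuffers(
            new[] { positionRT.colorBuffer, colorRT.colorBuffer,
                    normalRT.colorBuffer, heightRT.colorBuffer },
            positionRT.depthBuffer);
    }
}
```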

Some thoughts over on the [MegaSplat thread](https://discussions.unity.com/t/646064/page-90#post-4100794), where I mentioned your product. I think you mention the problem of the thin sections in one of the videos. Looks like it could be integrated with MegaSplat reasonably easily. It might be worth starting a conversation with Jason.


Thing is though, with tight integration… there is a problem with MegaSplat and Gaia being in the same project. If MegaSplat is in, it seems the Gaia window can't start.

Whatever the cause, if this happens with this asset, I will be asking for my money back. I do not use any of that dev's products, but I do use Gaia, so I would not want to see that stop working or have other problems.

I am referring to this post: GAIA - AAA terrain generator, procedural texturing, planting and scene creation page-197#post-4072513


That post refers to MicroSplat, which is a completely different product. And this would not be a tight integration, merely a customisation to the MegaSplat shader. And this is all speculative at this point, merely an exploration of possibilities. You may be jumping the gun :wink:

Maybe so, but it might be the shader that is causing it somehow, I do not know for sure. Just saying that I use all of Adam's products, as do many people.

@Bodyclock , thanks for asking. As he said, there would be no problem making a modification for his product. As all source is included, it should be very simple for him to make a version that could use this, if he could see a benefit to it. Which is great! However, this is forward based, so it does not go to the g-buffer but to MRTs; I do not think it would be a problem to work with the g-buffer, though (will have to test).

An artifact on polygons at >90 degrees happens even with normal maps, and they are still used. Beveling edges can reduce this artifact, and pixel clamping can prevent bleeding. I am not sure how others implemented it in the past, because I have only seen two videos of it, which I only found a few days ago)). But I think this technique could be useful for a vast amount of things, especially things like telepresence with Kinect depth maps and textures. So the hard part for me is content and finding a good pipeline for it; however, from the results I am seeing, I think it is worth figuring out)). Of course, anything that even resembles an issue or artifact that I find, I will be making sure everyone knows about!

Edit: Oh yes, what he said about SSR could be a potential issue, but in VR the multiple cameras would mitigate most cases. The comparison isn't exact, as you just need a way to show a bit further around the mesh, not to see what is fully offscreen; it is more like the >90 degree thing I talked about. I am sure there will be some cases it is not suitable for, though; nothing is perfect).

I’ve used MegaSplat/MicroSplat and Gaia in the same project before; not sure what your particular issue was, but there shouldn’t be any issue between them.

The video mentions that the shader is using a vertex/fragment shader. Will it work with the deferred rendering path?


@castor76 , I believe it will; however, at the moment the custom shaders used for displacement are forward based: they output to MRTs, are displaced in the compute shader, and are drawn in Unity with DrawProcedural. I am sure it would be alright for the displacement camera to use forward while the rest of the scene uses deferred rendering. I will take note of this and see if they can be combined. Thanks!
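If that works out, the split might look something like this (a speculative sketch, not tested; the layer name is from my earlier post, the rest is illustrative):

```csharp
using UnityEngine;

// The displacement camera stays on the forward path (it outputs to MRTs,
// not the g-buffer) while the main camera renders the rest of the scene
// deferred.
public class RenderingPathSplit : MonoBehaviour
{
    public Camera mainCamera;
    public Camera displacementCamera;

    void Start()
    {
        mainCamera.renderingPath = RenderingPath.DeferredShading;
        displacementCamera.renderingPath = RenderingPath.Forward;

        // Keep the displacement layer out of the main camera, so its meshes
        // only appear via the displaced DrawProcedural result.
        mainCamera.cullingMask &= ~LayerMask.GetMask("DisplacementMappingLayer");
    }
}
```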


Hmm… interesting. So you use the second camera just to draw the displacement using the compute shader. How does the rendering pipeline work with the result afterwards? Or are you saying any model with displacement is rendered in the second camera, and the rest of the non-displacement objects are rendered in the other camera?


@castor76 , yes, a second camera and layer are used for the meshes that have displacement applied. The final results are put into dynamic buffers and returned to the Unity pipeline via DrawProcedural (OnRenderObject) to be drawn in the main camera. So the shader I am using at that stage is just a forward shader too, but at that point I am assuming I could also have an option for a deferred version. So it might be alright.
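The hand-off back to Unity looks roughly like this (simplified; buffer and property names are just illustrative, not the actual asset code):

```csharp
using UnityEngine;

// The compute results sit in a dynamic buffer; OnRenderObject injects them
// back into Unity's pipeline so the main camera draws them.
public class DisplacedDrawer : MonoBehaviour
{
    public Material drawMaterial;         // forward vertex/fragment shader that
                                          // indexes the buffer by SV_VertexID
    public ComputeBuffer displacedPoints; // filled by the displacement compute pass
    public int triangleCount;             // up to 2 per pixel of the screen mesh

    void OnRenderObject()
    {
        drawMaterial.SetBuffer("_Points", displacedPoints);
        drawMaterial.SetPass(0);
        // Graphics.DrawProcedural in 2018-era Unity; DrawProceduralNow in 2019+.
        Graphics.DrawProceduralNow(MeshTopology.Triangles, triangleCount * 3);
    }
}
```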
