I am developing an augmented reality project where parts of the mesh are located under the floor (imagine a house with a basement level).
To get a realistic visual effect I need to “open” the floor over that basement area.
Does anyone have any ideas on how to do that?
Thanks
Carsten
Another way would be a shader that makes an object (a fake floor mesh with a hole in the middle) fully transparent and culls all objects behind it.
Would this be possible?
Create a mesh that represents the floor and give it a material with this shader (remember to cut out a hole for the bit you want to see through).
It’ll write to the Z-buffer, so it’ll occlude any geometry that’s behind the polygons of the floor (because the depth test will say your floor mesh is in front of it, so nothing gets rendered there), but it will be otherwise invisible.
The “Queue” = “Geometry-50” tag means this shader will draw before the rest of the geometry in the scene.
The ColorMask 0 means this shader doesn’t render to either the RGB or A channels of your screen output (but WILL still render to the depth buffer).
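For reference, a minimal sketch of the kind of depth-mask shader described above - the shader name is illustrative, but the Queue, ColorMask and ZWrite settings are the important bits:

Shader "Custom/DepthMask" {
    SubShader {
        // Draw before regular scene geometry so the depth buffer is filled first.
        Tags { "Queue" = "Geometry-50" }

        // Write depth only; don't touch the colour channels.
        ColorMask 0
        ZWrite On

        Pass {}
    }
}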
Any idea how to recreate this using a custom vertex and fragment shader program? I need to apply this effect along with a couple of others and am struggling to understand how to produce this output.
The important part is the “Queue” = “Geometry-50” tag - this ensures it draws before things with “Geometry” as their queue.
So anything you write - either custom vert/frag or surface shader - that has a tag saying “Queue” = “Geometry” will be hidden by a mesh with the above shader applied.
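As a rough sketch of how the same thing might look as a custom vert/frag shader (assuming the built-in render pipeline; names are illustrative) - the colour output is irrelevant since ColorMask 0 discards it, only the depth write matters:

Shader "Custom/DepthMaskVertFrag" {
    SubShader {
        Tags { "Queue" = "Geometry-50" }
        ColorMask 0
        ZWrite On

        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata { float4 vertex : POSITION; };
            struct v2f { float4 pos : SV_POSITION; };

            v2f vert (appdata v) {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                return o;
            }

            // Colour is discarded by ColorMask 0; only the depth write has any effect.
            fixed4 frag (v2f i) : SV_Target {
                return fixed4(0, 0, 0, 0);
            }
            ENDCG
        }
    }
}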
Which skybox shader are you using?
Can you post the ZMask shader if it is any different from the one Farfarer posted?
What are the “Clear Flags” of the camera?
For what it’s worth, I was testing it with the default skybox shaders (the regular 6-image skybox and the cubemapped skybox) and the mobile one (which is the same as the 6-image one but without a tint, I think).
Also created a custom skybox shader based on the 6-image one and messed with its render queue and ZTest settings… to no avail.
Camera has to have Clear Flags set to Skybox otherwise it doesn’t render it at all.
I guess you could do a Source Engine thing and have a second camera in the centre of an actual box, with its rotation slaved to the main camera, and set it to render first.
I appreciate this is an old thread - but this is exactly the shader I’m looking for at the moment. I’m getting the clearing issue as well. I’ve put a Unity project together to see if anyone can help us get this working.
The sphere in the test scene has the shader above attached to it, and I’ve extracted the Unity skybox shader into a modifiable shader file. The skybox just reuses a single texture to save zip file size. Looks naff, but does the job.
Weird thing is, the scene view is rendering the shader properly, but the game view is displaying the artefacts.
Any help is appreciated. I’ve been reading about shaders/Cg for days, but I’ve still yet to figure out what I’m doing - it’s sinking in very slowly.
I just wrote a similar shader for a different thread, and I noticed it suffered from the same bug after reading your post.
I fixed it by creating a dummy camera (culling mask set to “Nothing”) with a lower depth than the main camera. Clear the dummy camera to “Skybox” and set the main camera to “Don’t Clear”.
This seems to work properly. I don’t know why this would be necessary. It looks fine in the editor, but doesn’t clear properly in-game. Strange.
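If it helps, here’s a rough sketch of that two-camera setup as a script (all names are illustrative, and this assumes the built-in render pipeline):

using UnityEngine;

// Attach to the main camera: creates a dummy camera that clears to the skybox
// first, then the main camera renders on top without clearing.
public class SkyboxClearFix : MonoBehaviour
{
    void Start()
    {
        Camera main = GetComponent<Camera>();
        main.clearFlags = CameraClearFlags.Nothing;  // "Don't Clear"

        GameObject dummy = new GameObject("Dummy Skybox Camera");
        dummy.transform.SetParent(main.transform, false);

        Camera cam = dummy.AddComponent<Camera>();
        cam.CopyFrom(main);
        cam.cullingMask = 0;                         // culling mask set to "Nothing"
        cam.clearFlags = CameraClearFlags.Skybox;    // clears to the skybox
        cam.depth = main.depth - 1;                  // renders before the main camera
    }
}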
Hmm… luckily, in my main game project I already do this (or at least something similar that I can play around with) as I have different cameras for rendering things at different distances. I’ll give it a go… though you’re right, I’m not sure why I need to!