We have many meshes that we render with a large number of shaders. Across all of them, we would like to add an additive effect for an emissive glitter term.
However, many of those shaders have quite flaky, and in the case of some Asset Store shaders, obfuscated, source code. So we’d really like to render this as an additional pass over all the meshes to avoid the complexity of adding this term to every shader.
What is the recommended way to render a mesh multiple times with different shaders in Unity? Off the top of my head we could:
Clone all the MeshRenderers, MeshFilters and SkinnedMeshRenderers onto child GameObjects, and render them with our glitter shader in a transparent layer, so we know they’re drawn after the main meshes but at the same position.
Add a component on the same GameObject as every Renderer that calls Graphics.DrawMesh on every mesh, though I don’t think this works with skinned meshes?
???
Note that I don’t want to do this as a post-process. I left out some details, but assume large numbers of per-object textures are referenced, so collapsing into a full-screen post-process isn’t viable. Additionally, we’re on mobile, so forward shading only is fine, at least for now.
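For reference, option 2 might look something like the following minimal sketch. (Everything here is an assumption standing in for the real project; note Graphics.DrawMesh takes a plain Mesh, so skinned meshes would need SkinnedMeshRenderer.BakeMesh each frame, which isn’t cheap.)

```csharp
using UnityEngine;

// Hypothetical sketch: re-submit this object's mesh every frame with a
// second, additive material. Attach next to a MeshFilter.
public class GlitterPass : MonoBehaviour
{
    public Material glitterMaterial; // assumed to render in the transparent queue

    MeshFilter _filter;

    void Awake()
    {
        _filter = GetComponent<MeshFilter>();
    }

    void Update()
    {
        if (_filter == null || glitterMaterial == null) return;
        // Draws the same mesh again at this transform's current position.
        Graphics.DrawMesh(_filter.sharedMesh, transform.localToWorldMatrix,
                          glitterMaterial, gameObject.layer);
    }
}
```

For a SkinnedMeshRenderer you would have to bake the current pose into a temporary Mesh with BakeMesh first, per frame.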
Nice. Yep, shader replacement with an additional camera looks like a great way to do it, and is much less hassle than cloning the objects, which involves either instantiating way more than you need and then destroying the extra non-renderer components, or creating your own renderers, which requires a lot of MeshRenderer/SkinnedMeshRenderer/MeshFilter special cases.
Or, just add the material to the existing renderer components as an additional material index.
A mesh with a single material ID but used with a renderer component with two materials assigned will be rendered twice. If your mesh has multiple material IDs, just add an additional entry for the sparkle material for each material ID.
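In code, that just means appending an extra entry to the renderer’s material array (a sketch; `sparkleMaterial` stands in for whatever your additive material is):

```csharp
using UnityEngine;

public static class SparkleSetup
{
    // Appending one extra material to a renderer makes the mesh's last
    // submesh render a second time with that material.
    public static void AddSparkle(Renderer renderer, Material sparkleMaterial)
    {
        var mats = renderer.sharedMaterials;
        System.Array.Resize(ref mats, mats.Length + 1);
        mats[mats.Length - 1] = sparkleMaterial;
        renderer.sharedMaterials = mats; // reassigning the array applies it
    }
}
```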
bgolus, if there are multiple materials involved, this technique does not work. I have a thread open and I went back and forth with another member a couple of months ago on this topic; I was never able to resolve it. I want to add outlines in my MMORPG for a selected object. Apparently, if a renderer has two materials, any new material you add to show your outline can only be applied to the part of the mesh referenced by the first material index (or was it the second?). Either way, it does not go 1 2 1 2 1 2 as you add more materials; instead it’s something like 1 2 1 1 1 1… Any other suggestions?
bgolus you are awesome!!! After seeing your post, I researched command buffers and DrawRenderer and I came up with this:
Material _outline = null;

void OnRenderObject() {
    if (_outline == null) {
        _outline = (Material)Instantiate(Resources.Load("outline"));
    }
    Camera.main.RemoveAllCommandBuffers();
    if (selObj != null) {
        var cb = new UnityEngine.Rendering.CommandBuffer();
        cb.name = "outline";
        foreach (var rend in selObj.GetComponentsInChildren<Renderer>()) {
            // sharedMaterials avoids instantiating per-renderer material copies
            for (int i = 0; i < rend.sharedMaterials.Length; i++) {
                cb.DrawRenderer(rend, _outline, i);
            }
        }
        // add the buffer once, after all renderers have been recorded
        Camera.main.AddCommandBuffer(UnityEngine.Rendering.CameraEvent.AfterEverything, cb);
    }
}
It works BEAUTIFULLY for outlining the selected objects in my game.
Am I doing this right by removing all command buffers on every call to OnRenderObject unless I have something selected, in which case I add one every time? I mean, it works perfectly, but I don’t know if this is optimized.
My setup is: when adding or removing selected objects, I add or remove only the command buffer linked to that object. That way I’m not making new command buffers every frame, which can generate a lot of garbage.
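That suggestion could be sketched like this (a hedged sketch, not a drop-in implementation; the `Select`/`Deselect` entry points and `outlineMaterial` are assumptions):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Rendering;

// One command buffer per selected object, added/removed only when the
// selection changes instead of being rebuilt every frame.
public class OutlineSelection : MonoBehaviour
{
    public Material outlineMaterial;

    readonly Dictionary<GameObject, CommandBuffer> _buffers =
        new Dictionary<GameObject, CommandBuffer>();

    public void Select(GameObject obj)
    {
        if (_buffers.ContainsKey(obj)) return;
        var cb = new CommandBuffer { name = "outline " + obj.name };
        foreach (var rend in obj.GetComponentsInChildren<Renderer>())
            for (int i = 0; i < rend.sharedMaterials.Length; i++)
                cb.DrawRenderer(rend, outlineMaterial, i);
        Camera.main.AddCommandBuffer(CameraEvent.AfterEverything, cb);
        _buffers[obj] = cb;
    }

    public void Deselect(GameObject obj)
    {
        if (!_buffers.TryGetValue(obj, out var cb)) return;
        Camera.main.RemoveCommandBuffer(CameraEvent.AfterEverything, cb);
        cb.Release();
        _buffers.Remove(obj);
    }
}
```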
Thanks. I updated my code to only rebuild the command buffers when the selected object changes. After messing with this for a while, though, I noticed problems. Other graphics get in the way of the outline, like a building or another person, which I thought was acceptable, but then I noticed that even fog gets in the way. Is there a way to make it draw my renderer on top of everything (except the GUI)?
EDIT: I tried messing with ZTest in my outline shader, but that just makes the entire figure glow through everything. It obviously needs to be blocked by the figure being outlined, but I don’t want it blocked by other stuff. That’s kind of a dilemma.
Are you sure your highlight shader itself isn’t applying fog to itself? If you’re rendering during CameraEvent.AfterEverything it should be rendered using the forward rendering path, in which case, if your shader has UNITY_APPLY_FOG in it, it’s going to have fog applied.
Please use the [ code ] tag instead of just Courier New for large blocks of code.
Use ColorMask 0 for passes you don’t want to render any color value, but for which you do want to render depth (or stencil) values.
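For example, a depth-only pass looks like this in ShaderLab (a minimal sketch):

```shaderlab
Pass
{
    ColorMask 0   // write no color at all
    ZWrite On     // but still write to the depth buffer
}
```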
Don’t use a Fallback for shaders that don’t need one. This shader isn’t supposed to be rendered with diffuse lighting or cast a shadow, so why use that as the fallback?
It’s generally best to avoid using fixed function shaders. They’re mostly deprecated and get converted into vertex/fragment shaders at this point, but they do way more than you need; by default they calculate fog even when it isn’t used.
Don’t pass uniform values that don’t change from the vertex function to the fragment function via interpolators; read them directly in the fragment shader instead. I realize this is just copied code from the built-in toon outline shader, but it’s inefficient. That was someone trying to be clever and doing more harm than good.
The vertex position output semantic in the v2f struct should be SV_POSITION, not POSITION. Also, the fragment shader should use the SV_Target semantic instead of COLOR. I suspect this is because you copied this from a very old version of the outline shader.
If you want to hide inner details, you would probably be better served using stencils.
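The usual stencil approach is two passes: the first draws the object itself, writing only a stencil value, and the second draws the extruded shell only where the stencil was not marked, so inner details are masked out. A hedged ShaderLab sketch:

```shaderlab
// Pass 1: mark the object's pixels in the stencil buffer, draw no color.
Pass
{
    ColorMask 0
    Stencil
    {
        Ref 1
        Comp Always
        Pass Replace
    }
}

// Pass 2: draw the extruded outline only where the stencil was not marked.
Pass
{
    Stencil
    {
        Ref 1
        Comp NotEqual
    }
    // the vertex shader here extrudes along the normals by _Outline
}
```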
Lastly, I don’t think your issue has anything to do with fog. It has more to do with using an old version of the outline shader that has a key piece missing that causes the outline to shrink when you move away rather than stay consistently scaled.
In the latest versions of the outline shader it has these lines:
#ifdef UNITY_Z_0_FAR_FROM_CLIPSPACE //to handle recent standard asset package on older version of unity (before 5.5)
o.pos.xy += offset * UNITY_Z_0_FAR_FROM_CLIPSPACE(o.pos.z) * _Outline;
#else
o.pos.xy += offset * o.pos.z * _Outline;
#endif
I’m so sorry about the font size and type. Yeah, this was a shader I found online.
I tried applying as many of your tips as I could to that existing shader and I ended up with: GLSL link error: “ERROR: Input of fragment shader ‘vs_COLOR0’ not written by vertex shader”.
Do you happen to know where I could find a stencil shader that does the outline I need? If not, at least maybe a vertex shader that is outline only, scales with distance, and shows through everything? I’m a good C# programmer but I don’t know anything about shader code. The code really confuses me.
How funny, I didn’t even need my command buffer code inside OnRenderObject, and I have since removed it from there. It has nothing to do with that event. It’s attached to my main camera and it stays there until I remove it. I have that part working perfectly. It’s just this dang shader.
Hi!
I am looking for a solution to a similar problem, but for a 2D sprite. The SpriteRenderer only takes one material, though. What I wish to achieve is to render the same sprite a second time with a slightly different setting (with a mask applied and a shader doing a stencil comparison), but without having to duplicate the GameObject.
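One option, in the spirit of the command buffer approach earlier in this thread, is to draw the existing SpriteRenderer a second time with the other material (a sketch; `stencilMaterial` and the camera event are assumptions):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: render an existing SpriteRenderer a second time with a
// different material, without duplicating the GameObject.
public class SecondSpritePass : MonoBehaviour
{
    public Material stencilMaterial; // e.g. a shader doing a stencil comparison

    CommandBuffer _cb;

    void OnEnable()
    {
        var rend = GetComponent<SpriteRenderer>();
        _cb = new CommandBuffer { name = "sprite second pass" };
        _cb.DrawRenderer(rend, stencilMaterial);
        Camera.main.AddCommandBuffer(CameraEvent.AfterForwardAlpha, _cb);
    }

    void OnDisable()
    {
        if (_cb != null)
            Camera.main.RemoveCommandBuffer(CameraEvent.AfterForwardAlpha, _cb);
    }
}
```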
Sorry to resurrect an old thread…
I have a similar issue. I’m doing a custom shadow render: I keep all my shadow caster objects in one collection and the receivers in another, and I render the casters top-down to a texture. Then, with a custom shadow receiver shader, I render all the receivers via a second camera (a clone of the main camera) using the replacement shader technique, in OnPostRender. It all (mostly) works great, but with one or two problems…
Main problem…
If I use any other image effect in OnRenderImage, it somehow wipes out anything I render during OnPostRender. The image effects are from “Colorful”; they don’t write to the z-buffer.
The Unity docs do not explain the relative timing of OnPostRender vs OnRenderImage. So where has my render gone?
The only thing I can imagine is that OnRenderImage buffers the main screen before OnPostRender, then runs its shader and writes back to the screen buffer after my render, hence wiping over everything.
The “Colorful” filters appear to just use a blit with a custom shader that doesn’t write to the z-buffer. So if my code runs after, it should still render something visible. It doesn’t; it’s just gone.
What is going on? How can I do my render after the main camera render, but before OnRenderImage?
Minor issue…
It would be nice to do my OnPostRender drawing after opaque objects and before transparencies. Is that possible with OnPostRender? Or should I dump the whole system and do something more like adding extra materials to each object, adding and removing them as I need to enable and disable the effect?
@bgolus …any ideas please ? help appreciated, thanks
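For reference, a command buffer attached at CameraEvent.AfterForwardOpaque executes after the opaque queue but before transparencies, which may address the minor issue and sidestep the OnRenderImage timing problem. A hedged sketch, with the receiver list and material standing in for the poster’s setup:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: draw shadow receivers between the opaque and transparent queues
// instead of in OnPostRender. "shadowReceiveMaterial" and "receivers"
// are assumptions standing in for the custom shadow system described above.
public class ReceiverPass : MonoBehaviour
{
    public Material shadowReceiveMaterial;
    public Renderer[] receivers;

    CommandBuffer _cb;

    void OnEnable()
    {
        _cb = new CommandBuffer { name = "shadow receivers" };
        foreach (var rend in receivers)
            _cb.DrawRenderer(rend, shadowReceiveMaterial);
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.AfterForwardOpaque, _cb);
    }

    void OnDisable()
    {
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.AfterForwardOpaque, _cb);
    }
}
```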