We are making a 2D game, and heavily using 2D sorting layers. Everything is fine on that end, but one part of our technology base is some custom code that uses Graphics.DrawMesh, as listed here in the API… Unity - Scripting API: Graphics.DrawMesh
Now the Graphics.DrawMesh call does have a layer argument, but I believe that is not referring to sorting layers, but to the old layer system instead. It's hard to check, though, as the link for the layer argument is a dead link.
Anyway, if I have other game objects, I can call this kind of code to make them work with sorting layers…
_renderer = gameObject.GetComponent<Renderer>();
_renderer.sortingLayerName = sortingLayerName;
I do not see any way to get a renderer object from a Graphics.DrawMesh call. Does anyone know a way to make Graphics.DrawMesh obey sorting layers so it integrates into our 2D game scene?
But you could use more than one camera.
For example, use Camera 1 to draw 2D stuff from layers 0-10, Camera 2 for your mesh, and Camera 3 for the 2D stuff from layer 10 upwards.
Then you set the rendering order (depth) of these cameras and it should work.
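Roughly, the wiring could look something like this (the layer name, camera fields and depth values are just placeholders, not a tested setup):

```csharp
using UnityEngine;

// Sketch of the multi-camera split: one camera for the 2D layers behind the
// mesh, one for the DrawMesh geometry, one for the 2D layers in front.
public class LayeredCameraSetup : MonoBehaviour
{
    public Camera backgroundCamera; // 2D content that should appear behind the mesh
    public Camera meshCamera;       // only the layer the DrawMesh calls target
    public Camera foregroundCamera; // 2D content that should appear in front

    void Start()
    {
        int meshLayer = LayerMask.NameToLayer("DrawMeshLayer"); // placeholder layer name

        backgroundCamera.cullingMask &= ~(1 << meshLayer);
        foregroundCamera.cullingMask &= ~(1 << meshLayer);
        meshCamera.cullingMask = 1 << meshLayer;

        // Cameras render in order of increasing depth.
        backgroundCamera.depth = 0;
        meshCamera.depth = 1;
        foregroundCamera.depth = 2;

        // Later cameras must not clear the colour drawn by earlier ones.
        meshCamera.clearFlags = CameraClearFlags.Depth;
        foregroundCamera.clearFlags = CameraClearFlags.Depth;
    }
}
```

You would then pass that same layer index as the layer argument to Graphics.DrawMesh, so only the middle camera picks the mesh up.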
Yeah, that came up in our discussions, but we were really hoping there might be a way to do it without multiple cameras. It would be nice to hear from a Unity dev, to know if they have considered this.
I’m surprised that it’s been 2 years and there’s been no response to this yet. It seems like quite a major issue that would be fairly easy to resolve, if I understand how DrawMesh works correctly…
I need this feature to make use of the far more efficient version of XWeaponTrail that shallway released which uses it. I could manually convert all the work back to using a mesh object… but this seems unnecessary…
It still has not been addressed, as far as I can tell. DrawMesh appears to be drawn on sorting layer 0, no matter what you do. You can use command buffers to inject at various points in the render pipeline, but that is horrendously slow on platforms that do not support instanced DrawMesh calls (1000 DrawMesh calls every frame is fine, whereas rebuilding a command buffer with 1000 DrawMesh calls every frame means a terrible framerate).
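For reference, the command-buffer injection I mean looks roughly like this; the key is to build the buffer once and only rebuild it when the content actually changes (the camera event here is just an example of where you might slot it in):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Rough sketch: draw the mesh at a fixed point in the camera's render
// sequence instead of via Graphics.DrawMesh.
public class CommandBufferMeshDrawer : MonoBehaviour
{
    public Mesh mesh;
    public Material material;

    CommandBuffer _buffer;
    Camera _camera;

    void OnEnable()
    {
        _camera = Camera.main;

        // Built once; the transform's matrix is baked in here, so the buffer
        // would need rebuilding if the mesh has to move.
        _buffer = new CommandBuffer { name = "Inject mesh" };
        _buffer.DrawMesh(mesh, transform.localToWorldMatrix, material);

        _camera.AddCommandBuffer(CameraEvent.BeforeForwardAlpha, _buffer);
    }

    void OnDisable()
    {
        if (_camera != null)
            _camera.RemoveCommandBuffer(CameraEvent.BeforeForwardAlpha, _buffer);
    }
}
```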
I want this, too. I’m currently making a rendering system that uses Graphics.DrawMeshInstancedIndirect() to render thousands of sprites efficiently. I’m stuck because of this. How can we make Unity notice this?
I wonder how ParticleSystem works…
It can render many meshes without having to spawn GameObjects for them, and you may use sorting settings.
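Something like this is what I mean, in case it helps (the sorting layer name and order are placeholders):

```csharp
using UnityEngine;

// Sketch of the ParticleSystem route: render particles as meshes and use the
// renderer's sorting settings, which it inherits from Renderer.
[RequireComponent(typeof(ParticleSystem))]
public class ParticleMeshSorting : MonoBehaviour
{
    public Mesh mesh;
    public Material material;

    void Start()
    {
        var psRenderer = GetComponent<ParticleSystemRenderer>();
        psRenderer.renderMode = ParticleSystemRenderMode.Mesh;
        psRenderer.mesh = mesh;
        psRenderer.material = material;

        // ParticleSystemRenderer is a Renderer, so the 2D sorting properties apply.
        psRenderer.sortingLayerName = "Foreground";
        psRenderer.sortingOrder = 5;
    }
}
```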
ParticleSystem would be enough for me if I could set MaterialPropertyBlock for each particle, but it seems like I can’t…
To provide more context: at the moment it is really difficult to extend the renderers to do anything extra. For example, I want to implement a variant of the SpriteRenderer that allows more custom data per vertex, and there is simply no easy way to do this for something this basic, because:
SpriteRenderer quickly descends into native code and has no interface for managing a custom sprite mesh.
There is no way to inherit or extend a renderer. I can’t inherit from Renderer and add it as a component.
Graphics.DrawMesh doesn’t work with sorting layers. A sorting layer is tied to a Renderer instance, and without a renderer nothing works with the rest of Unity.
I ended up writing a companion SpriteExtendedRenderer that manages the state of a MeshRenderer to simulate the behavior of an extended SpriteRenderer, except for the sorting-point option, which seems exclusive to SpriteRenderer. Annoying as hell and extremely cumbersome.
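For anyone trying the same thing, the core of that companion-renderer trick is only a few lines, since MeshRenderer inherits the sorting properties from Renderer even though the Inspector hides them (the class and field names here are just illustrative, not my actual code):

```csharp
using UnityEngine;

// Drive the hidden sorting fields of an ordinary MeshRenderer so the mesh
// sorts against SpriteRenderers like any other 2D content.
[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class SpriteLikeMeshRenderer : MonoBehaviour
{
    public string sortingLayerName = "Default";
    public int sortingOrder = 0;

    void Awake()      { Apply(); }
    void OnValidate() { Apply(); } // keeps the editor preview in sync

    void Apply()
    {
        var meshRenderer = GetComponent<MeshRenderer>();
        meshRenderer.sortingLayerName = sortingLayerName;
        meshRenderer.sortingOrder = sortingOrder;
    }
}
```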
If you want to use Graphics.DrawMesh, sort by render queue or by a different "Queue" tag in the shader instead, because sorting layers just define the draw-call order inside the render pipeline; you can’t slot a Graphics.DrawMesh call in between those sorting layers (unless you re-implement the pipeline yourself).
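A minimal sketch of what I mean by the render-queue approach (the shader and queue value are just examples); note that this only orders the draw within the queue, it does not hook into 2D sorting layers:

```csharp
using UnityEngine;

// Bias the material's render queue so the DrawMesh geometry draws after
// other transparent geometry.
public class RenderQueueDrawMesh : MonoBehaviour
{
    public Mesh mesh;
    public Material material;

    void Start()
    {
        material.renderQueue = 3001; // just after the default Transparent queue (3000)
    }

    void Update()
    {
        // The last argument is the GameObject layer, not a sorting layer.
        Graphics.DrawMesh(mesh, transform.localToWorldMatrix, material, 0);
    }
}
```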
I am fully aware of this, and that’s why Unity should really fix it, because having multiple subsystems that don’t work well with each other and casually asking users to re-implement an entire subsystem is not a usual approach for a game engine.