Help with CommandBuffers, Camera Events, and whatnot.

Hi,

I’m new to the subject of injecting things into the rendering pipeline, but I’ve made some progress so far. The effect I’m trying to achieve is to render specific meshes into manually picked render textures while ignoring everything else.

I’ve managed to get the basic rendering working, but, given my limited understanding, all the animated meshes come out static. They have an Animator, and when viewed through the main camera they animate fine. But with any other camera that renders to a RenderTexture, they only reflect changes to the transform (position, rotation, etc.), never the animation itself.

I feel like I’m missing something, but I wasn’t able to find out how the rendering engine works when it comes to animations. If someone could shed some light on that, it would be super useful.

I would have assumed that changes to the meshes (is that what animation is? I’m not sure anymore!) would be handled by the mesh renderer. But I guess not. Where is the “data” that represents the animation?
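From what I’ve pieced together since, the Animator evaluates the clips and writes the results into the bone Transforms each frame, and the SkinnedMeshRenderer then deforms its shared mesh from those bones at render time, so the animation never lives in the mesh renderer itself. A quick, purely hypothetical way to see the deformed vertices is to bake a snapshot of the current pose:

    using UnityEngine;

    // Hypothetical snippet, just to see where the animated vertices actually are:
    // BakeMesh writes the current skinned pose into a regular Mesh.
    public class PoseSnapshot : MonoBehaviour
    {
        public SkinnedMeshRenderer skinned;

        void Update()
        {
            var snapshot = new Mesh();
            skinned.BakeMesh(snapshot);          // current animated pose, this frame
            Debug.Log(snapshot.vertices[0]);     // moves as the animation plays
            Destroy(snapshot);                   // don't leak a mesh every frame
        }
    }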

Here’s my logic.

    // Draws every child Renderer of this object through the custom context.
    public void render(CustomRenderingContext context)
    {
        foreach (Renderer renderer in gameObject.GetComponentsInChildren<Renderer>())
        {
            DrawRenderer(renderer, context, -1, 0);
        }
    }

    // Forwards a single renderer to the context's draw call with its shared material.
    void DrawRenderer(Renderer rend, CustomRenderingContext context, int submeshindex, int shaderpass)
    {
        context.graphics.DrawRenderer(rend, rend.sharedMaterial, submeshindex, shaderpass);
    }

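For context, here’s a simplified sketch of how this kind of thing gets attached to a camera. CustomRenderingContext is my own wrapper, so the assumption here is that context.graphics is just a plain CommandBuffer; targetCamera, targetTexture and the class name below are placeholders, not my actual code:

    using UnityEngine;
    using UnityEngine.Rendering;

    // Simplified sketch, assuming the wrapper boils down to a CommandBuffer
    // injected at a camera event, as in the thread title.
    public class PickedRendererDrawer : MonoBehaviour
    {
        public Camera targetCamera;
        public RenderTexture targetTexture;

        CommandBuffer buffer;

        void OnEnable()
        {
            buffer = new CommandBuffer { name = "Draw picked renderers" };
            buffer.SetRenderTarget(targetTexture);
            buffer.ClearRenderTarget(true, true, Color.clear, 1f);

            foreach (Renderer rend in GetComponentsInChildren<Renderer>())
                buffer.DrawRenderer(rend, rend.sharedMaterial, 0, -1); // submesh 0, all passes

            targetCamera.AddCommandBuffer(CameraEvent.AfterForwardOpaque, buffer);
        }

        void OnDisable()
        {
            targetCamera.RemoveCommandBuffer(CameraEvent.AfterForwardOpaque, buffer);
            buffer.Release();
        }
    }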
I found the culprit. The SkinnedMeshRenderer was being disabled in Awake() and then rendered through my custom code. For some reason, that made it so it couldn’t be shown animated.

I changed renderer.enabled = false; to renderer.forceRenderingOff = true; and that fixed it.
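If my understanding is right, renderer.enabled = false stops the SkinnedMeshRenderer from updating its skinning at all, while forceRenderingOff keeps it animating and only stops cameras from drawing it through the normal pipeline, so my custom draw still gets the deformed mesh. As a minimal sketch (the class and variable names are placeholders):

    using UnityEngine;

    public class HideButKeepAnimating : MonoBehaviour
    {
        void Awake()
        {
            var skinned = GetComponentInChildren<SkinnedMeshRenderer>();

            // Old approach: the renderer stops updating, so my custom draw only
            // ever showed a static pose.
            // skinned.enabled = false;

            // New approach: skinning keeps running, the renderer just isn't drawn
            // by cameras normally, leaving it to my command buffer.
            skinned.forceRenderingOff = true;
        }
    }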