First, let me provide an overview of what is happening in this project. I am running a crowd simulation with a compute shader, then drawing the simulation using DrawMeshInstancedIndirect. Depending on each instance's distance from the camera, it gets assigned to one of several LOD meshes.
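The draw side looks roughly like this (heavily simplified; the field and buffer names are placeholders, but the structure is one indirect args buffer and one indirect draw per LOD):

```csharp
// Simplified sketch of the per-LOD draw (placeholder names, not my real code).
// argsBuffers[lod] is a ComputeBuffer of type IndirectArguments holding
// (indexCount, instanceCount, startIndex, baseVertex, startInstance).
for (int lod = 0; lod < lodMeshes.Length; lod++)
{
    Graphics.DrawMeshInstancedIndirect(
        lodMeshes[lod], 0, crowdMaterial, simBounds,
        argsBuffers[lod], 0, propertyBlocks[lod]);
}
```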
This all works perfectly fine in the Editor (which runs on the DX11 API). But the device (Meta Quest) build needs to use the Vulkan API. In the build, the effect is pretty broken: certain LODs don't render correctly or at all, which never happens in the Editor.
After trying to debug a few possible issues with the sim itself, I was able to repro the issue by enabling the Vulkan API for the Editor. I then started experimenting with adding GL.Flush calls where order mattered (e.g. between setting parameter values on the compute shader and actually dispatching it), and this seems to have fixed the problem.
I then found resources suggesting that unless commands are issued as part of a command buffer, they are not guaranteed to execute in order. So I suspect this is the cause of the issue, particularly for rendering each LOD, which requires one dispatch of the compute shader per LOD, each with different parameters set on it.
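Concretely, the per-LOD dispatch with the workaround currently looks something like this (simplified; the kernel and parameter names are placeholders, not my real ones):

```csharp
// Per-LOD dispatch with the GL.Flush workaround (placeholder names).
int kernel = crowdCompute.FindKernel("AssignToLod");

for (int lod = 0; lod < lodMeshes.Length; lod++)
{
    crowdCompute.SetInt("_LodIndex", lod);
    crowdCompute.SetFloat("_LodMaxDistance", lodDistances[lod]);
    crowdCompute.SetBuffer(kernel, "_InstancesForLod", lodInstanceBuffers[lod]);

    // Without this flush, some LODs render with the wrong parameters
    // (or not at all) when running under Vulkan.
    GL.Flush();

    crowdCompute.Dispatch(kernel, Mathf.CeilToInt(instanceCount / 64f), 1, 1);
}
```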
But now that I have reconfigured the dispatch calls to go through a single CommandBuffer (which, as I understand it, is the way Vulkan prefers to receive its work), I am simply not seeing anything show up at all.
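Roughly, the CommandBuffer version looks like this (again simplified with placeholder names; only the compute dispatches were moved onto the buffer, and the indirect draws are still issued as before):

```csharp
// using UnityEngine.Rendering;
// Rough shape of the CommandBuffer-based dispatch (placeholder names).
var cmd = new CommandBuffer { name = "Crowd LOD dispatch" };
int kernel = crowdCompute.FindKernel("AssignToLod");

for (int lod = 0; lod < lodMeshes.Length; lod++)
{
    cmd.SetComputeIntParam(crowdCompute, "_LodIndex", lod);
    cmd.SetComputeFloatParam(crowdCompute, "_LodMaxDistance", lodDistances[lod]);
    cmd.SetComputeBufferParam(crowdCompute, kernel, "_InstancesForLod", lodInstanceBuffers[lod]);
    cmd.DispatchCompute(crowdCompute, kernel, Mathf.CeilToInt(instanceCount / 64f), 1, 1);
}

Graphics.ExecuteCommandBuffer(cmd);
cmd.Release();
```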
When I switch the Editor API back to DX11, the effect renders properly.
I could just leave the GL.Flush calls in place and call it a day, but that feels highly hacky and probably hurts performance on device. Does anybody have an idea of what might be going wrong here?