I have a situation where a couple of assets seem to be missing once my game has loaded and entered the main menu. This only happens sometimes, and only on some machines. It looks like a timing problem combined with asset loading/unloading. When the problem occurs, I can confirm that assets are missing at the time the main menu is rendered: a RenderDoc capture shows that some texture layers are absent from some 3D models.
How can I find out the reason why those texture layers are missing?
The game does not use any AssetBundles.
The game uses additive, async scene loading before entering the main menu; it never loads scenes in Single mode.
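For context, the loading flow is roughly like the following sketch (scene names and structure here are placeholders, not my actual code):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Illustrative sketch of the loading flow described above; scene names
// are placeholders. Scenes are loaded additively and asynchronously,
// and LoadSceneMode.Single is never used anywhere.
public class Bootstrapper : MonoBehaviour
{
    IEnumerator Start()
    {
        yield return SceneManager.LoadSceneAsync("SharedAssets", LoadSceneMode.Additive);
        yield return SceneManager.LoadSceneAsync("MainMenu", LoadSceneMode.Additive);
    }
}
```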
The game's own source code never calls Resources.UnloadUnusedAssets(). (It is possible that a third-party library calls it.)
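To check whether a third-party library is the culprit, I'm considering scanning the plugin DLLs for the method name; any managed assembly that calls Resources.UnloadUnusedAssets must reference it by name in its metadata, so a byte-level string search should at least flag candidates. A rough sketch (the directory to scan is an assumption; in my case it would be the project's Assets folder or Library/ScriptAssemblies):

```csharp
using System;
using System.IO;
using System.Text;

// Hedged sketch: search managed DLLs for the literal string
// "UnloadUnusedAssets". A hit doesn't prove the method is called,
// but assemblies that call it will contain the name in their
// metadata, so this narrows down which plugins to inspect further.
public static class UnloadCallerScan
{
    public static void Scan(string pluginDir)
    {
        byte[] needle = Encoding.ASCII.GetBytes("UnloadUnusedAssets");
        foreach (string dll in Directory.GetFiles(pluginDir, "*.dll", SearchOption.AllDirectories))
        {
            if (IndexOf(File.ReadAllBytes(dll), needle) >= 0)
                Console.WriteLine("Possible caller: " + dll);
        }
    }

    // Naive byte-array substring search.
    static int IndexOf(byte[] haystack, byte[] needle)
    {
        for (int i = 0; i <= haystack.Length - needle.Length; i++)
        {
            int j = 0;
            while (j < needle.Length && haystack[i + j] == needle[j]) j++;
            if (j == needle.Length) return i;
        }
        return -1;
    }
}
```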
Using the DirectX Debug Layer to generate a log of resource activity, I can see that in both "good" and "bad" runs, all the textures are created as Direct3D resources at startup. In a "bad" run, however, some of those Direct3D resources are freed almost immediately after the application starts running (before it has finished loading into the main menu), rather than when I exit the game.
Can I log when Unity loads or unloads individual assets? Is there any way to find out more about why a particular asset was loaded or unloaded?
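In the meantime, I'm thinking of narrowing down *when* an asset disappears with a watcher along these lines (a hedged sketch; the texture name and lookup are placeholders for my actual assets). It stores only the instance ID, not a hard reference, since a serialized reference from a live scene object would itself count as a "use" and prevent Resources.UnloadUnusedAssets() from unloading the asset:

```csharp
using UnityEngine;

// Hedged diagnostic sketch: record the instance ID of a texture that
// later goes missing, then poll each frame to log the exact moment
// Unity destroys the native object. Resources.InstanceIDToObject
// returns null once the object is gone, and holding only the integer
// ID does not keep the asset alive. The texture name is a placeholder.
public class AssetUnloadWatcher : MonoBehaviour
{
    public string textureName = "MyProblemTexture"; // placeholder name
    int watchedId;
    bool reported;

    void Awake()
    {
        DontDestroyOnLoad(gameObject); // survive additive scene loads/unloads
    }

    void Start()
    {
        // Find the already-loaded texture by name (includes assets not in any scene).
        foreach (var tex in Resources.FindObjectsOfTypeAll<Texture2D>())
        {
            if (tex.name == textureName)
            {
                watchedId = tex.GetInstanceID();
                Debug.Log($"Watching '{textureName}' (instance id {watchedId})");
                return;
            }
        }
        Debug.LogWarning($"'{textureName}' is not loaded yet; nothing to watch");
    }

    void Update()
    {
        if (!reported && watchedId != 0 && Resources.InstanceIDToObject(watchedId) == null)
        {
            reported = true;
            Debug.LogError($"Texture '{textureName}' was destroyed by frame {Time.frameCount}, " +
                           $"t={Time.realtimeSinceStartup:F2}s");
        }
    }
}
```

Logging the frame and time of destruction would at least let me correlate the unload with the scene-loading timeline and the DirectX Debug Layer output. Is there a better built-in way?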