I just spent several days trying to figure out why I kept getting an error when combining a specific mesh, but only in builds. It was SO strange, since it only happened with that one mesh, and doing certain things would “fix” it in an incredibly strange manner.
Turns out I stupidly enabled “Optimize Mesh Data” (Strip Unused Mesh Components) in the player settings without knowing what exactly it does.
Edit: turns out it’s on by default, so it wasn’t stupidity
Soo yeah if you’re having strange issues with meshes, try this simple trick! And don’t randomly enable features without knowing what they do. :^)
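For reference, the checkbox lives under Edit > Project Settings > Player > Other Settings, and as far as I can tell it maps to PlayerSettings.stripUnusedMeshComponents, so you can also flip it from an editor script if you prefer. Rough sketch (the menu item name is just made up):

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only helper; put it in an Editor folder.
public static class OptimizeMeshDataToggle
{
    [MenuItem("Tools/Toggle Optimize Mesh Data")]
    private static void Toggle()
    {
        // Toggle the same setting the "Optimize Mesh Data" checkbox controls.
        PlayerSettings.stripUnusedMeshComponents = !PlayerSettings.stripUnusedMeshComponents;
        Debug.Log("Optimize Mesh Data is now " +
                  (PlayerSettings.stripUnusedMeshComponents ? "ON" : "OFF"));
    }
}
```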
I’ll go back to banging my head against the wall.
Good luck :3
On a related note, it’d be cool to have the option to choose which components get stripped, sort of like how you can choose which vertex channels are compressed.
Thanks for the warning, but uh, isn’t it common sense? Do we need to be told that pressing things without knowing what they do causes problems? Honestly?
I also had a problem with “Optimize Mesh Data” that took quite a few hours to figure out. I was loading Normal Maps at runtime, and they wouldn’t show in the build because that feature was enabled.
I just checked. The checkbox is on by default. So I don’t think the OP “stupidly” enabled it. It was always enabled.
It used to be very safe to leave on, but (semi)recently they changed its behaviour and it has been known to obliterate more data than it should (I have filed at least one bug report on this).
We just encountered this issue ourselves! We have meshes for the ‘characters’ in our game that contain special ‘skinning’ data in UV0, UV1, UV2, etc. (skinning as in ‘in-game cosmetics’). We have different shaders for different skins that use the various UV channels for different effects.
The character mesh is assigned to the character prefab’s MeshFilter. The MeshRenderer on the character prefab has a default material that only uses UV0. At runtime, we swap the material out for different ones that use UV1, UV2, etc. It works fine in the editor, but when the game is built these additional channels are stripped out, since Unity thinks the mesh will only ever be rendered with a shader that uses UV0.
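To make it concrete, the runtime swap is basically this (names are made up for the example, not our actual code):

```csharp
using UnityEngine;

// Illustrative only: swaps the character's material at runtime.
// Because the prefab itself only references the default UV0-only material,
// the build pipeline never sees the UV1/UV2 shaders and strips those channels.
public class CharacterSkinSwapper : MonoBehaviour
{
    [SerializeField] private MeshRenderer characterRenderer;
    [SerializeField] private Material[] skinMaterials; // shaders sampling UV1, UV2, ...

    public void ApplySkin(int skinIndex)
    {
        characterRenderer.sharedMaterial = skinMaterials[skinIndex];
    }
}
```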
This assumption is probably fine in most cases, and we’d love to keep Optimize Mesh Data turned on, but there is no way to exclude specific meshes from the optimization process, so we have no choice but to turn it off for the entire project just because of a handful of meshes that need to retain their UV channels.
The alternative is to have a scene that contains every character mesh with every possible material assigned to each of them, and to include that scene in the build so that Unity won’t strip out the channels, but that doesn’t seem worth the effort for us. (The size difference between optimization on and off was about 2 MB for us, so the hassle isn’t worth saving 2 MB.)
Same issue here. I spent an hour finding out that this was the cause of incorrect half4 texcoord3 UV mapping on Android (using 4 UV channels in a custom CG shader)… when it’s turned off, everything works. It didn’t strip the UVs outright; some of the coordinates were just incorrect.
Have you tried disabling it for the entire project and writing an AssetPostprocessor that removes unneeded mesh data instead, following whatever rules your project needs?
That way “optimize mesh data” would effectively still work inside the editor, and you’d have per-asset control.
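Something along these lines, assuming you only want to strip extra UV channels and vertex colors from meshes under a particular folder (the path and the stripping rules are placeholders, obviously):

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only: runs after each model import. Put it in an Editor folder.
public class ProjectMeshStripper : AssetPostprocessor
{
    void OnPostprocessModel(GameObject root)
    {
        // Hypothetical rule: only strip environment meshes; character meshes
        // that need their extra UV channels are left alone.
        if (!assetPath.StartsWith("Assets/Models/Environment"))
            return;

        foreach (var filter in root.GetComponentsInChildren<MeshFilter>())
        {
            Mesh mesh = filter.sharedMesh;
            if (mesh == null)
                continue;

            // Assumes these meshes are only ever rendered with shaders that
            // read UV0 - clear the other UV channels plus vertex colors.
            mesh.uv2 = new Vector2[0];
            mesh.uv3 = new Vector2[0];
            mesh.uv4 = new Vector2[0];
            mesh.colors = new Color[0];
        }
    }
}
```

Since this modifies the mesh at import time, the editor and the build see the same data, which is kind of the point.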