I was optimizing my code, and when I ran my game with the profiler to see what happens when it lags, I saw a "dirty" scene object consuming CPU. I've been googling it and can't find a clear answer. Can anyone explain this, for the benefit of everyone else who runs into this kind of issue? TIA
“Dirty” refers to a scene object whose state has changed, so data derived from it is out of date and needs to be recomputed.
“Flag” and “bit” are used more or less interchangeably in programming — both mean a single bit of data that can be in one of two states. We call those “true” and “false”, or sometimes “set” and “cleared”.
When the scene object’s state changes, we set the flag. When we need the object’s state, we check the flag: if it’s set, we run whatever calculation brings the object up to date, then clear the flag. The flag answers the question, “Is the scene object out of date?” For reasons that aren’t entirely clear, the traditional name for this out-of-date-ness is “dirty”. Hence: a dirty flag. “Dirty bit” is an equally common name for the same pattern.
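To make that concrete, here's a minimal sketch of the pattern. The class name and the "world position" computation are just illustrative stand-ins, not anything Unity-specific — the point is that changing state only sets the flag, and the expensive recompute happens lazily on the next read:

```python
class SceneObject:
    """Caches a derived value behind a dirty flag."""

    def __init__(self, x=0.0, y=0.0):
        self._x = x
        self._y = y
        self._dirty = True           # start dirty so the first read computes
        self._cached_world = None

    def move(self, dx, dy):
        # State changed: set the flag instead of recomputing immediately.
        self._x += dx
        self._y += dy
        self._dirty = True

    def world_position(self):
        # Recompute only if the flag is set, then clear it.
        if self._dirty:
            self._cached_world = (self._x, self._y)  # stand-in for real work
            self._dirty = False
        return self._cached_world

obj = SceneObject()
obj.move(1.0, 2.0)
obj.move(0.5, 0.0)           # two moves, but only one recompute below
print(obj.world_position())  # → (1.5, 2.0)
```

The payoff is that many state changes between reads cost one recompute instead of one per change — which is also why a profiler can show dirty objects burning CPU: every frame they're dirty, something is paying to clean them.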
Sounds like EditorUtility.SetDirty: a scene object marked dirty keeps any changes made to it after the game stops playing and carries them over into edit mode.