For each frame executed, Time.frameCount is increased by one. Based on what I'm reading here, some piece of code is setting lastRecalculation = Time.frameCount. I would have expected it to be done right here, but it should be done somewhere in this script. What that line does is prevent repeated calls to RecalculateValue() within the same frame from making repeated calls to ProcessData.AndDoSomeCalculations(). Note: if you don't find anywhere in your script that 'lastRecalculation' is being set to 'Time.frameCount', then your code should probably be:
static private var lastRecalculation = -1; // -1 = "never recalculated yet"

static function RecalculateValue () {
    // Already recalculated this frame; skip the expensive work.
    if (lastRecalculation == Time.frameCount)
        return;

    lastRecalculation = Time.frameCount;
    ProcessData.AndDoSomeCalculations();
}
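To see why the guard collapses repeated calls, here is a minimal sketch outside Unity, in TypeScript, with a simulated frame counter. The names frameCount, recalcCount, and recalculateValue are stand-ins for illustration, not Unity APIs:

    // Simulated stand-ins for Unity's Time.frameCount and the expensive call.
    let frameCount = 0;         // pretend engine frame counter
    let recalcCount = 0;        // how many times the heavy work actually ran
    let lastRecalculation = -1; // sentinel: "never recalculated yet"

    function recalculateValue(): void {
      // Bail out if we already recalculated during this frame.
      if (lastRecalculation === frameCount) return;
      lastRecalculation = frameCount;
      recalcCount++; // stands in for ProcessData.AndDoSomeCalculations()
    }

    // Three calls in the same frame -> one recalculation.
    recalculateValue();
    recalculateValue();
    recalculateValue();
    // Next frame -> one more.
    frameCount++;
    recalculateValue();
    console.log(recalcCount); // 2

So no matter how many scripts call it per frame, the expensive calculation runs at most once per frame.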
Since frameCount is a signed 32-bit integer, it will EVENTUALLY reach its maximum value (2,147,483,647) and wrap around to its minimum (-2,147,483,648). In binary that's going from 01111111... (all ones except the sign bit) to 10000000... (only the sign bit set), not from all ones to all zeros. At a continual 60 fps it would take over 400 days of play to reach that limit, and then roughly another 400 days of counting up from the minimum to finally reach -1, the sentinel value used above.
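The wraparound and the timescale are easy to check. In TypeScript, `| 0` truncates a number to a signed 32-bit integer, mimicking what an int counter like frameCount would do (a sketch; Unity itself uses C#, where unchecked int addition wraps the same way):

    const INT32_MAX = 2147483647;  // 2^31 - 1, binary 01111...1
    // Adding 1 wraps to the minimum, not to zero:
    const wrapped = (INT32_MAX + 1) | 0;
    console.log(wrapped);          // -2147483648, binary 10000...0

    // Days of continual play at 60 fps before the counter wraps:
    const daysToWrap = INT32_MAX / 60 / 86400;
    console.log(daysToWrap.toFixed(0)); // 414

So "more than a year" is about right: roughly 414 days to the wrap, and about the same again counting up to -1.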
So while it's unlikely to ever create a bug, it could if you do something stupid like set the frame rate to 10,000 (at that rate the wraparound arrives in about two and a half days).
So I have a question: Why would Unity use a signed int for this?