What is a good way to slow the agent down during testing?

I am communicating with Unity through the Python API. Right now I have a trained agent that I’d like to visually inspect during testing. If I do time.sleep(0.5) in Python before calling step() on the Unity environment, StepCount increments by more than 1. It actually increases by 16, which makes me think StepCount is incremented somewhere else.
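A likely explanation (an assumption on my part, not something the post confirms): in ML-Agents, StepCount counts Academy/physics steps, and a DecisionRequester with a DecisionPeriod greater than 1 advances several of those per decision, so a single step() from Python can move StepCount by a whole decision period. A toy Python model, with DECISION_PERIOD = 16 chosen only to match the observed jump:

```python
# Toy model (not Unity code): one external step() advances the agent's
# StepCount by DECISION_PERIOD physics steps, not by 1.
DECISION_PERIOD = 16  # hypothetical DecisionRequester.DecisionPeriod


def python_step(step_count):
    """Model one env.step() from Python: Unity runs physics steps
    until the next decision is due, incrementing StepCount each time."""
    for _ in range(DECISION_PERIOD):
        step_count += 1  # one FixedUpdate / Academy step
    return step_count


step_count = 0
step_count = python_step(step_count)
print(step_count)  # 16: StepCount jumped by 16 for a single Python step()
```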

If I instead don’t have any sleep time in Python, and in C# do:

public void FixedUpdate() {
    WaitTimeInference();
}

void WaitTimeInference() {
    stepz = StepCount;

    if (timeSinceDecision >= timeBetweenDecisionsAtInference) {
        timeSinceDecision = 0f;
        RequestDecision();
    } else {
        timeSinceDecision += Time.fixedDeltaTime;
    }
}

public override void OnActionReceived(float[] vectorAction) {
    numActions++;
}

then I get the desired result of actions being taken more slowly. Obviously, here stepz increments far faster than numActions.
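The gap between stepz and numActions follows directly from the timing in that snippet: stepz mirrors StepCount, which advances once per FixedUpdate (every Time.fixedDeltaTime seconds), while numActions advances only once per decision. A rough Python model of the loop, using Unity's default fixedDeltaTime and a hypothetical timeBetweenDecisionsAtInference of half a second:

```python
# Rough Python model of the WaitTimeInference loop above.
fixed_delta_time = 0.02        # Time.fixedDeltaTime (Unity default)
time_between_decisions = 0.5   # timeBetweenDecisionsAtInference (assumed)

time_since_decision = 0.0
stepz = 0        # incremented every FixedUpdate, like StepCount
num_actions = 0  # incremented once per decision, like OnActionReceived

for _ in range(1000):  # 1000 FixedUpdates = 20 simulated seconds
    stepz += 1
    if time_since_decision >= time_between_decisions:
        time_since_decision = 0.0
        num_actions += 1   # RequestDecision -> OnActionReceived
    else:
        time_since_decision += fixed_delta_time

# stepz ends at 1000 while num_actions stays in the high thirties:
# roughly one decision every 25-26 physics steps, matching the mismatch.
print(stepz, num_actions)
```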

Is there a better way to slow the agent down for testing? I obviously can’t rely on MaxStep, since StepCount continues to update, and putting time.sleep(0.5) in Python still results in StepCount incrementing elsewhere within Unity.

If you want all the behaviors to remain the same and only run more slowly, that code snippet might not be exactly what you want. It makes the decision period longer, but the environment physics keeps running during each period, so an agent trained normally might perform differently when you do inference using WaitTimeInference.

Have you tried modifying timeScale or fixedDeltaTime? They slow down time inside Unity itself, so they don’t affect the step counting.

It seems like when I modify timeScale it behaves as expected, but if I do what the docs suggest and also change fixedDeltaTime via Time.fixedDeltaTime *= Time.timeScale, it seems to have no effect.

Looking at the docs, 0.02 should result in 50 calls per second. If I set Time.timeScale to 0.5, then that math makes it 0.01, or 100 calls/second. Wouldn’t I want to make it Time.fixedDeltaTime /= Time.timeScale to get it to be 0.04, or 25 calls/second? Or am I misunderstanding the interaction between those variables?
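For what it’s worth, the arithmetic in the question checks out; this is just the math from the post, not a claim about Unity’s actual behavior:

```python
# Quick arithmetic check of the fixedDeltaTime / timeScale math above.
fixed_delta_time = 0.02   # Unity's default, i.e. 50 physics steps/second
time_scale = 0.5

print(1 / fixed_delta_time)  # 50.0 calls per second at the default

# What the docs suggest: multiply by timeScale
multiplied = fixed_delta_time * time_scale
print(multiplied, 1 / multiplied)  # 0.01 -> 100 calls/sec (faster, not slower)

# What the poster expected: divide by timeScale
divided = fixed_delta_time / time_scale
print(divided, 1 / divided)        # 0.04 -> 25 calls/sec (slower)
```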

EDIT: I think the documentation is wrong. Time.timeScale seems to be the only thing you need to change. If you look under https://docs.unity3d.com/ScriptReference/Time-fixedDeltaTime.html it states that it’s affected by Time.timeScale.

Oh! If you find what you feel is a mistake in the documentation, could you please fill out the Documentation Feedback form, which can be found at the bottom of every documentation page - both in the core platform docs & in package docs. This allows us to collate all feedback about a specific page in the same place, and act on all of it at once. :slight_smile: