I would like to measure the mean length of my episodes in seconds.
I’m training an agent, and from the “Episode Length” plot in TensorBoard I read a value of 4.5. This should mean the agent makes 4.5 decisions per episode on average (see the previous thread, Unit of measure of Episode Length on Tensorboard).
My Decision Period on the Decision Requester is 12, so the agent makes a new decision every 12 steps.
My fixedDeltaTime is 0.0133 s.
Hence, I suppose the mean episode length in seconds should be (0.0133 * 12) * 4.5 ≈ 0.72 s.
0.0133 * 12 is the time in seconds that passes between two consecutive decisions.
Multiplying this by 4.5 should give the desired mean episode length in seconds.
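Written out as a quick sanity check, here is the same arithmetic as a minimal Python sketch (the variable names are just illustrative, not from the ML-Agents API):

```python
# Illustrative sketch of the conversion described above.
fixed_delta_time = 0.0133          # Unity Time.fixedDeltaTime (seconds per physics step)
decision_period = 12               # Decision Period on the Decision Requester (steps per decision)
mean_decisions_per_episode = 4.5   # value read from the "Episode Length" plot

seconds_per_decision = fixed_delta_time * decision_period          # 0.1596 s between decisions
mean_episode_seconds = seconds_per_decision * mean_decisions_per_episode
print(mean_episode_seconds)        # ≈ 0.72 s
```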
Is this computation correct?
This seems correct to me, but when I watch the training, the agent appears to move more slowly than I would expect.
Thank you very much.