TensorBoard when using the inference engine

I have trained a model and now only want to run it in inference. When selecting inference only, the mlagents-learn workflow won't work, which is a problem because I want to keep using the handy TensorBoard logs for monitoring environment parameters in the inference-only simulation. These environment parameters are added to a StatsRecorder, and this worked very well during training, when the mlagents-learn command was used. Is there an easy way to modify my code so that the StatsRecorder data is still stored when running inference?

I do not see why you would want to connect to TensorBoard when running inference inside the Unity engine. One hack I can see is to use the resume-training functionality but set the learning rate to 0. That way the model will no longer change, and the TensorBoard stats should still be written.
What is your use case for monitoring the TensorBoard stats when doing inference?
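Concretely, that hack is a one-line change in the trainer config. The snippet below is only a sketch: the behavior name is a placeholder, and if your ML-Agents version rejects an exact 0 for the learning rate, a very small value such as 1e-10 has the same practical effect.

```yaml
behaviors:
  MyBehavior:            # placeholder: use your own behavior name
    trainer_type: ppo
    hyperparameters:
      learning_rate: 0.0 # updates become no-ops, so the model stays fixed
```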

Thanks for your reply. My reason is that I monitor environment parameters, such as population sizes, as a function of step, and I use pretrained agents that I don't necessarily want to keep training. Your suggestion of setting the learning rate to 0 is what I did, and it worked fine, but the simulations might have been faster if it were possible to run in an inference-only mode.

The best way to do this is to use --resume together with --inference: the trainer won't train at all but should still record to TensorBoard.
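For example (config path and run-id here are placeholders for your own):

```shell
mlagents-learn config/my_config.yaml --run-id=my_run --resume --inference
```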

You could speed it up further by setting your batch and buffer sizes to something very small. Taking it a step further, you could also make a "fake" agent with a very small observation/action space and set the real agents to heuristic mode.
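As a sketch of that speed-up in the trainer config (the behavior name and values are illustrative; for PPO the buffer size is expected to be a multiple of the batch size):

```yaml
behaviors:
  MyBehavior:          # placeholder: use your own behavior name
    trainer_type: ppo
    hyperparameters:
      batch_size: 32   # minimal batch: the update is effectively discarded anyway
      buffer_size: 320 # small multiple of batch_size
```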

Otherwise, it's probably easier to write monitoring code inside Unity than to pass the data to Python to write it out to TensorBoard.