I’ve just added memory to my PPO hyperparameters:

use_recurrent: true
memory:
  sequence_length: 64
  memory_size: 256
I’m getting these two warnings when generating the brain:
UserWarning: Exporting a model to ONNX with a batch_size other than 1, with a variable length with LSTM can cause an error when running the ONNX model with a different batch size. Make sure to save the model with a batch size of 1, or define the initial states (h0/c0) as inputs of the model.
WARNING: The shape inference of prim::Constant type is missing, so it may result in wrong shape inference for the exported graph. Please consider adding it in symbolic function.
Apparently, the batch size here refers to the size of the model’s input tensor at export time, not the batch_size in the PPO training configuration.
How do I save the model with a batch size of 1?
Or can all this be ignored for now?
Thanks