As the title implies, I am working on a pipeline for exporting a model trained in ML-Agents so that I can run inference with it from inside a Python script using TensorFlow. Informed by this and this forum post, my planned pipeline is:
1. export the model trained in ML-Agents to ONNX using tf2onnx
2. import the ONNX model in a Python script
3. create a TensorFlow representation of the model in Python with onnx-tf
4. run inference using TensorFlow, similar to the tutorial here: tutorials/tutorials/OnnxTensorflowImport.ipynb at master · onnx/tutorials · GitHub (a rough sketch of steps 2-4 follows this list)
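For reference, this is roughly what I expect steps 2-4 to look like in Python once the conversion works. The observation size (OBS_SIZE) and the assumption that the model takes a single vector-observation input are placeholders; the real values depend on the trained agent.
import numpy as np
import onnx
from onnx_tf.backend import prepare

OBS_SIZE = 8  # placeholder: length of the agent's vector observation

onnx_model = onnx.load('model.onnx')                   # step 2: import the ONNX model
tf_rep = prepare(onnx_model)                           # step 3: TensorFlow representation
dummy_obs = np.zeros((1, OBS_SIZE), dtype=np.float32)  # dummy observation batch of size 1
outputs = tf_rep.run(dummy_obs)                        # step 4: run inference
print(outputs)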
Steps 1 and 2 seem to work, but I am stuck on an error at step 3 when trying to create a TensorFlow representation of the ONNX model created in ML-Agents. Given a trained ONNX model "model.onnx", here is a simple Python script that reproduces the error:
import onnx
from onnx_tf.backend import prepare

# Load the ONNX model exported from ML-Agents (step 2)
onnx_model = onnx.load('model.onnx')
# Build the TensorFlow representation (step 3); this is the call that fails
tf_model = prepare(onnx_model)
The error I receive:
Traceback (most recent call last):
  File "/home/ross/miniconda3/envs/autofly_py3/lib/python3.7/site-packages/tensorflow/python/framework/ops.py", line 1654, in _create_c_op
    c_op = pywrap_tf_session.TF_FinishOperation(op_desc)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Input must be scalar but has rank 1 for '{{node one_hot}} = OneHot[T=DT_FLOAT, TI=DT_INT32, axis=-1](strided_slice__20, const_fold_opt__56, strided_slice_3, strided_slice_2)' with input shapes: [2147483647], [1], [], [] and with computed input tensors: input[1] = <2>.
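In case it is useful for diagnosis, the OneHot node named in the error can be inspected directly from the ONNX graph with a generic snippet like the one below (plain onnx API, nothing ML-Agents specific):
import onnx

onnx_model = onnx.load('model.onnx')
# List every OneHot node and its input tensor names; the second input is the
# 'depth' value that the TensorFlow backend expects to be a scalar.
for node in onnx_model.graph.node:
    if node.op_type == 'OneHot':
        print(node.name, 'inputs:', list(node.input))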
I realize that my problem may well be specific to my model.onnx file, but I am posting in the hope that this is a more general problem that others have seen and can help fix. For reference, I am using:
python==3.7.6
onnx==1.7.0
onnx-tf @ git+https://github.com/onnx/onnx-tensorflow@44c09275a803e04eeeb4e0d24c372adf1f9ff1f5
tensorboard==2.2.2
tensorflow==2.2.0
tensorflow-addons==0.10.0
tensorflow-estimator==2.2.0