We are trying to load a ViT-based UNETR AI model using Sentis and would like to know if such models are supported. The ONNX file is ~300MB, and the model has 87M parameters. A smaller UNet (~6MB with 1.6M parameters) worked fine.
When I run the model with Sentis it doesn't give any error/warning, but it also doesn't give the same result as running in PyTorch. It seems like I'm just getting a random noise pattern.
When I select the model in the Assets folder, no errors or warnings show up in the Inspector tab.
I'm using Unity 2022.2.20 and Sentis 1.1.0-exp.2.
Thanks for letting us know. Could you post a link to the model, please?
Thank you for the reply.
I really would like to share the model, but it was prepared inside the company, so it's hard for me to do so.
I know it's difficult for you to troubleshoot without looking at the actual model, but it would be great if you happened to have any insights on running UNETR or ViT-based models with Sentis.
It would also be great if you could suggest some troubleshooting steps for when a model doesn't give the same result as in PyTorch.
The model didn't run with Barracuda because the "Einsum" operator was not supported.
I switched to Sentis, and now the model runs without any errors but doesn't give the expected output.
We do have ViT_ade20k_seg_tiny in our test suite.
Typically our debugging process goes as follows:
- insert a temporary output midway through your model
- run inference in Python, saving the input and output
- run inference in Sentis with the saved input and validate the output
- binary-search up/down the model until you hit the layer that produces a wrong result in Sentis
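The Python half of the steps above can be sketched as follows. This is a minimal illustration, not the official workflow: the tiny `nn.Sequential` network, the hooked layer, and all file names are placeholders to swap for your own model and the layer you are bisecting on. A forward hook captures the mid-way activation that serves as the temporary output.

```python
# Sketch: run inference in PyTorch, save the input, a mid-way
# activation, and the final output as .npy files for comparison
# against the same tensors produced by Sentis.
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)

# Placeholder network; substitute your real model here.
model = nn.Sequential(
    nn.Conv2d(1, 4, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(4, 1, kernel_size=3, padding=1),
)
model.eval()

# Capture the output of a mid-way layer with a forward hook;
# this is the "temporary output" to compare against Sentis.
captured = {}
def save_activation(module, inputs, output):
    captured["mid"] = output.detach().numpy()

model[0].register_forward_hook(save_activation)

x = torch.randn(1, 1, 8, 8)
with torch.no_grad():
    y = model(x)

# Save reference tensors for the Sentis side of the comparison.
np.save("input.npy", x.numpy())
np.save("mid.npy", captured["mid"])
np.save("output.npy", y.numpy())

# Later, after exporting the corresponding tensors from Unity
# (file name is hypothetical), validate with a tolerance:
# assert np.allclose(np.load("sentis_mid.npy"), np.load("mid.npy"), atol=1e-4)
```

Moving the hooked layer up or down the network is the binary-search step: the first layer whose saved activation disagrees with Sentis is where the bug lives.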
You can also try disabling optimizations in the UI to see if that helps.
Thank you for the suggestions!
I’ll definitely try those things.
Just so you know, I updated the Sentis package to 1.1.1-exp.2, and now the model is working as expected.
Thank you for whatever update you did!
That’s great to hear. Thanks for updating and trying out 1.1.1.