Texture Array to tensor in Sentis for working with multiple Rendertextures

Previously in Barracuda it was possible to do this (pseudocode):

[SerializeField] private NNModel modelAsset;

private void Start()
{
    Model runtimeModel = ModelLoader.Load(modelAsset);
    IWorker worker = WorkerFactory.CreateWorker(WorkerFactory.Type.ComputePrecompiled, runtimeModel);

    Texture[] textureArray = new Texture[2];
    textureArray[0] = GetRenderTextureFromSomewhere();
    textureArray[1] = GetRenderTextureFromSomewhere();

    // Barracuda's Tensor had a constructor that took a Texture[] directly.
    Tensor input = new Tensor(textureArray);
    worker.Execute(input);
    // ...get output, etc.
}

Today I tried Sentis for the first time. After reading the documentation, I wasn't able to find anything similar.
What is the current way in Sentis to turn a texture array (of varying length) of RenderTextures into a tensor?


We removed that constructor, but you can still do it.
I'd refer you to two samples that show you how to manipulate tensors/textures:
TextureToTensor and ExecuteOperatorOnTensor.

var op = new GPUComputeOps(...);
var tensorArray = new Tensor[textureArray.Length];
for (int i = 0; i < textureArray.Length; i++)
    tensorArray[i] = TextureToTensor(textureArray[i], ...);
var input = op.Concat(tensorArray, axis: 1);
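For reference, here is a fuller sketch of how those pieces could fit together in a MonoBehaviour. This assumes a Sentis 1.x-style API (`TextureConverter.ToTensor`, `WorkerFactory.CreateOps`, `Ops.Concat`); the exact class and method names vary between Sentis versions, so treat it as a sketch rather than copy-paste code:

```csharp
using Unity.Sentis;
using UnityEngine;

public class MultiTextureInference : MonoBehaviour
{
    [SerializeField] private ModelAsset modelAsset;

    private IWorker worker;
    private Ops ops; // runs standalone tensor ops (Concat, etc.) on the GPU

    private void Start()
    {
        var model = ModelLoader.Load(modelAsset);
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, model);
        ops = WorkerFactory.CreateOps(BackendType.GPUCompute, null);
    }

    public void Run(Texture[] textures)
    {
        // Convert each texture to a (1, channels, height, width) tensor.
        var tensors = new Tensor[textures.Length];
        for (int i = 0; i < textures.Length; i++)
            tensors[i] = TextureConverter.ToTensor(textures[i]);

        // Concatenate along the channel axis, as in the snippet above.
        using var input = ops.Concat(tensors, 1);
        worker.Execute(input);
        // ...read the output via worker.PeekOutput().

        foreach (var t in tensors)
            t.Dispose();
    }

    private void OnDestroy()
    {
        worker?.Dispose();
        ops?.Dispose();
    }
}
```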

But how should I get the correct output?

I'm trying to run the MediaPipe Iris example. I want to predict two eye images in one inference, so I need to send 2 textures into the model at the same time.

When I concat the 2 images, the input shape changes from (1,3,64,64) to (1,6,64,64), but I still get the same output shape, (1,213). I thought it should be something like (2,213). Is there anything I should do for PeekOutput?
Thank you very much~

You need to concat on the batch axis (axis 0), not the channels axis, so the input becomes (2,3,64,64) instead of (1,6,64,64).
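Concretely: concatenating on axis 0 stacks the two (1,3,64,64) eye tensors into a (2,3,64,64) batch, and a model that supports a dynamic batch size will then return (2,213), one row per eye. A minimal sketch, assuming an `Ops` instance `ops` and an `IWorker` `worker` have already been created (the variable names and eye-texture getters here are illustrative, and the exact API again depends on your Sentis version):

```csharp
// Each eye crop converts to a (1, 3, 64, 64) tensor.
var left  = TextureConverter.ToTensor(leftEyeTexture);
var right = TextureConverter.ToTensor(rightEyeTexture);

// axis 0 stacks along the batch: (1,3,64,64) x 2 -> (2,3,64,64).
// (axis 1 would stack channels instead -> (1,6,64,64), which keeps the
// batch at 1 and therefore a single (1,213) output.)
using var input = ops.Concat(new Tensor[] { left, right }, 0);

worker.Execute(input);
var output = worker.PeekOutput() as TensorFloat; // expected shape: (2, 213)

left.Dispose();
right.Dispose();
```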