Hi there,
In the newly updated BlazePose example, affine transformations are performed via a compute shader: unity/sentis-blaze-pose · Hugging Face
This transforms a Texture2D into a Tensor after applying the affine transformation. Could you please provide a function, or help me understand how to reverse this process? How do we compute the inverse matrix for the affine transformation, and how do we apply it? Is this possible via the same function?
Thanks in advance.
The affine transformation in this case is actually for sampling the texture given the tensor coordinate, so in a sense it is already the “inverse” affine transformation matrix if we think of it as
tensor = affine(texture, matrix)
if you want the inverse of a transformation matrix you can either
- run the steps that were used to create the matrix in reverse order, with each step inverted, e.g. Rotate(Translate(x), theta) becomes Translate(Rotate(-theta), -x)
- invert the final matrix using a closed-form formula such as the one in “Inverting a 3x2 Affine Transformation Matrix” by Nigel Tao, or use another method such as Gauss-Jordan elimination if the formula is numerically unstable.
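To make the second option concrete, here is a minimal, language-agnostic sketch in Python (not Sentis/BlazeUtils API) of the closed-form inverse of an affine matrix with rows [a, b, c] and [d, e, f], which maps (x, y) to (a*x + b*y + c, d*x + e*y + f):

```python
def invert_affine(m):
    # Closed-form inverse of an affine matrix [[a, b, c], [d, e, f]].
    (a, b, c), (d, e, f) = m
    det = a * e - b * d
    if det == 0:
        raise ValueError("matrix is not invertible")
    return [
        [ e / det, -b / det, (b * f - c * e) / det],
        [-d / det,  a / det, (c * d - a * f) / det],
    ]

def apply_affine(m, p):
    # Apply the affine map to a point (x, y).
    (a, b, c), (d, e, f) = m
    x, y = p
    return (a * x + b * y + c, d * x + e * y + f)
```

Applying `invert_affine(m)` to the output of `apply_affine(m, p)` returns `p` (up to floating-point error), which is an easy sanity check for your own implementation.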
Are you still trying to go from texture to tensor, or are you trying to write to a texture as the output? For the latter, you can first create a tensor and then use the TextureConverter.RenderToTexture method to draw your texture.
To prepare the input tensor for the Landmark Estimation model, we apply an affine transformation to the input texture, producing the tensor that is fed to the model. I now want to reverse this process: obtain a texture from such a tensor by applying the inverse affine transformation to it.
Theoretically, you would find the inverse of the following matrix M2 in the code:
var M2 = BlazeUtils.mul(
    BlazeUtils.mul(
        BlazeUtils.mul(
            BlazeUtils.TranslationMatrix(kp1_ImageSpace),
            BlazeUtils.ScaleMatrix(new float2(scale2, -scale2))),
        BlazeUtils.RotationMatrix(0.5f * Mathf.PI - theta)),
    BlazeUtils.TranslationMatrix(-origin2));
BlazeUtils.SampleImageAffine(texture, m_LandmarkerInput, M2);
I would then transform m_LandmarkerInput using the inverted M2 matrix to obtain a texture; that is my goal. Any help would be appreciated, @gilescoope, since I am not familiar with compute shaders, and BlazeUtils.SampleImageAffine uses a compute shader to apply this affine transformation.
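Since M2 is built as a product of translation, scale, and rotation steps, its inverse can be built by reversing the order and inverting each step. The sketch below shows this in plain Python; the translation/scale/rotation/mul helpers mirror what the BlazeUtils helpers appear to do with 2x3 affine matrices, but the exact conventions here are an assumption, not the example's actual code:

```python
import math

def translation(tx, ty):
    return [[1.0, 0.0, tx], [0.0, 1.0, ty]]

def scale(sx, sy):
    return [[sx, 0.0, 0.0], [0.0, sy, 0.0]]

def rotation(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0]]

def mul(m, n):
    # Compose 2x3 affine matrices: the result applies n first, then m.
    (a, b, c), (d, e, f) = m
    (g, h, i), (j, k, l) = n
    return [
        [a * g + b * j, a * h + b * k, a * i + b * l + c],
        [d * g + e * j, d * h + e * k, d * i + e * l + f],
    ]

def apply_point(m, p):
    (a, b, c), (d, e, f) = m
    x, y = p
    return (a * x + b * y + c, d * x + e * y + f)

def m2(kp1, s, theta, origin):
    # M2 = T(kp1) * S(s, -s) * R(pi/2 - theta) * T(-origin), as in the example.
    return mul(mul(mul(translation(*kp1), scale(s, -s)),
                   rotation(0.5 * math.pi - theta)),
               translation(-origin[0], -origin[1]))

def m2_inverse(kp1, s, theta, origin):
    # Reverse the order and invert each step:
    # M2^-1 = T(origin) * R(theta - pi/2) * S(1/s, -1/s) * T(-kp1)
    return mul(mul(mul(translation(*origin), rotation(theta - 0.5 * math.pi)),
                   scale(1.0 / s, -1.0 / s)),
               translation(-kp1[0], -kp1[1]))
```

Composing `m2_inverse(...)` with `m2(...)` for the same parameters gives the identity map, which is how you can check an inverse built this way before wiring it into a compute shader.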
If you want to get the transformed texture from the tensor, just use the TextureConverter.RenderToTexture method to render your tensor to a render texture; no transform is required.
You should only want to apply the inverse affine transform if you want the texture to be in its original coordinate system. Are you sure this is what you want? You already have that original texture in the example you mention. Your rendered texture would just look exactly like the original texture only with ugly scaling artifacts and potentially some areas blacked out due to the cropping. What would you need this for?
If you really do want this, then you can use either of the methods I mentioned above to get the inverted matrix. You can then either use this matrix to transform between two tensors with a compute shader and use the TextureTransform method to write to a texture, or write your own compute shader that writes to a texture directly.