Steps I’ve taken:
- Imported a simple white circle image with a transparent background
- Assigned the circle image to an Image Component (attached to an overlay canvas)
- Set the Image component to match the native size of the image
- Added a script to the Image component that does the following:
UIImageToRenderTexture.cs

using UnityEngine;
using UnityEngine.UI;

[ExecuteInEditMode]
[RequireComponent(typeof(Image))]
public class UIImageToRenderTexture : MonoBehaviour
{
    Image input;
    public RenderTexture output;

    void Start()
    {
        // Cache the Image component this script is attached to
        input = GetComponent<Image>();
    }

    void Update()
    {
        // Copy the sprite's source texture into the assigned RenderTexture every frame
        Graphics.Blit(input.sprite.texture, output);
    }
}
- I then created a new RenderTexture asset, giving it the same dimensions as the circle image, and assigned it to the “output” variable of the script on the component (a code sketch of that setup follows this list)
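For reference, here is roughly the same RenderTexture setup done from code instead of through the Inspector. This is only a sketch: the ARGB32 format and the CircleCaptureSetup/circleImage names are my own placeholders, not part of the actual project.

using UnityEngine;
using UnityEngine.UI;

public class CircleCaptureSetup : MonoBehaviour
{
    public Image circleImage;     // the Image showing the white circle sprite
    public RenderTexture output;  // created at runtime below

    void Awake()
    {
        // Create a RenderTexture matching the circle texture's dimensions.
        // ARGB32 keeps an alpha channel, which I assumed would preserve transparency.
        Texture spriteTex = circleImage.sprite.texture;
        output = new RenderTexture(spriteTex.width, spriteTex.height, 0, RenderTextureFormat.ARGB32);
        output.Create();

        // Hand the texture to the blit script sitting on the same Image.
        circleImage.GetComponent<UIImageToRenderTexture>().output = output;
    }
}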
The expected result was for the contents of the RenderTexture asset to exactly match the Image component, so I could modify it further in a material/shader.
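To be clear about what “further modification” means, the eventual usage would look something like the sketch below; the “_MainTex” property name and the ApplyCaptureToMaterial script are placeholders for whatever shader/material I end up using.

using UnityEngine;

public class ApplyCaptureToMaterial : MonoBehaviour
{
    public RenderTexture circleCapture;  // the RenderTexture filled by UIImageToRenderTexture
    public Material targetMaterial;      // material whose shader samples the capture

    void Update()
    {
        // Feed the captured UI image into the shader as an ordinary texture.
        targetMaterial.SetTexture("_MainTex", circleCapture);
    }
}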
But instead, what I got was this:

It’s supposed to be circular, but it’s more like a zoomed-in octagon.
And the background isn’t transparent; it’s black…
What am I doing wrong?
Note that I’m doing this because I want to make use of the Filled Radial modes of the Image component for further modification within a material/shader.
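Concretely, by “Filled Radial modes” I mean driving the Image roughly like this, and then wanting the resulting partially filled circle available to a shader (the script name and the animated fillAmount are just an illustration):

using UnityEngine;
using UnityEngine.UI;

public class RadialFillExample : MonoBehaviour
{
    public Image circleImage;

    void Update()
    {
        // Use the Image component's radial fill, which is what I want captured.
        circleImage.type = Image.Type.Filled;
        circleImage.fillMethod = Image.FillMethod.Radial360;
        circleImage.fillOrigin = (int)Image.Origin360.Top;
        circleImage.fillAmount = Mathf.PingPong(Time.time, 1f);  // animate 0..1 as a demo
    }
}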