I’ve created a system to capture a sprite (or set of sprites) and then redraw that capture to the screen. NGUI is being used to manage the sprite rendering although I’m not sure it is important to the problem I am having.
The issue is that when the capture is drawn to the screen, blending seems to be incorrect in areas where a transparent image is drawn on top of an opaque one. The image below shows the issue.
The character on the left is drawn with NGUI straight to the screen; this is how it is supposed to look. The face sprite is drawn first, then the makeup on top, all blended together.
The character on the right is one single image, created with the capture system. The blending is incorrect: the image is semi-transparent wherever there is makeup on the face. As you can see from the image, the green box in the background is visible through the face.
Here is the process used to create the new image:
Render only the character to a render texture with a camera whose clear flags are set to a solid, fully transparent color.
Read the pixels of the render texture into a Texture2D so that they can be read from. This is now the source Texture2D.
Find the portion of the texture that the character is in and copy the pixels from the source Texture2D to the destination Texture2D. The destination Texture2D is a new atlas of images.
NOTE: Source and destination Texture2D objects have a transparent background.
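The steps above could be sketched roughly like this (a rough sketch, not the original project's code: `captureCam`, `atlasTexture`, and the region coordinates are placeholder names):

```csharp
// Hypothetical sketch of the capture pipeline described above.
// Assumes a Camera field "captureCam" and a Texture2D atlas "atlasTexture".
RenderTexture rt = new RenderTexture(512, 512, 24);
captureCam.targetTexture = rt;
captureCam.clearFlags = CameraClearFlags.SolidColor;
captureCam.backgroundColor = new Color(0f, 0f, 0f, 0f); // transparent clear color
captureCam.Render();

// Read the render texture back into a readable Texture2D.
RenderTexture.active = rt;
Texture2D source = new Texture2D(rt.width, rt.height, TextureFormat.ARGB32, false);
source.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
source.Apply();
RenderTexture.active = null;

// Copy the character's region from the source into the destination atlas.
// charX/charY/charW/charH and destX/destY are placeholders for the located region.
Color[] pixels = source.GetPixels(charX, charY, charW, charH);
atlasTexture.SetPixels(destX, destY, charW, charH, pixels);
atlasTexture.Apply();
```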
It seems as if the alpha value of the topmost sprite is being picked up, either at the RenderTexture or ReadPixels stage, rather than the blended value. Has anyone who has experienced this type of problem found a solution?
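For what it's worth, this symptom is consistent with the default sprite blend mode (Blend SrcAlpha OneMinusSrcAlpha) also being applied to the alpha channel when drawing into the render texture. A small illustration of the arithmetic in plain C# (not Unity API), using an opaque face under a 50%-transparent makeup sprite:

```csharp
float srcA = 0.5f; // semi-transparent makeup sprite
float dstA = 1.0f; // opaque face already in the render texture

// What "Blend SrcAlpha OneMinusSrcAlpha" writes into the alpha channel:
float naiveA = srcA * srcA + dstA * (1f - srcA); // 0.75 -> see-through hole in the capture

// The correct "over" compositing alpha:
float overA = srcA + dstA * (1f - srcA);         // 1.0 -> still fully opaque
```

If that is the cause, a shader with a separate blend factor for alpha (e.g. `Blend SrcAlpha OneMinusSrcAlpha, One OneMinusSrcAlpha` in ShaderLab) should write the correctly composited alpha into the render texture.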
I know this thread is kinda old and this case is very specific, but… I’m struggling with the same problem, without any result. Have you done any progress on this?
This is something that is really missing from Unity: combining textures together at runtime to make new images, i.e. simple texture blitting. You can use Texture2D and read/write pixels if you understand the RGB math, but it's slow and requires that all textures have Read/Write enabled. There should be an optimised built-in way to merge textures at runtime.
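For reference, the read/write-pixels approach mentioned above might look something like this sketch (assumptions: `baseTex` and `overlay` are same-sized Texture2D fields with Read/Write enabled; this is standard "over" compositing, not an official Unity utility):

```csharp
// Composite "overlay" on top of "baseTex" in place, using alpha-over blending.
Color[] basePx = baseTex.GetPixels();
Color[] overPx = overlay.GetPixels();
for (int i = 0; i < basePx.Length; i++)
{
    float a = overPx[i].a;
    Color c;
    c.r = overPx[i].r * a + basePx[i].r * (1f - a);
    c.g = overPx[i].g * a + basePx[i].g * (1f - a);
    c.b = overPx[i].b * a + basePx[i].b * (1f - a);
    c.a = a + basePx[i].a * (1f - a); // correct blended alpha, not the topmost sprite's
    basePx[i] = c;
}
baseTex.SetPixels(basePx);
baseTex.Apply();
```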
Hopefully this will help someone else out, though I don’t think this directly addresses the original problem.
I was seeing a very similar result using Unity 5.5.0f3 building to Android when using ReadPixels to create an image for sharing (think Crossy Road’s game over snapshot). In the following screenshot, if you look at the white framed picture in the top right you can see a faint rectangle behind the character:
To fix this problem I only needed to make 2 changes:
Remove Alpha channel from Texture2D creation
Apply the Texture2D directly to the RawImage called “screenshotPicture” (instead of using my “renderTexture”)
Here is the resulting code that removed the rectangle described above:
//Setup for textures
RenderTexture.active = renderTexture;
// Do not want alpha from screen, causes weird splotching issues on Android
Texture2D screenTexture = new Texture2D(sharingCam.targetTexture.width, sharingCam.targetTexture.height, TextureFormat.RGB24, false);
//Grabbing texture info
sharingCam.Render();
screenTexture.ReadPixels(sharingCam.pixelRect, 0, 0, false);
//Resetting + Applying
screenTexture.Apply();
RenderTexture.active = null;
//Storing for share
stored_screenShot = screenTexture;
sharingCam.enabled = false;
screenshotPicture.texture = stored_screenShot;
Notes:
My setup involves using 2 cameras, but it is possible to implement this with only 1. We are only using 2 cams to make the picture look funnier
This problem was not visible in Editor, and only seemed to happen on some Android devices. Did not test on iOS.
I had the same issue, even with the new ScreenCapture.CaptureScreenshotAsTexture() and ReadPixels methods; the result came out slightly transparent. However, when I saved the image to a file (as JPG or PNG) and loaded it again, it didn’t show any transparency. It’s not very performant, though. Hope it helps; still looking for a better solution:
public Image ScreenshotBG; // UnityEngine.UI.Image component
void TakeScreenShotInvoke()
{
StartCoroutine(takeScreenShot());
}
IEnumerator takeScreenShot()
{
Texture2D spriteTexture = new Texture2D(Screen.width, Screen.height); // LoadImage will resize this anyway
string filePath = Application.persistentDataPath + "/Screenshot.jpg";
ScreenCapture.CaptureScreenshot(filePath); // asynchronous: the file is written after the frame ends
Debug.Log("Screen shot at: " + filePath);
yield return new WaitForEndOfFrame(); // give the capture a chance to finish
byte[] fileData;
if (System.IO.File.Exists(filePath))
{
fileData = System.IO.File.ReadAllBytes(filePath);
spriteTexture.LoadImage(fileData);
yield return new WaitForEndOfFrame(); // waiting again :(
}
ScreenshotBG.gameObject.SetActive(true);
ScreenshotBG.sprite = Sprite.Create(spriteTexture, new Rect(0, 0, spriteTexture.width, spriteTexture.height), Vector2.zero);
}