Hi Everyone,
I’m looking for a way to have a user take a headshot picture, let’s say on a white/black/green background, key out the background, and then use the headshot with alpha on a player texture (head only). I’d be interested if anyone has thoughts on how to even approach this - through a realtime alpha shader? Through the greyscale-as-alpha texture importer (i.e. maybe save a picture to the device and then read it back in as a texture)? Or maybe as a pre-defined alpha that I can merge with the new texture? Something else? I’d be interested in any thoughts…
If anyone is interested in this, I’m getting closer… It currently works, but the textures are building up in memory and will eventually crash the app. For some reason the Destroy call I have on the main texture isn’t working… I’m still working through the profiler looking for the problem…
Anyway, what I ended up doing was using a masked texture shader I found on the wiki: http://www.unifycommunity.com/wiki/index.php?title=TextureMask
It has a main texture and then an alpha cutout texture, which works great for predefining the alpha mask. When the saved image is loaded back in, it only goes into the main texture, and the cutout can be pre-made.
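If it helps, hooking the pre-made cutout up to that shader is just a SetTexture call - note the mask property name ("_Mask") is a guess on my part, so check the shader source for the actual name:

#pragma strict

// Rough sketch: assign a pre-made alpha cutout to the TextureMask shader's mask slot.
// "_Mask" is an assumed property name - check the wiki shader for the real one.
var headMask : Texture2D;   // pre-defined head-shaped cutout texture

function Start () {
    renderer.material.SetTexture("_Mask", headMask);
}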
So here’s the code I’m using. This assumes you already have your iPhone camera rendering to a RenderTexture.
It saves a still of the render texture to the /Documents folder on your device (inside the application folder), and then loads it back in as a Texture2D that can be mapped onto a GameObject of some sort…
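If you don’t have the camera feed on screen yet, one rough way to get a live preview is a WebCamTexture on a quad (just a sketch - a plugin or RenderTexture setup works too, and it’s not necessarily what the scripts below were written against):

#pragma strict

// Rough sketch only: puts the device camera feed on a preview quad so ReadPixels
// has something on screen to grab.
var previewQuad : GameObject;
private var camTex : WebCamTexture;

function Start () {
    camTex = new WebCamTexture();
    previewQuad.renderer.material.mainTexture = camTex;
    camTex.Play();
}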
#pragma strict
// Saves a screenshot as a PNG file.
import System.IO;

function Update () {
    if (iPhoneInput.touchCount > 0)
    // if (Input.GetKeyDown ("space"))
    {
        UploadPNG ();
    }
}

function UploadPNG () {
    // We should only read the screen buffer after rendering is complete
    yield WaitForEndOfFrame();

    // Create a texture for the screen size. In my case I have a preview area on the left (200 pixels)
    var width = Screen.width - 200;
    var height = Screen.height;
    var tex = new Texture2D (width, height, TextureFormat.RGB24, false);

    // Read the screen contents into the texture
    tex.ReadPixels (Rect(200, 0, width, height), 0, 0);
    tex.Apply ();

    // Encode the texture into PNG and release it
    var bytes = tex.EncodeToPNG();
    Destroy (tex);

    // Write the PNG to the persistent data folder
    File.WriteAllBytes(Application.persistentDataPath + "/screenshot.png", bytes);
}
Then load it back in as a texture:
#pragma strict

var cube : GameObject;

function Update () {
    if (iPhoneInput.touchCount > 0)
        LoadScreenShot();
}

function LoadScreenShot () {
    // Destroy the previous texture so copies don't pile up in memory
    Destroy (cube.renderer.material.mainTexture);

    // Load the saved PNG back in through a local file URL
    var fileName = Application.persistentDataPath + "/screenshot.png";
    var www : WWW = new WWW("file://" + fileName);
    yield www;

    cube.renderer.material.mainTexture = www.texture;
    www.Dispose();
    www = null;
}
Fixed it. Just needed a Resources.UnloadUnusedAssets() call at the end of the LoadScreenShot code above… running stable now.
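For reference, the end of LoadScreenShot with that call added looks like this:

function LoadScreenShot () {
    Destroy (cube.renderer.material.mainTexture);
    var fileName = Application.persistentDataPath + "/screenshot.png";
    var www : WWW = new WWW("file://" + fileName);
    yield www;
    cube.renderer.material.mainTexture = www.texture;
    www.Dispose();
    www = null;
    // Frees the textures orphaned by Destroy()/Dispose() so memory stays flat
    Resources.UnloadUnusedAssets();
}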