ReadPixels requires an offset while in the Editor?

I'm trying to capture the screen to a `Texture2D` so I can draw some custom pixels over the image and then display the frozen image back to the player as a `GUITexture`. When testing my capture code, it seems the `captureRegion` requires an offset equal to the offset of the Game view within the Unity Editor, rather than simply `Rect(0, 0, Screen.width, Screen.height)`. So if I set the Game view to "iPad Tall (768x1024)", because of my monitor layout there is a buffer of pixels around the image, and that buffer is being captured. Repeated captures cause the image to shift up and to the right, with the dark grey Game view boundary repeating. Oddly, sometimes my Inspector even gets captured as a partially transparent image and overlaid on the `GUITexture`.

Am I doing something wrong? What should I do instead? Thanks!

    var captureRegion : Rect = Rect(0, 0, Screen.width, Screen.height);
    var captureScreen : boolean = false;
    var guiTxtr : GUITexture;
    private var txtr : Texture2D;

    function CaptureScreen()
    {
        captureScreen = false;
        guiTxtr.enabled = false;

        txtr = new Texture2D(Screen.width, Screen.height);
        txtr.filterMode = FilterMode.Point;
        txtr.ReadPixels(captureRegion, 0, 0);
        txtr.Apply();  // upload the read pixels to the GPU

        guiTxtr.enabled = true;
        guiTxtr.texture = txtr;
    }

I have also had this problem; it seems that it captures the entire Game window tab rather than the actual rendered view. I just resize my window to the fixed resolution, or set it to Free Aspect. This is probably worth a bug report, though.

[Side note] When taking screenshots, I would recommend using the full `Texture2D` constructor — as written, your texture has mipmaps enabled and an alpha channel, neither of which you need here:

txtr = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);

It could also be because you are not waiting for the end of the frame, so the camera has not actually finished rendering when you read the buffer.

Pop this in before `captureScreen = false;`:

// We should only read the screen buffer after rendering is complete
yield WaitForEndOfFrame();
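
Putting the pieces together, a minimal capture coroutine might look like this (a sketch in UnityScript — `guiTxtr` is assumed to be assigned in the Inspector, as in the original code):

    var guiTxtr : GUITexture;       // assigned in the Inspector
    private var txtr : Texture2D;

    function CaptureScreen()
    {
        // Only read the screen buffer after rendering is complete
        yield WaitForEndOfFrame();

        guiTxtr.enabled = false;

        // RGB24 with no mipmaps: cheaper, and avoids an unused alpha channel
        txtr = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        txtr.filterMode = FilterMode.Point;
        txtr.ReadPixels(Rect(0, 0, Screen.width, Screen.height), 0, 0);
        txtr.Apply();   // upload the pixels to the GPU

        guiTxtr.texture = txtr;
        guiTxtr.enabled = true;
    }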

You can read more in the `Texture2D.ReadPixels` scripting documentation.