Rendering A Camera at Half Res

I already had this working, but with the upgrade to Unity 5, this is now broken.

The idea is to render my GUI at retina resolution, but the 3D world behind it at half resolution.

This was fairly easy to do:

  1. change the camera’s viewport so the scene renders into the bottom-left quarter of the buffer
  2. use post processing to stretch that small image to the full screen size when blitting.

The script on the camera:

public float renderScale = 1.0f;  // 0.5 = render the scene at half resolution
private Material cameraMaterial;  // material using the upscaling shader below

void OnPreRender() {
    cameraMaterial.SetFloat("_InvRenderScale", renderScale);
    // Shrink the viewport so the scene renders into the bottom-left corner.
    GL.Viewport(new Rect(0, 0, Screen.width * renderScale,
                                Screen.height * renderScale));
}

void OnRenderImage(RenderTexture source, RenderTexture destination) {
    // Stretch the small image back up to full screen.
    Graphics.Blit(source, destination, cameraMaterial);
}

void OnPostRender() {
    // Restore the full-screen viewport.
    GL.Viewport(new Rect(0, 0, Screen.width, Screen.height));
}

And the relevant bits of the shader for cameraMaterial:

half _InvRenderScale;  // the render scale (e.g. 0.5) used to shrink the UVs

v2f vert(appdata_base v)
{
    v2f o;
    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
    // Scale the UVs so the full-screen quad samples only the
    // bottom-left corner that the scene was rendered into.
    o.uv = half2(v.texcoord.x * _InvRenderScale,
                 v.texcoord.y * _InvRenderScale);
    return o;
}

half4 frag (v2f i) : COLOR {
    return tex2D(_MainTex, i.uv);
}

The problem is that Unity doesn’t respect my call to GL.Viewport() in OnPreRender().

So instead of rendering the scene to the bottom-left 1/4 of the screen and then stretching it, Unity renders the scene full screen and then stretches the bottom-left 1/4 to full screen.

How can I work around this issue?
And is this a bug, or by design?

There is something like render quality in Dota 2 or something… if you mean that, I would like to be able to degrade the camera’s rendering resolution. It would be great to have this as an easy-to-set-up feature.

I submitted a bug for this.
If anyone else knows a workaround for GL.Viewport not working, it would be much appreciated.
Again, I can’t use Camera.rect, because it applies a clipping rectangle that prevents the upscaled image from showing outside the bottom-left 1/4 of the screen.

This would give a serious boost to performance. There is also an image effect that makes everything look “pixelated”, like low-res old games. It would be easier to just scale down the game resolution, but the camera would still render at high res… serious issue.

Why do you not use RenderTextures?

I finally solved this. There were many problems involved in the workaround.

Basically, I had to add a second orthographic camera that looks at a plane, and that plane shows a render texture which is the target of the main camera. The ortho camera’s depth is set so it renders after the main camera: the main camera renders the scene into the render texture, and then the orthographic camera renders the plane with that texture on it.
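A minimal sketch of that two-camera arrangement, assuming references to the two cameras and the plane are assigned in the inspector (all field and class names here are illustrative, not from the original post):

```csharp
using UnityEngine;

// Sketch: the main camera renders the scene into a render texture, and a
// second orthographic camera renders a plane textured with that render texture.
public class HalfResSetup : MonoBehaviour {
    public Camera mainCamera;       // renders the 3D scene
    public Camera orthoCamera;      // renders only the fullscreen plane
    public Renderer planeRenderer;  // plane placed in front of the ortho camera

    void Start() {
        // Screen-sized render texture that the main camera draws into.
        var rt = new RenderTexture(Screen.width, Screen.height, 24);
        mainCamera.targetTexture = rt;

        orthoCamera.orthographic = true;
        orthoCamera.depth = mainCamera.depth + 1;  // render after the main camera

        planeRenderer.material.mainTexture = rt;   // show the main camera's output
    }
}
```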

At first, I just used a render texture that was 1/4 the size of the screen, but that caused problems: Camera.WorldToScreenPoint() was giving me points that mapped to the bottom-left 1/4 of the screen, because they were scaled for the viewport of the render texture. So I made the render texture the same size as the screen and rendered to the bottom left of it, like in my original solution. At this point, GL.Viewport was still no help for rendering to the bottom left of the render texture, but since I now had a second ortho camera rendering a plane instead of post processing on the main camera, Camera.rect was fine for this purpose: its clip rectangle only affected the main camera, not the secondary ortho one. So I applied the same shader I had previously used for post processing to the plane in front of the ortho camera, which upscales the bottom-left 1/4 of the render texture to the full plane.
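The Camera.rect part of this can be sketched as follows, assuming `mainCam` already targets a screen-sized render texture and `planeMaterial` is the upscaling material on the ortho camera’s plane (names are illustrative):

```csharp
using UnityEngine;

// Sketch: confine the main camera to the bottom-left quarter of its
// full-size render texture via Camera.rect, and tell the plane's shader
// how much to scale the UVs back up.
public class HalfResRect : MonoBehaviour {
    public Camera mainCam;          // renders into a screen-sized RenderTexture
    public Material planeMaterial;  // upscaling shader on the ortho camera's plane
    public float renderScale = 0.5f;

    void Start() {
        // Only clips the main camera; the ortho camera still sees the full plane.
        mainCam.rect = new Rect(0f, 0f, renderScale, renderScale);
        planeMaterial.SetFloat("_InvRenderScale", renderScale);
    }
}
```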

Of course, layers had to be set up so the plane would not render on the main camera and the scene would not render on the ortho camera. Also, I set clearFlags to none for the secondary camera; it didn’t seem to work right otherwise.
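That layer and clear-flag configuration could look something like this (the layer name and class are hypothetical, put the plane on whatever dedicated layer you like):

```csharp
using UnityEngine;

// Sketch: the plane lives on its own layer so the main camera never draws it
// and the ortho camera draws nothing else.
public class HalfResLayers : MonoBehaviour {
    public Camera mainCam;
    public Camera orthoCam;

    void Start() {
        int planeLayer = LayerMask.NameToLayer("UpscalePlane"); // hypothetical layer name
        mainCam.cullingMask = ~(1 << planeLayer);   // everything except the plane
        orthoCam.cullingMask = 1 << planeLayer;     // only the plane
        orthoCam.clearFlags = CameraClearFlags.Nothing; // don't clear over the main render
    }
}
```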
