Texture Painting performance

I’m developing a virtual whiteboard in VR. You can see an example of what I’m trying to do here:

It works great, but if I draw a line too quickly I end up with a dotted line rather than a solid one. I’m using SetPixels and Apply(), which I’m told is very slow. However, the framerate seems buttery smooth; if Apply() really were the issue, wouldn’t I see low framerates? I’m using raycasting to get the coordinates the pen should draw to on the whiteboard.
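
For reference, my drawing code is essentially something like this (simplified; penTip, boardTexture, brushSize and brushPixels stand in for my actual fields):

void Update()
{
    RaycastHit hit;
    // Cast from the pen tip onto the board (the board has a MeshCollider)
    if (Physics.Raycast(penTip.position, penTip.forward, out hit))
    {
        Vector2 uv = hit.textureCoord;
        int x = (int)(uv.x * boardTexture.width);
        int y = (int)(uv.y * boardTexture.height);
        // Stamp a block of pixels, then re-upload the texture (bounds checks omitted)
        boardTexture.SetPixels(x, y, brushSize, brushSize, brushPixels);
        boardTexture.Apply();
    }
}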

An alternative I heard of is using a RenderTexture and shaders. I have a very basic understanding of shaders and Render Textures but I can’t seem to work out how to avoid using Apply(). I’ve gone through so many posts and tutorials that say using Render Textures and a shader is the way but none of them actually explain how to do it. A detailed explanation or even a minimal demo project would be extremely appreciated.

Nobody knows? Or is my explanation bad?

The key bit of understanding for using render textures is the blit function.

In short: you create a render texture, set it as the texture on your surface, then “draw” to it by calling Graphics.Blit on your render texture with a custom shader, and you’re done.

Obviously that’s greatly simplifying it, but that should get you looking in the right direction.

The reason people say to avoid SetPixels and Apply is that Apply has to copy the entire texture from the CPU to the GPU, whereas with a render texture and blit you can send only the minimal data necessary: in your case a position on the texture, the color, and the radius to draw. You would then write a shader that draws a circle from that data.
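
Very roughly, the CPU side of that ends up looking something like this (just a sketch; the names and shader properties are placeholders):

using UnityEngine;

public class BoardPainter : MonoBehaviour
{
    public Renderer board;       // the whiteboard surface
    public Material brushMat;    // material using your custom brush shader
    RenderTexture boardRT;

    void Start()
    {
        // One persistent render texture, created once and reused every frame
        boardRT = new RenderTexture(2048, 2048, 0);
        Graphics.Blit(Texture2D.whiteTexture, boardRT);   // start with a blank white board
        board.material.mainTexture = boardRT;
    }

    public void Stamp(Vector2 uv, Color color)
    {
        // Only a few floats cross from the CPU to the GPU here, not the whole texture
        brushMat.SetColor("_penColor", color);
        brushMat.SetFloat("_penX", uv.x);
        brushMat.SetFloat("_penY", uv.y);
        Graphics.Blit(Texture2D.whiteTexture, boardRT, brushMat);
    }
}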

There are a number of examples of using blit to draw things, as well as circle shaders, on this forum and elsewhere on the internet, so I suggest you keep looking. If you’ve made a pass at it and are still having problems, come back.

Honestly, if you’re not seeing any issues with SetPixels and Apply for your use case, you’re probably fine continuing to use that.


Thanks for the reply.

The reason I want to improve the performance is that I need a much higher resolution than I’m currently using, but when I step up the resolution with the SetPixels/Apply method the frame rate becomes poor.

I’ve already been experimenting with what you suggested, and I have the shader drawing the blitted texture at the pen’s location. The problem is the brush texture just moves around wherever I point the pen; the previous stamps are cleared out.

I’ll try to get a video to make it clearer, but essentially I need to keep the previous frame’s contents and keep stamping the new texture onto it in the current frame. Right now I draw the blitted texture, then it appears to be erased, and I draw another, which results in a sort of laser-pointer behaviour rather than an actual pen that leaves a trail of ink.

You should be creating a single render texture and reusing it over and over. If you’re creating a render texture each frame or using a temporary render texture those will be cleared each frame.

If you’re doing that, it might be a problem with your shader. You’ll want your shader to use alpha blending (Blend SrcAlpha OneMinusSrcAlpha). Also, depending on your shader, the source texture when drawing the line should probably be a dummy texture: the shader should ignore _MainTex, or maybe it can be the brush shape, but it should not be the render texture itself. I suggest using Texture2D.whiteTexture as the source.
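
In other words, the blit call itself can be as simple as this (boardRT and brushMat being your render texture and brush material):

// Dummy source; the shader itself decides what to draw into the render texture
Graphics.Blit(Texture2D.whiteTexture, boardRT, brushMat);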

If you can post some snippets of the code you’re using we can help more.

I’ll try that and if I have no luck I’ll paste some code. Thanks… and I am only using a single render texture.

I’ve done away with the brush texture, so I’m no longer using Graphics.Blit; I’m doing this in the shader instead, but it’s still not leaving a trail:

_MainTex actually is a RenderTexture

I should be able to programmatically draw the brush like this without the need for blitting, right?

Here is what it looks like visually:
http://puu.sh/pwmxx/45d92ff30b.png

If I blit a checker texture to the render texture and switch to a standard shader I can see the checkerboard, but when I switch to my shader I don’t see any part of the render texture’s contents at all.

Were you using a surface shader when you were using blit before? You cannot use a surface shader with blit and get expected results; you want as “dumb” a shader as possible, so start with an unlit shader instead of a surface shader. A surface shader can be used for rendering the resulting board in the game view (with the render texture as its albedo), but not for the blit.

In the case of using a render texture as the main texture of a surface shader and not seeing anything stored from frame to frame, that is entirely expected, because nothing is ever written back to the render texture. It is not possible to read from and write to the same texture on GPUs, and a shader that’s rendering something to the screen is rendering to the screen’s render buffer, not to the render texture being passed to it.

You have to use blit, or you can take a far more circuitous path by setting a render texture as a camera’s target and drawing things in front of that camera, or you can keep using SetPixel.
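
For reference, the camera route is roughly this (a sketch; paintCamera is a spare camera you dedicate to the board, boardRT your render texture):

// paintCamera renders only the brush quads, straight into the board's texture
paintCamera.targetTexture = boardRT;
paintCamera.clearFlags = CameraClearFlags.Nothing;   // never clear, so old ink stays
// ...then spawn brush quads in front of this camera instead of calling blit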

So I’d need to convert this to a fragment shader right?

If you’re using blit, yes.

And I have to make sure my render texture isn’t being displayed anywhere in my scene? So it shouldn’t appear on any material at all, except as an input to the shader?

Is there any chance you could produce a minimal project for me? I feel like if I miss the tiniest thing it won’t work and I could go on trying to describe what I have here but not very effectively.

I’ve converted to a fragment shader and I’m getting the exact same results as before so there must be something I’m doing fundamentally wrong. It just has the laser pointer rather than pen behaviour.

Here’s the fragment shader:

Shader "fragshader"
{
    Properties
    {
        _penColor("Color", Color) = (1,1,1,1)
        _MainTex ("Texture", 2D) = "white" {}
        _penX("Pen X", Range(0,1)) = 0.0
        _penY("Pen Y", Range(0,1)) = 0.0
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 100
        Blend SrcAlpha OneMinusSrcAlpha
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            // make fog work
            #pragma multi_compile_fog
         
            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                UNITY_FOG_COORDS(1)
                float4 vertex : SV_POSITION;
            };

            sampler2D _MainTex;
            float4 _MainTex_ST;

            fixed4 _penColor;
            float _penX;
            float _penY;
         
            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                UNITY_TRANSFER_FOG(o,o.vertex);
                return o;
            }
         
            fixed4 frag (v2f i) : SV_Target
            {
                float x = i.uv.x;
                float y = i.uv.y;

                float penSize = 0.01f;
                float halfPen = penSize / 2;

                if (x > (_penX - halfPen) && x < _penX + halfPen && y >(_penY - halfPen) && y < (_penY + halfPen)) {
                    return _penColor;
                }
                else {
                    return tex2D(_MainTex, i.uv);
                }

            }
            ENDCG
        }
    }
}

No. Rendering consists of many hundreds, sometimes hundreds of thousands, of steps. Running the fragment portion of a shader for a single pixel is one of those steps, and during that step the same render texture cannot both be read from and written to as the output of the shader.

The fragment shader part of your blit shader should never return the main texture; wherever there’s no ink to add it should just return a color with alpha set to zero. The blit is a very direct way to render something into a render buffer; in this case it’s more like drawing a transparent texture into the scene. The main difference between this and, say, a particle effect is that the pen point is drawn procedurally in the shader rather than sampled from a texture, and the target isn’t a camera’s view but a render texture. Also, because it’s transparent and the render texture is kept from frame to frame rather than cleared (which is what cameras do by default), the ink builds up over time.

Shader "Pen Blit"
{
    Properties
    {
        _penColor("Color", Color) = (1,1,1,1)
        _penX("Pen X", Range(0,1)) = 0.0
        _penY("Pen Y", Range(0,1)) = 0.0
    }
    SubShader
    {
        Blend SrcAlpha OneMinusSrcAlpha
        // No culling or depth
        Cull Off ZWrite Off ZTest Always

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
          
            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
                o.uv = v.uv;
                return o;
            }

            fixed4 _penColor;
            float _penX;
            float _penY;

            fixed4 frag (v2f i) : SV_Target
            {
                float x = i.uv.x;
                float y = i.uv.y;

                float penSize = 0.01f;
                float halfPen = penSize / 2;

                if (x > (_penX - halfPen) && x < _penX + halfPen && y >(_penY - halfPen) && y < (_penY + halfPen)) {
                    return _penColor;
                }

                return fixed4(_penColor.rgb, 0.0);
            }
            ENDCG
        }
    }
}

Note this shader doesn’t even care whether _MainTex exists; it’s completely ignored.


Going back to another issue from your original post: the dotted line. I assume you’re sampling the pen position in Update. For desktop VR that’s usually 90 Hz, so if you move your hand further than the radius of the pen point within ~11 ms you’ll see gaps. You can work around that either by sampling in FixedUpdate with the fixed timestep set ridiculously high, or by detecting when the pen has moved far enough since the previous frame to leave a gap and drawing extra points along the line between the two positions, as in the sketch below. You could even write a shader that takes two points and draws a straight line between them.
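
A sketch of the fill-in approach, reusing the blit setup from earlier (lastUV, penRadiusUV, boardRT and brushMat are all placeholder names):

// Stamp along the segment from last frame's pen position to this frame's,
// spacing stamps by roughly the pen radius (in UV space) so no gaps show
Vector2 delta = uv - lastUV;
int stamps = Mathf.Max(1, Mathf.CeilToInt(delta.magnitude / penRadiusUV));
for (int i = 1; i <= stamps; i++)
{
    Vector2 p = Vector2.Lerp(lastUV, uv, (float)i / stamps);
    brushMat.SetFloat("_penX", p.x);
    brushMat.SetFloat("_penY", p.y);
    Graphics.Blit(Texture2D.whiteTexture, boardRT, brushMat);
}
lastUV = uv;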

Yeah, eventually I fixed the dotted-line issue with linear interpolation between frames, but now I want to improve the frame rate. I applied the shader from your post to a plane, and I can see a white square on it, but none of the cameras seem to be able to see it and it still doesn’t leave a trail when I change X/Y.

I’m guessing this is supposed to be wired up to a rendertexture somehow but I can’t figure it out.

That shader shouldn’t be used on a plane, it should be used with a blit command.

Graphics.Blit takes a texture and a render texture, and I’m not sure how I would blit the output of this shader anywhere. The shader isn’t using a texture or a render texture right now, so how do I blit its output somewhere?

One form of Blit takes a source texture and a destination render texture. Another form takes a source texture, a destination render texture, and a material; that’s the one you need to use. Make a material from that shader and use it with Blit, along with your render texture.
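
Something along these lines (a sketch assuming the “Pen Blit” shader above and a persistent render texture):

Material brushMat;
RenderTexture boardRT;   // your persistent board texture

void Start()
{
    brushMat = new Material(Shader.Find("Pen Blit"));
}

void Draw(Vector2 uv)
{
    brushMat.SetColor("_penColor", Color.black);
    brushMat.SetFloat("_penX", uv.x);
    brushMat.SetFloat("_penY", uv.y);
    // The source texture is just a dummy; the shader ignores _MainTex entirely
    Graphics.Blit(Texture2D.whiteTexture, boardRT, brushMat);
}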