I’m developing a virtual whiteboard in VR. You can see an example of what I’m trying to do here:
It works great, but if I draw a line too quickly I end up with a dotted line rather than a solid line. I’m using SetPixels and Apply(), which I’m told is very slow - however the framerate seems to be buttery smooth. If Apply() really were the issue, wouldn’t I see low framerates? I’m using raycasting to get the coordinates the pen should draw to on the whiteboard.
An alternative I heard of is using a RenderTexture and shaders. I have a very basic understanding of shaders and Render Textures but I can’t seem to work out how to avoid using Apply(). I’ve gone through so many posts and tutorials that say using Render Textures and a shader is the way but none of them actually explain how to do it. A detailed explanation or even a minimal demo project would be extremely appreciated.
The key bit of understanding for using render textures is the blit function.
In short: you create a render texture, set it as the texture on your surface, then “draw” to it by calling Graphics.Blit on the render texture with a custom shader, and you’re done.
Obviously that’s greatly simplifying it, but that should get you looking in the right direction.
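Very roughly, the C# side might look like this - a minimal sketch, with all the names my own, assuming your raycast gives you the UV coordinate on the board (RaycastHit.textureCoord) and a brush material built from a custom circle shader like the one sketched further down:

```csharp
using UnityEngine;

public class WhiteboardCanvas : MonoBehaviour
{
    public Renderer board;          // the whiteboard surface in the scene
    public Material brushMaterial;  // material built from the custom brush shader
    RenderTexture canvas;

    void Awake()
    {
        // Create one render texture up front and keep reusing it.
        canvas = new RenderTexture(2048, 2048, 0);
        canvas.Create();

        // The board's own material simply displays the render texture.
        board.material.mainTexture = canvas;
    }

    // Call this with the UV coordinate your raycast returned.
    public void Stamp(Vector2 uv)
    {
        brushMaterial.SetVector("_BrushPos", uv);
        Graphics.Blit(Texture2D.whiteTexture, canvas, brushMaterial);
    }
}
```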
The reason people say to avoid SetPixels and Apply is that Apply has to copy the entire texture from the CPU side to the GPU, whereas with a render texture and blit you can send only the minimal data necessary - in your case a position on the texture, the color, and the radius to draw. You would then have to write a shader that can draw a circle from that data.
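For illustration, here’s a minimal sketch of such a shader for the built-in render pipeline - the property names are my own invention, and you’d set them from C# before each blit:

```shaderlab
Shader "Unlit/BrushStamp"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}            // ignored; blit just needs a source
        _BrushColor ("Brush Color", Color) = (0,0,0,1)
        _BrushPos ("Brush Position (UV)", Vector) = (0.5,0.5,0,0)
        _BrushRadius ("Brush Radius (UV)", Float) = 0.01
    }
    SubShader
    {
        // Composite the stamp over whatever is already in the render texture.
        Blend SrcAlpha OneMinusSrcAlpha
        Cull Off ZWrite Off ZTest Always

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            float4 _BrushColor;
            float4 _BrushPos;
            float _BrushRadius;

            struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
            struct v2f { float2 uv : TEXCOORD0; float4 vertex : SV_POSITION; };

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // Inside the brush radius: brush color. Outside: alpha 0,
                // so blending leaves the existing canvas pixels untouched.
                float d = distance(i.uv, _BrushPos.xy);
                float inside = step(d, _BrushRadius);
                return fixed4(_BrushColor.rgb, _BrushColor.a * inside);
            }
            ENDCG
        }
    }
}
```

The Blend line is what lets each stamp composite over what’s already on the canvas instead of wiping the whole texture.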
There are a number of examples on using blit to draw stuff as well as circle shaders on this forum and elsewhere on the internet so I suggest you keep looking. If you’re having problems getting something working once you’ve made a pass on it come back.
Honestly if for your use you’re not seeing any issues with SetPixels and Apply you’re probably fine continuing to use that.
The reason I want to improve the performance is that I need a much higher resolution than I’m currently using, but when I step up the resolution with the SetPixels/Apply method the frame rate becomes poor.
I’ve already been experimenting with what you suggested, and I have the shader drawing the blitted texture at the pen’s location. The problem is the brush texture just moves around wherever I point the pen; the previous stamps are cleared out.
I’ll try to get a video to make it clearer, but essentially I need to keep the previous frame and keep stamping the new texture onto it in the current frame. Right now I draw the blitted texture, then it appears to be erased, and I draw another, which results in a sort of laser pen behavior rather than an actual pen, which would leave a trail of ink.
You should be creating a single render texture and reusing it over and over. If you’re creating a render texture each frame or using a temporary render texture those will be cleared each frame.
If you’re doing that, it might be a problem with your shader. You’ll want your shader to use an alpha blend (Blend SrcAlpha OneMinusSrcAlpha). Also, depending on your shader, you probably want the source texture when drawing the line to be a dummy texture - the shader should ignore _MainTex, or maybe _MainTex can be the brush shape - but it should not be the render texture itself. I suggest using Texture2D.whiteTexture as the source.
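In other words, something like this, reusing the placeholder names from the earlier sketch:

```csharp
// Wrong: the render texture must not be its own blit source.
// Graphics.Blit(canvas, canvas, brushMaterial);

// Right: a dummy source the brush shader ignores anyway.
Graphics.Blit(Texture2D.whiteTexture, canvas, brushMaterial);
```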
If you can post some snippets of the code you’re using we can help more.
If I blit a checker texture to the render texture and switch to a standard shader I can see the checkerboard, but when I switch to my shader I don’t see any part of the render texture’s contents at all.
Were you using a surface shader when you were using blit before? You cannot use a surface shader with blit and get expected results; you want as “dumb” a shader as possible, so start with an unlit shader instead of a surface shader. A surface shader can be used for rendering the resulting board in the game view (with the render texture as its albedo), but not for the blit.
In the case of using a render texture as the main texture of a surface shader and not seeing anything stored from frame to frame, that is entirely expected, because nothing is ever writing to the render texture. It is not possible to read from and write to the same texture on GPUs, and a shader that’s rendering something to the screen is rendering to the screen’s render buffer, not to the render texture being passed to it.
You have to use blit, or you can take a far more circuitous path by setting a render texture as a camera’s target and drawing things in front of that camera, or you can use SetPixels.
And I have to make sure my render texture isn’t being displayed anywhere in my scene? So it should not appear on any material, except as an input to the shader?
Is there any chance you could produce a minimal project for me? I feel like if I miss the tiniest thing it won’t work and I could go on trying to describe what I have here but not very effectively.
I’ve converted to a fragment shader and I’m getting the exact same results as before, so there must be something I’m doing fundamentally wrong. It still gives the laser pointer rather than the pen behaviour.
No. Rendering is many, many hundreds, sometimes hundreds of thousands of steps. Running the fragment portion of a shader for a single pixel is one of those steps, and during that step the same render texture cannot both be read from and written to as the output of the shader.
The fragment shader part of your blit shader should never return the main texture; it should just return the brush color, with alpha set to zero everywhere outside the brush shape. The blit is a very direct way to render something to a render buffer - in this case it’s more like drawing a transparent texture into the scene. The main difference between this and, say, a particle effect is that the pen point’s shape is generated in the shader rather than coming from a texture, and the target isn’t a camera’s view but a render texture. Also, because it’s transparent, and the render texture is kept from frame to frame rather than cleared (which is what cameras do by default), the ink should build up over time.
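One consequence of that: nothing ever clears the render texture for you, so if you want the board to start out white you have to clear it once yourself. A minimal sketch, assuming the canvas render texture from earlier:

```csharp
// One-time clear (e.g. in Awake). After this nothing clears the canvas,
// so every blitted stamp accumulates from frame to frame.
var previous = RenderTexture.active;
RenderTexture.active = canvas;
GL.Clear(true, true, Color.white);
RenderTexture.active = previous;
```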
Going back to another issue from your original post: the dotted line problem. I assume you’re sampling the “pen” position in Update. Desktop VR usually runs at 90 Hz, so if you move your hand further than the radius of the pen “point” within ~11 ms you’ll see the gaps. You can work around that either by sampling in FixedUpdate with the fixed timestep set to something ridiculously small, or by detecting when the pen has moved far enough since the previous frame to leave a gap and drawing more points along the line between the two positions. You could even write a shader that takes two points and draws a straight line between them.
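Here’s a sketch of that second approach, assuming the Stamp(uv) method from the earlier snippet and a hypothetical brushRadius field in UV space:

```csharp
public float brushRadius = 0.01f;  // in UV space, matching the shader's radius
Vector2 lastUV;
bool hasLast;

// Call once per frame with the pen's current UV; call ResetStroke()
// whenever the pen lifts off the board.
public void Draw(Vector2 uv)
{
    if (hasLast && Vector2.Distance(lastUV, uv) > brushRadius)
    {
        // The pen moved more than one radius: fill the gap with
        // evenly spaced stamps along the segment.
        int steps = Mathf.CeilToInt(Vector2.Distance(lastUV, uv) / brushRadius);
        for (int i = 1; i <= steps; i++)
            Stamp(Vector2.Lerp(lastUV, uv, i / (float)steps));
    }
    else
    {
        Stamp(uv);
    }
    lastUV = uv;
    hasLast = true;
}

public void ResetStroke() { hasLast = false; }
```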
Yeah, eventually I fixed the issue with linear interpolation between frames, but now I’m wanting to improve the frame rate. I applied the shader from your post to a plane, and I can see a white square on it, but none of the cameras seem to be able to see it, and it still doesn’t leave a trail when I change X/Y.
I’m guessing this is supposed to be wired up to a render texture somehow, but I can’t figure out how.
Graphics.Blit takes a texture and a render texture; I’m not sure how I would blit the output of this shader anywhere. The shader isn’t using a texture or a render texture right now, so how do I blit its output somewhere?
One form of Blit takes a texture and a render texture. Another form takes a texture, a render texture, and a material. That’s the one you need: make a material from that shader and pass it to Blit along with your render texture.
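Roughly like this, assuming the shader name from the earlier sketch:

```csharp
// Make the material once from the brush shader (or assign it in the inspector).
Material brushMaterial = new Material(Shader.Find("Unlit/BrushStamp"));

// Source (a dummy the shader ignores), destination render texture, material.
Graphics.Blit(Texture2D.whiteTexture, canvas, brushMaterial);
```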