I would like to combine two render textures (a: video camera image, b: segmentation mask of person in video camera image; coming from ARKit, btw) into one render texture using Graphics.Blit with a material.
How would I go about this? I know that I can do the shader with Amplify Shader Editor, but how would I pipe the textures into the shader with Graphics.Blit?
By default, Blit takes two textures: a source texture and a destination render texture. Behind the scenes it creates a material using a default blit shader and sets the _MainTex property to the source texture.
Alternatively, you can pass in your own material using your own shader with whatever properties you want. You just need to create the material, set its properties, and blit using the overload of Graphics.Blit that takes two textures and a material.
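Something like this, assuming your shader exposes a second texture property (here called _SecondTex, but the property names are up to your shader):

```csharp
using UnityEngine;

public class CombineBlit : MonoBehaviour
{
    public Shader combineShader;               // your custom shader
    public RenderTexture cameraTexture;        // "RenderTexture a"
    public RenderTexture segmentationTexture;  // "RenderTexture b"
    public RenderTexture destination;

    Material combineMaterial;

    void Start()
    {
        combineMaterial = new Material(combineShader);
    }

    void Update()
    {
        // Blit assigns the source texture to _MainTex automatically;
        // every other property has to be set on the material first.
        combineMaterial.SetTexture("_SecondTex", segmentationTexture);
        Graphics.Blit(cameraTexture, destination, combineMaterial);
    }
}
```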
Alternatively, you might look at Graphics.CopyTexture(), which you can use to copy one texture into another if they use exactly the same texture format. And you can copy to a region rather than the entire texture.
@bgolus Thank you very much! In both cases, how would I be able to feed “RenderTexture a” and “RenderTexture b” into it at the same time, as they need to be synced?
Specifically, quoting you: “and blit using the blit function that takes two textures and a material.” Which blit function takes two textures?
Or do I need to call Graphics.Blit twice (once with “RenderTexture a” and once with “RenderTexture b” as the source texture)? But if so, how do I specify the exact destination texture for my custom shader? E.g. a custom shader with two texture properties, left and right?
Understand that most blit functions take two textures: the source and the destination. No blit function takes two input textures, which is what I’m guessing you thought I meant.
As for keeping them “in sync”, there’s nothing you need to do for that really. That just depends on when you call blit and when the other two render textures are rendered to.
That is what the blit function sets when you pass in a second render texture, which is why it’s referred to as the destination texture in the documentation.
However, I’m guessing what you meant to ask was how to set where within that destination texture you render to. That’s down to your custom shader manipulating the UVs and swapping between the two input textures at the midpoint using an if or a lerp.
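For example, a minimal side-by-side sketch (the property names here are arbitrary, and this assumes each half of the output should show the full 0–1 UV range of its input):

```
Shader "Hidden/SideBySideCombine"
{
    Properties
    {
        _MainTex ("Left Texture", 2D) = "white" {}
        _SecondTex ("Right Texture", 2D) = "white" {}
    }
    SubShader
    {
        Cull Off ZWrite Off ZTest Always
        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            sampler2D _SecondTex;

            fixed4 frag (v2f_img i) : SV_Target
            {
                // Remap each half of the output to the full 0-1 range
                // of the corresponding input texture.
                float2 uv = float2(frac(i.uv.x * 2.0), i.uv.y);
                fixed4 left  = tex2D(_MainTex,   uv);
                fixed4 right = tex2D(_SecondTex, uv);
                // Left half shows _MainTex, right half shows _SecondTex.
                return i.uv.x < 0.5 ? left : right;
            }
            ENDCG
        }
    }
}
```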
However I’m going to go back to my original post’s suggestion and say skip using blit entirely and use CopyTexture.
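A rough sketch of what that looks like, assuming a and b share the same texture format and the destination uses that same format at twice the width:

```csharp
using UnityEngine;

public class SideBySideCopy : MonoBehaviour
{
    public RenderTexture a;    // goes into the left half
    public RenderTexture b;    // goes into the right half
    public RenderTexture dest; // same format as a and b, twice as wide

    void Update()
    {
        // Arguments: src, srcElement, srcMip, srcX, srcY, width, height,
        //            dst, dstElement, dstMip, dstX, dstY
        Graphics.CopyTexture(a, 0, 0, 0, 0, a.width, a.height, dest, 0, 0, 0, 0);
        Graphics.CopyTexture(b, 0, 0, 0, 0, b.width, b.height, dest, 0, 0, a.width, 0);
    }
}
```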
Hello, I have a similar question. I want to combine two Texture2Ds with different resolutions. At first I used System.Drawing to draw them into one texture, but after the project was packaged an exception occurred. Now I want to use a render texture to combine the two textures, but the transparent parts of the foreground texture must not cover the background texture. Can you tell me how to do this? Thanks.
What do you mean by System.Drawing, and what’s the exception?
You can either render the base texture to the render texture with a Blit, then Blit the foreground texture over it with a basic alpha blended shader; the mobile transparent shader should work.
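A minimal sketch of the two-pass approach; the shader name below is just one built-in candidate, any alpha blended shader should do:

```csharp
using UnityEngine;

public class LayeredBlit : MonoBehaviour
{
    public Texture2D background;
    public Texture2D foreground; // needs an alpha channel
    public RenderTexture dest;

    Material blendMaterial;

    void Start()
    {
        // Any alpha blended shader should work here; the built-in
        // "Mobile/Particles/Alpha Blended" is one candidate.
        blendMaterial = new Material(Shader.Find("Mobile/Particles/Alpha Blended"));
    }

    public void Combine()
    {
        // First pass fills the render texture with the background;
        // second pass draws the foreground over it with alpha blending.
        Graphics.Blit(background, dest);
        Graphics.Blit(foreground, dest, blendMaterial);
    }
}
```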
Alternatively, you could write a custom shader that takes two textures and lerps between them (which is the same math as alpha blending) and write the output to a render texture with a Blit.
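The fragment function is the only real difference from a plain blit shader; a sketch, assuming the usual vert_img / v2f_img image-effect setup and a hypothetical _ForegroundTex property set on the material beforehand:

```
sampler2D _MainTex;       // background; Blit fills this in
sampler2D _ForegroundTex; // foreground; set on the material beforehand

fixed4 frag (v2f_img i) : SV_Target
{
    fixed4 bg = tex2D(_MainTex, i.uv);
    fixed4 fg = tex2D(_ForegroundTex, i.uv);
    // Lerping by the foreground's alpha is the standard "over" blend.
    return lerp(bg, fg, fg.a);
}
```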
I have a shader with a main and a second texture, and two different RenderTargetIdentifiers. I need to process both of them with a single Blit through that shader. What is the easiest and best way to do it?
I can’t set it with CommandBuffer.SetGlobalTexture. It works if I assign the texture manually.
OK, got it. Removing the texture from the shader’s Properties block made it work.
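(For anyone hitting the same issue: if a texture is declared in the shader’s Properties block, the material’s per-material value overrides the global binding, so SetGlobalTexture appears to do nothing. Declaring the sampler only inside the CGPROGRAM lets the global take effect.) A rough sketch of the command buffer side, with hypothetical names:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class CombineCommandBuffer : MonoBehaviour
{
    public Material combineMaterial;
    public RenderTexture mainSource, secondSource, destination;

    public void Combine()
    {
        var cmd = new CommandBuffer { name = "Combine" };
        // This only takes effect if _SecondTex is NOT listed in the
        // shader's Properties block; a per-material value would
        // override the global binding.
        cmd.SetGlobalTexture("_SecondTex", secondSource);
        cmd.Blit(mainSource, destination, combineMaterial);
        Graphics.ExecuteCommandBuffer(cmd);
        cmd.Release();
    }
}
```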