I have a Material that uses a custom Shader that I wrote. The shader is mainly about setting o.Albedo. In Unity I have a Mesh Renderer component with the Material above. I try to save the color (or texture) of the Mesh (the Mesh Filter only holds the mesh data, not the color) by reading “sharedMaterial.mainTexture”, but it is always null. Is there a way to save the color (or texture) of the Material with the custom Shader?
Here is the main part of my Shader:
void surf (Input IN, inout SurfaceOutputStandard o) {
    float heightPercent = inverseLerp(minHeight, maxHeight, IN.worldPos.y);
    float3 blendAxes = abs(IN.worldNormal);
    blendAxes /= blendAxes.x + blendAxes.y + blendAxes.z;

    for (int i = 0; i < layerCount; i++) {
        if (baseTextureActive[i]) {
            float drawStrength = inverseLerp(-baseBlends[i] / 2 - eps, baseBlends[i] / 2, heightPercent - baseStartHeight[i]);
            float3 baseColour = baseColor[i] * baseColorStr[i];
            float3 textureColor = triplanar(IN.worldPos, baseTextureScale[i], blendAxes, i) * (1 - baseColorStr[i]);
            // blend this layer's colour into the albedo based on height
            o.Albedo = o.Albedo * (1 - drawStrength) + (baseColour + textureColor) * drawStrength;
        }
        else
            continue;
    }
}
As you can see, I only set the o.Albedo value, nothing else.
P.S.: English is not my native language, so sorry for the inconvenience.
Yes, but also no.
First the no.
The shader never sets a texture on the material. In fact, it's not possible for shaders to do that. The material you access from C# is the CPU-side data that's being sent to the GPU. The GPU then runs that shader for each vertex and visible pixel and finally outputs a color to the current render target (usually the screen). It then throws all that work away and does it again the next frame. Normally the CPU never gets any of that data.
So you're never creating or writing to a texture that can be mapped back onto the mesh, just calculating the color for that position on screen.
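A side note on why “sharedMaterial.mainTexture” comes back null: as far as I know, Material.mainTexture simply returns whatever texture is assigned to the shader's _MainTex property (or a property marked with the [MainTexture] attribute). A surface shader that only computes o.Albedo procedurally never declares such a property, so there is nothing to return. A minimal sketch to confirm that, assuming the renderer holds your material:

    using UnityEngine;

    public class MainTextureCheck : MonoBehaviour
    {
        void Start()
        {
            // sharedMaterial is the CPU-side asset; it only stores values for
            // properties the shader actually declares.
            Material mat = GetComponent<MeshRenderer>().sharedMaterial;

            // mainTexture maps to "_MainTex". With a purely procedural shader
            // this is expected to log "has _MainTex: False" and a null texture.
            Debug.Log("has _MainTex: " + mat.HasProperty("_MainTex"));
            Debug.Log("mainTexture: " + mat.mainTexture);
        }
    }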
Now the yes.
You can use a shader to generate a texture for an object, but there’s a lot of setup you need to do first, not to mention you need an even more custom shader than you’re currently using.
First you need a mesh with unique UVs for the entire surface. No overlaps. Lots of art tools have functions for generating these, or you could use Unity’s auto generated lightmap UVs in the mesh import settings.
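If you'd rather generate those UVs from script than tick “Generate Lightmap UVs” in the import settings, the editor has a utility for it. A rough editor-only sketch (the menu path and class name are just placeholders):

    using UnityEngine;
    using UnityEditor;

    public static class UniqueUVGenerator
    {
        [MenuItem("Tools/Generate Unique UVs For Selected Mesh")]
        static void Generate()
        {
            MeshFilter mf = Selection.activeGameObject.GetComponent<MeshFilter>();
            Mesh mesh = mf.sharedMesh;

            // Writes a non-overlapping unwrap into the mesh's uv2 channel,
            // the same data the lightmapper uses.
            Unwrapping.GenerateSecondaryUVSet(mesh);
        }
    }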
Then you need a custom shader that uses those UVs to render the mesh unwrapped into screen space instead of using the actual vertex positions; essentially, the vertex shader outputs something like float4(uv * 2 - 1, 0, 1) as the clip-space position (possibly with the Y flipped, depending on the platform) rather than the usual object-to-clip transform. You'll still need to calculate the other vertex data the surface shader uses (world position, world normal), then calculate that albedo color and output it.
Lastly, you need a render texture that you manually render your mesh into using that shader. That gives you a texture you can copy back to the CPU side with ReadPixels() into a Texture2D, and you can then use that texture in a normal shader instead of calculating everything.
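Put together, the baking pass could look roughly like this. It's a sketch, not drop-in code: unwrapMaterial is assumed to use the hypothetical unwrap shader described above, and the resolution and formats are arbitrary:

    using UnityEngine;

    public class AlbedoBaker : MonoBehaviour
    {
        // Material using the hypothetical unwrap shader that positions the mesh
        // by its unique UVs instead of its vertex positions.
        public Material unwrapMaterial;
        public MeshFilter target;

        public Texture2D Bake(int size = 1024)
        {
            // Temporary GPU target to render the unwrapped mesh into.
            RenderTexture rt = RenderTexture.GetTemporary(size, size, 0, RenderTextureFormat.ARGB32);
            RenderTexture previous = RenderTexture.active;

            Graphics.SetRenderTarget(rt);
            GL.Clear(true, true, Color.clear);

            // Draw the mesh immediately with the unwrap shader. The matrix still
            // matters because the shader derives world position/normal from it.
            unwrapMaterial.SetPass(0);
            Graphics.DrawMeshNow(target.sharedMesh, target.transform.localToWorldMatrix);

            // Copy the result back to the CPU side.
            Texture2D baked = new Texture2D(size, size, TextureFormat.RGBA32, false);
            baked.ReadPixels(new Rect(0, 0, size, size), 0, 0);
            baked.Apply();

            RenderTexture.active = previous;
            RenderTexture.ReleaseTemporary(rt);
            return baked;
        }
    }

From there you can assign the baked Texture2D to a regular material, or save it to disk with EncodeToPNG() if you want an asset.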
TLDR: texture baking is the term you want to search for.