Here, the cube is rendered with the default HDRP/Lit material, and the sphere with a completely blank HD/Lit ASE shader; the AO comes from the HBAO asset. When I change the sphere's material to a blank Shader Graph shader, the problem goes away.
Hello,
I have a simple sprite shader that ignores white pixels using the Compare node. The problem is that it also discards other colors that are “close to” white. Is there some way to adjust the precision, or to achieve the required result a different way? Thanks in advance!
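For reference, an exact-equality Compare will miss near-white pixels; a distance-based comparison with an adjustable tolerance handles them. A hypothetical HLSL sketch (the `_KeyColor` and `_Tolerance` property names are mine, not ASE's; in the graph this corresponds to a Distance node fed into the clip/opacity mask):

```hlsl
sampler2D _MainTex;
float4 _KeyColor;    // the color to key out, e.g. white
float _Tolerance;    // how "close to" the key color still counts

float4 frag(float2 uv : TEXCOORD0) : SV_Target
{
    float4 col = tex2D(_MainTex, uv);
    // Euclidean distance in RGB space between the pixel and the key color.
    float d = distance(col.rgb, _KeyColor.rgb);
    clip(d - _Tolerance);   // discard pixels within the tolerance radius
    return col;
}
```

Raising `_Tolerance` widens the band of near-key colors that get discarded, which is the precision knob the Compare node lacks.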
I’ve looked at the skybox example, and instead of sampling a texture, I want to sample a specific realtime reflection probe. How can I do this in the built-in renderer?
Hi, I am using Built-in Shaders. How can I have the shadow render differently from the main shader? For example, I have a shader which uses dithering for a transparency effect, but I want the shadow it casts to be solid, not full of holes.
I might be wrong, but I believe checking “Rendering Options > Use Default Shadow Caster” does this. Otherwise, you can always add a Static Switch on the “UNITY_PASS_SHADOWCASTER” #define and have it affect the dithering differently.
Actually, I think you have to edit the resulting shader manually and remove the “addshadow” pragma if you use the first approach (use default shadow caster).
Thanks! Being new to Amplify, could you walk me through it in a bit more detail? For the second approach, I know how to add a Static Switch, but the rest isn’t clear to me.
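In generated-code terms, the Static Switch on UNITY_PASS_SHADOWCASTER boils down to something like this sketch (the function and property names are hypothetical; in ASE the branch is the Static Switch node, and the dither line stands in for whatever your existing dithering nodes produce):

```hlsl
float _DitherThreshold;   // assumed material property driving the dither

void ApplyTransparency(float ditherPattern, float alpha)
{
#ifdef UNITY_PASS_SHADOWCASTER
    // Shadow caster pass: keep the surface effectively solid,
    // so the cast shadow has no dither holes.
    clip(alpha - 0.001);
#else
    // Main passes: dithered transparency as usual.
    clip(alpha - ditherPattern * _DitherThreshold);
#endif
}
```

The Static Switch simply selects which of the two branches ends up in each compiled pass, so the shadow pass never sees the dithered clip.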
I’m having an issue with normal map blending. I created a shader with a trim sheet normal map mapped to UV1 and a detail normal map on UV3, which works out great except that the swizzle of the detail normal seems to be affected by the UV1 coordinates.
I checked and double-checked that the normal map is set up correctly, and then I discovered that if I rotated this area in UV1 by 180 degrees, the normals were fixed:
I’m a little confused by that because these are tangent-space normal maps and I thought the rotation of a UV shouldn’t actually have any effect on the shading. Here’s my node setup:
You can’t rotate a tangent-space map without affecting the lighting.
I’m wondering if one of those maps has incorrect DirectX vs OpenGL normals? Have you tried multiplying the Y-channel with -1 before blending the normals? Or just invert the green channel with levels in photoshop.
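For reference, flipping the green channel in the shader instead of Photoshop is a one-liner on the unpacked normal (a sketch; node-wise it’s a Multiply on the Y component before the Blend Normals node):

```hlsl
// Convert a DirectX-style (-Y) normal map sample to OpenGL-style (+Y).
// 'packed' is the raw texture sample; UnpackNormal is Unity's usual
// helper from UnityCG.cginc.
float3 n = UnpackNormal(packed);
n.y *= -1.0;   // invert the green channel
```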
Thanks for replying. That was my first guess as well, but both maps are +Y so that can’t be it.
I should’ve specified: I didn’t rotate the normal map, I rotated the UVs in UV1, and the normal map that uses UV1 is completely flat there.
That’s what I meant, though: you can’t rotate the UVs without it affecting the map. If you rotate the UVs, you have to rotate the tangent and bitangent by the same amount; otherwise the map won’t be interpreted the same way.
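Equivalently, instead of rotating the tangent frame you can rotate the sampled normal’s XY by the UV rotation angle to compensate. A sketch, assuming the angle is known (the sign depends on your rotation convention, so it may need flipping):

```hlsl
// Rotate a tangent-space normal's XY to compensate for a UV rotation.
// 'angle' is the UV rotation in radians; for the 180-degree case it is
// PI, which simply negates both X and Y while leaving Z untouched.
float3 RotateTangentNormal(float3 n, float angle)
{
    float s = sin(angle);
    float c = cos(angle);
    float2 xy = float2(c * n.x - s * n.y,
                       s * n.x + c * n.y);
    return float3(xy, n.z);
}
```

This is why a 180-degree UV rotation “fixed” the shading: it flipped the X and Y of the interpolated tangent frame to match what the map expected.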
Not yet, hopefully in an upcoming update. (we’re a bit behind, shorthanded)
That depends: is this reflection probe affecting the object that will use the shader?
You might need something custom to assign the texture to your shader if using an Indirect Specular Light node is not feasible.
Not necessarily, what exactly do you intend to automate?
Not sure, feel free to send a sample to support@amplify.pt