Object Position in a surface shader?

The documentation for surface shaders (Unity - Manual: Writing Surface Shaders) doesn’t list any input value for the object-space position (worldPos is listed, but not the object position). I’ve tried “vertex” and “objectPos”, which are both accepted and both produce the same results, but the value appears to be the UV coordinates rather than the object-space coordinates.

v.vertex is the object-space position. You might have to pass it through via a custom vertex shader, though. I don’t think it comes through automatically.

I’ve been able to add a custom vertex program to a surface shader, but I can’t find any indication in the docs or forums of what variable to use to output the object position. (Normally you’d just use a texcoord, but with a surface shader you can’t just use whatever you want.)

Another thing which I’m confused about: the only way I’d been able to get the correct value in my other shader was to use the following two lines of code in the vertex program:
o.worldPos = mul(_Object2World, v.vertex);    // object space -> world space
o.objectPos = mul(_World2Object, o.worldPos); // world space -> back to object space

Then I used objectPos in the fragment shader to provide coordinates for a procedural 3D pattern which remains stable even if the object moves around (i.e. rather than using worldPos, which of course causes the pattern to shift when you move the object). But it doesn’t work if I just use v.vertex instead of the above two calculations, which doesn’t make sense if v.vertex is the object-space position. Why do I need to convert it to world space and then back to object space to get it to work, if v.vertex is already the object-space position?

The surface shader will set up the texcoord for you.

But you should end up with something like this (a very trimmed-down version):

Shader "Custom" {
    Properties {
    }
    SubShader {
        Tags { "RenderType"="Opaque" }
        LOD 200
 
        CGPROGRAM
        #pragma surface surf Lambert vertex:vert 
 
        struct Input {
            float3 objPos;
        };
 
        void vert (inout appdata_full v, out Input o) {
            UNITY_INITIALIZE_OUTPUT(Input,o);
            o.objPos = v.vertex;
        }
 
        void surf (Input IN, inout SurfaceOutput o) { 
            o.Albedo = IN.objPos;
            o.Alpha = 1;
        }
        ENDCG
    } 
}
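
As a side note, writing the position straight into Albedo like this tints the mesh by its object-space coordinates (negative components just clamp to black), which makes for a quick visual sanity check that the values coming through are what you expect.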

Thank you. That works, although I had to take out UNITY_INITIALIZE_OUTPUT(Input,o); because it was generating a syntax error. It seems to work without it… although I don’t know whether it’ll cause problems down the road.
I also had to replace o.objPos=v.vertex with the following two lines (as with my other shader):
o.objPos = mul(_Object2World, v.vertex); // object space -> world space
o.objPos = mul(_World2Object, o.objPos); // world space -> back to object space
This allows the procedural 3D pattern to remain stable even when the object moves (which is why I needed the object-space position), but using just the raw v.vertex value creates a mess: the procedural pattern shrinks to a tiny scale, and it shifts when the object moves. The same thing happened in the other shader I was experimenting with, until I added the above two lines. I have no idea what’s going on, but these two lines work for some reason. Again, I don’t know whether it’ll cause problems later on, especially on other machines.
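
For reference, here’s roughly how those two lines sit inside the vert function from the example above (a minimal sketch, assuming the same float3 objPos Input member, with UNITY_INITIALIZE_OUTPUT omitted per the syntax error mentioned above):

    void vert (inout appdata_full v, out Input o) {
        // Mathematically this round trip is an identity, but per the
        // reports in this thread it works around v.vertex arriving in a
        // combined-mesh space (see the batching reply below).
        float4 worldPos = mul(_Object2World, v.vertex);
        o.objPos = mul(_World2Object, worldPos).xyz;
    }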

I suspect you have batching turned on? That’ll screw with the object space position as it combines and uncombines meshes.
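
If batching does turn out to be the culprit, one alternative to the world-space round trip (assuming your Unity version supports the tag) is to opt the shader out of batching entirely:

    SubShader {
        // "DisableBatching"="True" keeps meshes using this shader out of
        // draw-call batching, so v.vertex stays in true object space.
        Tags { "RenderType"="Opaque" "DisableBatching"="True" }
        ...
    }

Failing that, static and dynamic batching can be switched off globally in the Player Settings.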

That may be. Thank you.

I’m hoping you can answer one more question if possible: why do I get facets instead of smoothed normals when an object is at certain points on the map (such as -1000, -1200, 5000), but not when the same object is near 0,0,0? I also get strange pixelation in the textures and bumpmapping in the same parts of the map, but again not when the same object is near 0,0,0. It doesn’t make any difference which shaders I use on the object; even the built-in Unity shaders produce the same strange pattern.

I don’t have an answer for you, but I can confirm I’m seeing similar artifacts at more extreme coordinates. My shader is a custom vertex/fragment shader with bump mapping, and it’s producing strange texture distortion which flickers aggressively on camera movement when coordinates are in the ranges you describe. It’s comforting to know the built-in shaders do the same; I was worried I’d have to do some seriously nasty debugging. :-/

The only theory I have is that it’s got something to do with floating-point accuracy. It’s the only thing I can think of that would cause numerical instability at large coordinate values where there is none at small ones.
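
A rough back-of-envelope supports that. A 32-bit float has a 24-bit significand, so near a coordinate of 5000 (between 2^12 and 2^13) adjacent representable values are spaced about 2^-11 ≈ 0.0005 units apart, versus around 10^-7 near 1.0. Any shader math that subtracts large, nearly-equal position values (lighting vectors, interpolated normals, texture gradients) loses most of its significant bits out there, which could plausibly show up as exactly this kind of faceting and flicker.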

I just checked the objects that were having this problem, in order to gather more details for this discussion so we could maybe figure out what’s causing it, but for some reason the problem isn’t showing up right now, even though it had been showing up for weeks, every single time, up until this point. Now even my own procedural shaders are producing smooth results (not the slightest bit of pixelation) at extreme close-up range, and the objects which had been showing facets are now showing smoothed normals, even with my own shaders. (The one exception is a single facet on two of the objects, but I think that’s caused by a geometry mistake I made at that one point in the objects.) So now I have even less idea of what’s going on, since this is the first time all of these objects have given accurate results at those specific coordinate ranges.

**** beats head against the wall ****

I think it’s gremlins. Or at least it’s easier to maintain sanity by pretending it’s gremlins and moving on with life…