Problem with passing UVs to shader

My goal is to pass a texture from a Texture2DArray, _MainTexArray, to my shader to be rendered. To do this, I pass a ComputeBuffer containing the UVs to the shader with SetBuffer(), as well as the texture array through SetTexture(). The code renders, but the result is a single flat color, which I am guessing is one of the pixels of my texture array rather than the texture I am trying to render.

Right now I am stumped. What I think is happening is that I am passing only one UV to the shader, so it only samples that one specific point of the texture. However, I do not know how I could pass the appropriate four UVs to the shader (or whether my shader is even written correctly). bgolus please save me
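For context, here is a minimal sketch of the CPU-side setup described above. The names `mesh`, `material`, `textureArray`, `matrices`, and `result` are assumptions, not taken from my actual code:

```csharp
// Sketch of the setup (assumed names): upload the per-instance data,
// bind the buffer and texture array, then draw the instances.
ComputeBuffer propertiesBuffer = new ComputeBuffer(result.Length, MeshProperties.Size());
propertiesBuffer.SetData(result);

material.SetBuffer("_Properties", propertiesBuffer);
material.SetTexture("_MainTexArray", textureArray);

// Called every frame, e.g. from Update():
Graphics.DrawMeshInstanced(mesh, 0, material, matrices);
```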

Shader "Unlit/CustomShader"
{
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata {
                float4 vertex : POSITION;
                float3 uv : TEXCOORD0;
            };

            struct v2f {
                float4 vertex : SV_POSITION;
                float3 uv : TEXCOORD0;
            };

            struct MeshProperties {
                float4x4 mat;
                float3 uv;
            };

            StructuredBuffer<MeshProperties> _Properties;
            UNITY_DECLARE_TEX2DARRAY(_MainTexArray);

            v2f vert(appdata i, uint instanceID : SV_InstanceID) {
                v2f o;
                float4 pos = mul(_Properties[instanceID].mat, i.vertex);
                o.vertex = UnityObjectToClipPos(pos);
                o.uv.xy = _Properties[instanceID].uv.xy;
                o.uv.z = _Properties[instanceID].uv.z;
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                return UNITY_SAMPLE_TEX2DARRAY(_MainTexArray, i.uv);
            }

            ENDCG
        }
    }
}

This is the declaration of the MeshProperties struct:

private struct MeshProperties
{
    private readonly Matrix4x4 _mat;
    private readonly Vector3 _texture;

    public MeshProperties(Matrix4x4 matrice, Vector3 textureUV)
    {
        _mat = matrice;
        _texture = textureUV;
    }

    public Matrix4x4 Mat { get { return _mat; } }
    public Vector3 TextureUV { get { return _texture; } }

    // Stride of one element in the StructuredBuffer, in bytes.
    public static int Size()
    {
        return
            sizeof(float) * 4 * 4 +   // 4x4 matrix
            sizeof(float) * 3;        // float3 uv
    }
}

And this is the initialization:

            result[index] = new MeshProperties(
                Matrix4x4.TRS(
                    new Vector3(
                        xValue,
                        Mathf.CeilToInt(Mathf.PerlinNoise((xValue + 0.1f) * .1f, (index + 0.1f) * .1f) * 10),
                        index),
                    Quaternion.Euler(90f, 0, 0),
                    Vector3.one),
                new Vector3(1, 0, 0));

So I found what was wrong, but I also ran into a new issue.

For the original problem, I don’t fully understand it, but it seems the shader does not require UVs to be passed to it separately; instead it derives them from the mesh supplied to it and the positions of that mesh’s vertices. Accordingly, the vertex shader should read:

v2f vert(appdata i, uint instanceID : SV_InstanceID) {
    v2f o;
    float4 pos = mul(_Properties[instanceID].mat, i.vertex);
    o.vertex = UnityObjectToClipPos(pos);
    o.uv.xy = i.vertex.xy;
    o.uv.z = _Properties[instanceID].index;
    return o;
}

The main script now only provides the index into the texture array instead of a UV. However, while we do get textures now, we also get stretching on the sides of the cube:

I am guessing this is because the UV is defined only in the XY plane. However, I am unsure how to resolve this. Changing to a surface shader seems like an option, but it seems like it would be easier to somehow give the sampler the XZ and ZY UVs as well.
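One way to get XZ and ZY projections without changing the mesh would be triplanar sampling: sample the array once per world plane and blend the results by the surface normal. This is only a sketch; it assumes the v2f struct is extended with an object-space position (`objPos`) and a normal (`normal`), neither of which the shader above currently carries:

```hlsl
// Triplanar sketch: assumes v2f has been extended with
// float3 objPos and float3 normal (not in the original shader).
fixed4 frag(v2f i) : SV_Target
{
    float3 blend = abs(i.normal);
    blend /= blend.x + blend.y + blend.z;   // weights sum to 1

    fixed4 xy = UNITY_SAMPLE_TEX2DARRAY(_MainTexArray, float3(i.objPos.xy, i.uv.z));
    fixed4 xz = UNITY_SAMPLE_TEX2DARRAY(_MainTexArray, float3(i.objPos.xz, i.uv.z));
    fixed4 zy = UNITY_SAMPLE_TEX2DARRAY(_MainTexArray, float3(i.objPos.zy, i.uv.z));

    // Weight each projection by how much the normal faces its axis.
    return xy * blend.z + xz * blend.y + zy * blend.x;
}
```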

Fixed. The issue goes back to my misunderstanding of how UVs are passed to the shader. The shader receives the UVs along with the mesh when Graphics.DrawMeshInstanced is called, so there is no need (at least for this use case, where the orientation of the texture is not an issue) to provide UVs to the shader through other means. The correct code looks like this:

v2f vert(appdata i, uint instanceID : SV_InstanceID) {
    v2f o;
    float4 pos = mul(_Properties[instanceID].mat, i.vertex);
    o.vertex = UnityObjectToClipPos(pos);
    o.uv.xy = i.uv.xy;
    o.uv.z = _Properties[instanceID].index;
    return o;
}
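Since the shader now reads `_Properties[instanceID].index`, both sides of the buffer have to match: the HLSL struct becomes `struct MeshProperties { float4x4 mat; float index; };`, and the C# struct and its Size() shrink accordingly. A sketch of the matching C# side (field and parameter names are my assumptions):

```csharp
// Matching C# struct: the float3 UV is replaced by a single float
// holding the texture-array slice index. Stride drops to 68 bytes.
private struct MeshProperties
{
    private readonly Matrix4x4 _mat;
    private readonly float _index;

    public MeshProperties(Matrix4x4 matrice, float index)
    {
        _mat = matrice;
        _index = index;
    }

    public static int Size()
    {
        return sizeof(float) * 4 * 4 +   // 4x4 matrix
               sizeof(float);            // index
    }
}
```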