Matching "conveyor belt" movement speed with UV animation speed

I’m experimenting with some prototype concepts and right now I’m building a very simple conveyor belt.

Here are the components I have:

  • A plane with a box collider

  • A C# script which uses OnCollisionStay to move any objects which have landed on the conveyor belt

  • A textured material with a basic shader that uses _Time to animate the UV of the conveyor belt texture

So as you can see, the conveyor belt is just a flat surface, and the shader creates the illusion of a moving tread surface.

My script has a speed variable, and right now I am essentially just translating object transform positions by speed * Time.deltaTime in the forward direction of the conveyor belt.
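
In rough sketch form (class and variable names simplified), the mover is essentially this:

using UnityEngine;

// Rough sketch of the current approach: anything resting on the belt gets
// nudged along the belt's forward axis by directly translating its transform.
public class ConveyorMover : MonoBehaviour
{
    public float speed = 2.0f;

    private void OnCollisionStay(Collision collision)
    {
        collision.transform.position += transform.forward * speed * Time.deltaTime;
    }
}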

My shader’s fragment method calculates the pixel value with the following line:
fixed4 col = tex2D(_MainTex, i.uv + float2(0, _Time.y));
This works surprisingly well for so little work. However, I have one glaring “bug”. The UV animation of the material moves at a different speed than the objects that are standing on it. Obviously the values I have currently are rather arbitrary, but I’m trying to figure out how to make sure the image moves at the same speed as the objects in space.

Is there a way to synchronize the “movement” speed in my C# script with the “animation” speed in the shader? I’ve looked at the Unity documentation for _Time. I tried replacing it with unity_DeltaTime, but that ended up not doing anything (the image wouldn’t move at all). I’m aware of methods such as Material.SetFloat, but it’s a question of what to pass in if I were to use it.

I understand that _Time.y should be equivalent to Time.timeSinceLevelLoad, but seeing as how that’s an ever-increasing value, I don’t think I can use it in my movement speed calculations.

Is this doable at all, or would I need a different animation approach entirely, one that runs on the main CPU thread?

P.S. As this is primarily just a personal exercise for my own edification, I specifically would like an answer that involves shader code, not Shader Graph – unless it’s not possible in shader code at all 🙂

Caveat that I’m a relative newbie in the world of shaders, but I hope I can offer something helpful at least:

First:

I’m nearly certain that shader code can do everything Shader Graph can, and more, with the latter having only a subset of the capabilities of the former.

As for your actual problem,

Your problem stems from the fact that, inside your shader, you don’t know how big your object is, so you have no way to map one unit in UV space to one unit in Unity’s world space. That makes it difficult to animate texels at the same speed as an object moving through the scene. So maybe what you need is to pass in the world-space size of your belt (is it a single square tile of a belt?). Then you can do a calculation like:

i.uv + float2(0, (_Time.y * MovementSpeed) / RealWorldSize)

My brain might not be working correctly right now, so I’m not sure that calculation is correct, but the gist is that you need to factor in the size of the object somehow in order to properly convert from Unity world space coordinates to UV coordinates.
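
On the C# side, feeding those numbers into the material might look something like this – note that _RealWorldSize and _MovementSpeed are placeholder property names you’d have to declare in your shader yourself:

using UnityEngine;

// Hypothetical setup: measure the belt's world-space length and hand it, along
// with the movement speed, to the shader so the scroll can be converted from
// world units into UV units. The property names are made up and would need
// matching declarations in the shader's Properties block.
public class BeltShaderSetup : MonoBehaviour
{
    public float speed = 2.0f;

    private void Start()
    {
        MeshRenderer meshRenderer = GetComponent<MeshRenderer>();
        float realWorldSize = meshRenderer.bounds.size.z; // belt length in world units
        meshRenderer.material.SetFloat("_RealWorldSize", realWorldSize);
        meshRenderer.material.SetFloat("_MovementSpeed", speed);
    }
}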

I get what you’re saying. Shortly after posting I started to mentally go down the same path.

As for figuring out the “RealWorldSize” there are a number of factors to take into account:

  • actual length of the conveyor object in the scene: e.g. GetComponent<Renderer>().bounds.size.z

  • the conveyor’s transform scale (affects the above, obviously)

  • the pixel dimensions of the material’s texture (256×256 in my case) – probably not relevant for the movement calculations, but could be useful in the shader

  • probably others

It seems like the biggest hurdle to getting a dynamic and reliable solution would probably be scaling. If I wanted to have one conveyor belt that’s longer than the other and I achieve this by stretching the transform’s scale, that would screw with whatever space-time formula I had.
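
If I do go down that road, my best guess for a scale-proof measurement would be the local mesh bounds multiplied by the lossy scale, something like:

using UnityEngine;

// Guess at a scale-proof length measurement: the local mesh bounds don't change
// when the transform is stretched, so multiplying by lossyScale gives the belt's
// length along its own forward (Z) axis no matter how the transform is scaled.
public class BeltLength : MonoBehaviour
{
    public float LengthAlongForward()
    {
        Mesh mesh = GetComponent<MeshFilter>().sharedMesh;
        return mesh.bounds.size.z * transform.lossyScale.z;
    }
}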

For what it’s worth, I’ve discovered that the same animation effect can be achieved entirely in the C# script by just incrementing the Y value of the offset vector on the material. I’m sure it’s less performant than doing it all in the shader, but while I’m just trying to figure out the magical formula (if one exists) this does make things a little easier from an experimentation standpoint. And by playing with the Tiling vector (Material.mainTextureScale) I’ve at least got a solution for making the texture tile correctly based on the plane transform’s scale… but I digress.
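
In sketch form, that experiment looks roughly like this (Mathf.Repeat does the same wrap-around job as frac() would in the shader):

using UnityEngine;

// Sketch of the CPU-side experiment: scroll the material's offset every frame
// and derive the tiling from the plane's scale. The speed here is still an
// arbitrary scroll rate, not yet matched to world-space movement.
public class BeltScrollExperiment : MonoBehaviour
{
    public float scrollSpeed = 0.5f;

    private Material material;

    private void Start()
    {
        material = GetComponent<MeshRenderer>().material;
        // tile along V so stretching the plane doesn't visibly distort the texture
        material.mainTextureScale = new Vector2(1f, transform.localScale.z / transform.localScale.x);
    }

    private void Update()
    {
        Vector2 offset = material.mainTextureOffset;
        offset.y = Mathf.Repeat(offset.y + scrollSpeed * Time.deltaTime, 1f);
        material.mainTextureOffset = offset;
    }
}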

I think there is just no way to convert UV scroll speed to world movement speed or vice versa - and if there is, it probably won’t scale in a clean/predictable way with mesh sizes.

So, if somebody were making a conveyor belt in their Unity game, they would either put together a more advanced solution for animating the mesh itself, or, if they went the scrolling route, perhaps they just wouldn’t worry about this discrepancy.

Aaaaaand I just found a solution after all. It turns out my problem had to do with HOW I was moving the objects that land on the conveyor belt.

Even though I knew that ultimately the “correct” movement would involve using physics, I was simply moving objects in OnCollisionStay by directly updating the transform’s local position in the conveyor’s forward direction.

Well, evidently this is not only a poor way of moving objects in a scene (especially when you can assume they’ll be physics-based), it also makes it extremely hard to match simple transform translation with UV scrolling.

Switching to Rigidbody operations for the object movement made everything magically “Just Work”.

I found this Rigidbody technique for moving objects on the conveyor belt mentioned by multiple people online, while the visual animation part kept getting handwaved or ignored, as if nobody online had ever had to solve this puzzle… turns out they didn’t need to.

Here’s the entirety of my code:

using UnityEngine;

public class ConveyorPhysics : MonoBehaviour
{
    public float speed = 2.0f;

    private Material material;
    private Rigidbody rigidbody;
    private Vector2 uvOffset = Vector2.zero;

    private void Start()
    {
        material = GetComponent<MeshRenderer>().material;
        rigidbody = GetComponent<Rigidbody>();

        // dynamically sets the "tiling" of the material texture based on transform stretching
        // important for the scroll speed to look correct
        if (transform.localScale.x == 0.0f) return;
        float scaleRatio = transform.localScale.z / transform.localScale.x;
        material.SetTextureScale("_MainTex", new Vector2(1, scaleRatio));
    }

    void FixedUpdate()
    {
        // this Rigidbody technique results in the object not moving but objects touching it react as though it did
        rigidbody.position -= transform.forward * speed * Time.deltaTime;
        rigidbody.MovePosition(rigidbody.position + transform.forward * speed * Time.deltaTime);

        // scroll the texture
        uvOffset += new Vector2(0, speed * Time.deltaTime);
        if (uvOffset.y > 1.0f)
        {
            // keep the "V" value between 0 and 1 using Math.Truncate. same idea as using frac() in the shader code
            uvOffset.y -= (float) System.Math.Truncate(uvOffset.y);
        }
        material.SetTextureOffset("_MainTex", uvOffset);
    }
}

As you can see I’m not doing any crazy math to get the animation of the UV to equal some magic calculated speed… I’m using the exact same “Forward * speed * Time.deltaTime” calculation that the physics is using to move objects, but unlike with moving objects by translating transform.position, THIS JUST WORKS.

Credit to this Gamasutra post by Mark Hogan and this YouTube video by Shahzod Boyxonov for the Rigidbody formula.
