Shader Graph - Getting local sprite UV from sprite sheet

When making a shader in Shader Graph that is applied to a sprite sheet texture, I often want to do things with the UV. The problem is that the UV coords (0-1) cover the entire sprite sheet, as opposed to just the local UV coords of the current sprite that is being displayed.

Example:
Create a shader that simply clips the sprite at the halfway point (only displaying half of it). I would just take the UV, then add a Step node and multiply the result by the _MainTex alpha channel.

Does anyone know if Shader Graph (or Unity 2019 in general) has a built-in function/node for this?
I know a common method is to calculate this yourself from the sprite rect information and pass it to the shader via properties. I was hoping I wouldn’t have to do this every time I want a shader with local UV logic applied to sprite sheet textures, since it can get tedious.
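
For reference, the kind of per-sprite setup I mean (and would like to avoid repeating) is roughly this - just a sketch, where _SpriteRect is a placeholder property I would declare in the graph, and textureRect assumes the sprite isn’t tight-packed:

    // Rough sketch of the manual approach: push the sprite's rect (normalized
    // into 0-1 sheet UV space) into the material so the graph can remap UVs.
    // "_SpriteRect" is a placeholder Vector4 property declared in the graph.
    using UnityEngine;

    [RequireComponent(typeof(SpriteRenderer))]
    public class SpriteRectToShader : MonoBehaviour {
        void Start() {
            var sr = GetComponent<SpriteRenderer>();
            Rect r = sr.sprite.textureRect;   // pixel rect within the sprite sheet
            Texture tex = sr.sprite.texture;
            sr.material.SetVector("_SpriteRect", new Vector4(
                r.x / tex.width, r.y / tex.height,
                r.width / tex.width, r.height / tex.height));
            // In the graph: localUV = (UV - _SpriteRect.xy) / _SpriteRect.zw
        }
    }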

Thanks!

I had a similar problem when doing this: http://sbcgames.io/sword-fortune-wip-waving-foliage-shader/
In the end I encoded the needed information into a secondary texture (in my case into the red and green channels).
You can encode the distance from the bottom in the range 0-1 (0 for the bottom of your sprite, 1 for the top). Then you can sample this texture at the same UV as the main texture to get how high the current fragment is above the bottom.

Interesting - so do you generate the secondary texture via a C# script at runtime, or is this something that is done manually?

It is generated by a helper C# script, but not at runtime. With the helper script I create a texture, which I import as an asset and use as the secondary texture for the sprite. It is ready to work with atlases/spritesheets. Unfortunately, atlasing of secondary textures is not supported by Unity yet - they plan it for 2020.2 ( 2019.2.0b1 Secondary Textures + SpriteAtlas )

To create and save the texture I use something like this (with the example code below I created a texture that encodes sin/cos values for fast reading in the fragment shader - just see how to access the pixels and create whatever you need):

    // Requires: using UnityEngine; using System.IO;
    public void CreateSinCosTexture() {
        int width = 1024;
        Texture2D tex = new Texture2D(width, 1, TextureFormat.RGBA32, false);

        for (int i = 0; i < width; i++) {
            // Map the pixel index to an angle in degrees, then radians
            float angle = (float)i / width * 360;
            float rad = angle * Mathf.Deg2Rad;

            float sin = Mathf.Sin(rad);
            float cos = Mathf.Cos(rad);

            // Remap from [-1, 1] to [0, 1] so the values fit in a color channel
            sin = (sin + 1) / 2;
            cos = (cos + 1) / 2;

            // Pack sin into the red channel and cos into the green channel
            Color32 color = new Color32(
                (byte)(Mathf.RoundToInt(sin * 255) & 0xFF),
                (byte)(Mathf.RoundToInt(cos * 255) & 0xFF),
                0, 255);

            tex.SetPixel(i, 0, color);
        }

        // Encode texture into PNG
        byte[] bytes = tex.EncodeToPNG();
        DestroyImmediate(tex);

        // Write to a file
        File.WriteAllBytes("D:" + Path.DirectorySeparatorChar + "SinCos.png", bytes);
    }

If you know your sprite size and count (i.e. via a script or hardcoded), you should be able to simply calculate the UV coordinates for the corresponding sprites. You can calculate these in the Awake function or something like that, so there won’t be much overhead at runtime.

I did this for a rain animation that was fixed to the camera. I made a Shader Graph with a UI shader, so if you’re using a sprite shader, it might be different for you.

For example, if you have 10x10 square sprites of the same size, the first sprite will be x(0-0.1), y(0-0.1) and so on. With differently sized textures you’ll have to hardcode a bit more stuff, but it should be possible.
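
Roughly, that arithmetic looks like this (just a sketch - the method name is mine, nothing Unity-specific beyond Vector4, and it needs using UnityEngine;):

    // For an even grid of sprites, the UV rect of a cell is simple arithmetic.
    // Returns (xMin, yMin, width, height) in 0-1 sheet UV space.
    Vector4 GetCellUVRect(int col, int row, int columns, int rows) {
        float w = 1f / columns;   // cell width in UV space
        float h = 1f / rows;      // cell height in UV space
        // e.g. cell (0, 0) of a 10x10 grid -> (0, 0, 0.1, 0.1)
        return new Vector4(col * w, row * h, w, h);
    }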

The secondary texture solution looks nicer, although it seems like a lot of work to get the pipeline going - you’d need to create some tool to generate those secondary textures, imo. Otherwise, hardcoding is probably less work.

Thanks for posting the code, I understand now how you are doing this!
Interesting solution - it looks like it could be pretty easily integrated into an automatic sprite importer, or however you do it.

Definitely will look into this!

Yeah, I saw that Shader Graph actually has a node called “Flipbook” that I think does what you are saying (splits a spritesheet into even segments given its dimensions). My main problem with this is that the sprite sheets I use most likely won’t be exact tiles (they have different dimensions). But that would be a good solution for tile-based sprites.

Thanks for the reply!

I’m not 100% sure what your issue is at this point.
Are you trying to apply algorithms within the shader that assume UVs of 0-1, so using fractions of that range would be a problem? Or is it something different?

You could, for example, add a script that contains the texture and information about each sprite within it. The information would be the min-max range of every single sprite. If you input this range to the shader, it can normalize the values to be 0-1 and then correctly apply the algorithms.
Just an idea.

Yeah pretty much what I am trying to do.

I’ve done this solution before with a script that just feeds the shader the current sprite texture information. It certainly isn’t hard and it’s manageable. (Tom-Atom’s solution is also very elegant!)

I was just wondering if this was already a built-in feature. It doesn’t really make sense to me why this isn’t supported in the SpriteRenderer component, as I would imagine the sprite renderer already has the necessary information to do this.

I appreciate the reply!

Problems I see here are:

  • if you use an atlas, you never know where the sprite will be placed. You would have to either use your own atlas solution or somehow hook into Unity to check when the atlas is rebuilt, and then update your sprite metadata,
  • inputting sprite metadata into the shader would probably lead to breaking batches - if you pass it as uniforms on a per-sprite basis. If you do not want to break the batch, then you have to pass it as per-vertex data, but the problem is that SpriteRenderer does not allow you to pass additional data. You can only do a “hack” and encode some extra per-vertex data into rotation X, Y or position Z, because we are in 2D (you of course have to clear this in the vertex shader). A secondary texture is another “hack” - you can pass per-fragment data with it and more; it is fast, as there is no extra calculation, only a texture lookup.

I think that SpriteRenderer has no information about the position within the texture other than the UVs on the vertices. The problem is that in the vertex shader, each vertex is processed individually, and you do not know whether it is a vertex at the top, at the bottom or in the middle. You cannot ask for the sprite’s topmost vertex, etc., and the vertex shader does not even need it. So, if you wanted to access some sprite “bounds” in the vertex shader, you could either pass them as a uniform … but it will break the batch … or pass them as additional per-vertex data (with the problems described above) - then the bounds are included in each vertex and you can access them in the vertex shader.
Even if you passed extra vertex data into the vertex shader, there is currently the problem that Shader Graph does not support custom varyings (interpolators for passing information from the vertex into the fragment shader). But you can use them if writing the shader in code. It makes a big difference whether you calculate something in the vertex shader or in the fragment shader. If your sprite is two triangles, then the vertex shader has to process 6 vertices and can pass the interpolated result into the fragment shader. If you do the same calculation in the fragment shader, then for a 100x100 pixel sprite the same calculation can be repeated 10,000 times.

@Tom-Atom I’m guessing you’re talking about this component?

I’ve never used it before and I always do the atlas packing myself, since I’ve never had to optimize enough to need it to be automated. If you’re using that component my solution won’t work, you are correct.

Though I do believe there are programs that do the sprite packing for you and give you back an atlas in the form of a normal, compressed image (e.g. png, jpg, …), allowing you to still know exactly where and how to access the UVs of single sprites.

As I said, your solution is nicer but more work to set up. Your points on batches breaking, etc. are probably true. It depends on the problem and how much work one is willing to put in - “Premature optimization is the root of all evil”

I guess the reason I thought the sprite renderer should have this information is: how else would it know which sprite to render from an entire sprite sheet? I assume there’s some way behind the scenes that the sprite renderer displays only a segment of the sprite sheet (since, like you said, the shader doesn’t have access to this). I have no idea how this works btw, so I’m just shooting out hypotheticals.

Still sucks though, because this seems like something that everyone working with sprite sheets and shaders would run into, would they not? I guess everyone just does their own “hacky” fixes to get this to work. Which leads to the question: if everyone is writing their own solution to get this to work, why not at least explore this as a feature (in the form of a component, engine code, etc.)?

Thanks for your detailed reply it was informative.

Each sprite is similar to a 3D mesh, but it is flat. By similarity I mean: it is a mesh that has a position for each vertex in object space and also a UV coordinate into the texture. These are fixed, and I believe it is Unity’s job to (re)calculate them when you decide to use an atlas. By the time it gets into the game, it is already calculated. (I will be happy if someone with better insight corrects or adjusts this.)

Anyway, you can read both the mesh vertex positions and the UVs through code. See Sprite: https://docs.unity3d.com/ScriptReference/Sprite.html - there are vertices, uv and triangles. These are passed to the shader. The shader does not need anything other than the UV to find the position in the texture. As vertices in the shader are processed individually, without the possibility of looking at any other vertex, you do not know whether a vertex with UV.y = 0.2456 is at the top, bottom or middle of the sprite - it is just one vertex of one of the triangles that form the whole mesh.
If you want some information, like the top of the sprite in the atlas, then you have to iterate through Sprite.uv on the C# side and pass this information into the shader as a uniform (see the sketch below). Such information is available to all vertices in the vertex shader and to all fragments in the fragment shader. But you have to change it for each sprite, and it breaks the batch … 1 sprite = 1 draw call.
Or do some “hack” and pass this information along with the position and UV of each vertex. Then this information is passed for every vertex (so, if you have 20 vertices, it is passed 20 times), but each individual vertex, which does not know anything about the other vertices, can use it. So, the main problem is that you can’t simply add additional per-vertex data with SpriteRenderer (which you can do with MeshRenderer).
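
To illustrate the uniform variant (a minimal sketch - _SpriteUVRect is a placeholder property you would declare in your shader, and as said above it costs you the batch):

    using UnityEngine;

    // Minimal sketch: compute the sprite's UV bounds in the atlas and pass
    // them to the shader as a uniform. Per-sprite uniforms break batching -
    // 1 sprite = 1 draw call, as noted above.
    [RequireComponent(typeof(SpriteRenderer))]
    public class SpriteUVBounds : MonoBehaviour {
        static readonly int SpriteUVRect = Shader.PropertyToID("_SpriteUVRect");
        MaterialPropertyBlock mpb;

        void LateUpdate() {
            var sr = GetComponent<SpriteRenderer>();
            if (sr.sprite == null) return;

            // Find min/max UV over all vertices of the sprite mesh.
            Vector2 min = new Vector2(float.MaxValue, float.MaxValue);
            Vector2 max = new Vector2(float.MinValue, float.MinValue);
            foreach (Vector2 uv in sr.sprite.uv) {
                min = Vector2.Min(min, uv);
                max = Vector2.Max(max, uv);
            }

            // Pass as (xMin, yMin, width, height); in the shader the local UV
            // is then (uv - _SpriteUVRect.xy) / _SpriteUVRect.zw.
            if (mpb == null) mpb = new MaterialPropertyBlock();
            sr.GetPropertyBlock(mpb);
            mpb.SetVector(SpriteUVRect, new Vector4(min.x, min.y, max.x - min.x, max.y - min.y));
            sr.SetPropertyBlock(mpb);
        }
    }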

Yeah, Unity has these kinds of issues, especially when it comes to 2D, sadly… The irony is that Unity is said to be the best 2D engine out there. For a new project I might consider making everything 3D; with an orthographic camera and planes with textures instead of sprites you would get the same look and feel, but maybe more Unity features to work with…
I’ve gone through a lot of issues with things that would seem obvious, but they are at least doing frequent engine updates, so maybe there will be a point in time where we won’t have to do workarounds for everything…

Oh, I understand now why it isn’t as easy as I thought for the sprite renderer to just make this work. Thanks, it’s always useful to learn more about how sprites work under the hood so I can understand their limitations better.

I wonder if we can request a feature for sending the original sprite UVs as a second set of texture coordinates. This would help a lot of shaders that have to work with atlas-packed sprites. I pretty much always run my head into this when writing shaders.
A secondary texture with UV lookups will work, of course (once they make atlases for secondary textures), but it will have worse performance in many ways.

Just ran into this exact problem too.
I have sprites in an atlas, and a shader needs to know the per-sprite top and bottom UVs.

Having this information passed by the SpriteRenderer Component as TEXCOORD3 or something would save so much work and make everything much cleaner.

I don’t understand how _MainTex (a sprite in an atlas) is getting rendered correctly, because it shares the same UV coords as the second texture I want to use in the Shader Graph.

So the information about the cropped UV space must be in the render pipeline somewhere. It would be awesome to get access to that cropped UV space… it’s a problem I’ve been struggling with for days now :confused:

Bump. Shader Graphs and Sprite Atlases are both great tools. I wish that Unity devs would make some design changes to make it easier for these things to work together.

The relevant entry in the official Unity Issue Tracker dismisses the problem, saying that everything is working as intended - that this situation is “By Design.” I find this a little disappointing. It should be on a to-do list somewhere, even if only as a distant wishlist: Unity Issue Tracker - Shader Graph UV node uses whole Sprite Atlas UVs instead of individual Sprite when entering the Play Mode

Right now we had to remove all our shaders to improve the performance of our game with atlases. It turns out we need to choose between atlases or shaders, because Unity does not consider that we may want to use them together. Dismissing such an important thing as “it is by design” is like having a bug in your game and discarding it as “it is a feature”. I’m baffled that no one is looking into this at all.
