How to force fragment shader to interpolate value calculated in vertex shader

I created a low-poly water shader using some noise and time animation and wish to re-use the same noise to affect the colours (ambient and emission).
The hitch: I don’t want those colours computed per-pixel (fragment) but per-vertex, with the values interpolated across each triangle.

In a hand-written shader I’d simply output the colour value from the vertex code and read it in the surface/fragment function, so the value would be interpolated between vertices.
This would also be more efficient (since I’m targeting mobile).
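Roughly what I mean, as a hand-written sketch (URP-style HLSL; `ColorFromNoise` and the hard-coded colours are placeholders standing in for my actual noise chain, not code from my real shader):

```hlsl
// Minimal sketch: noise is evaluated ONCE per vertex, written into an
// interpolator, and the fragment shader just reads the interpolated value.
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

struct Attributes
{
    float4 positionOS : POSITION;
};

struct Varyings
{
    float4 positionHCS : SV_POSITION;
    float3 color       : COLOR0;   // rasterizer interpolates this across the triangle
};

// Placeholder for the real noise -> colour chain
float3 ColorFromNoise(float2 p)
{
    float n = frac(sin(dot(p, float2(12.9898, 78.233))) * 43758.5453); // cheap hash noise
    return lerp(float3(0.0, 0.25, 0.5), float3(0.3, 0.7, 0.9), n);     // two water colours
}

Varyings vert(Attributes IN)
{
    Varyings OUT;
    OUT.positionHCS = TransformObjectToHClip(IN.positionOS.xyz);
    OUT.color = ColorFromNoise(IN.positionOS.xz + _Time.yy); // noise evaluated once per vertex
    return OUT;
}

half4 frag(Varyings IN) : SV_Target
{
    return half4(IN.color, 1); // just read the interpolated per-vertex result
}
```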

In Shader Graph I took a link from the section of the graph that generates the noise into a Swizzle node and some Lerps between colours. Instead of doing what I’d hoped (the same as what I’d have hand-coded), the generated code recalculates the same noise in the fragment shader! (visible as varying colour within my low-poly triangles :frowning: )
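Paraphrasing, the generated fragment function ends up shaped like the snippet below. This is illustrative only, not the literal generated code; `SimpleNoise` and the colour properties stand in for whatever my graph actually produces:

```hlsl
// Illustrative only, NOT the literal Shader Graph output: everything feeding
// the fragment outputs gets emitted into frag(), so the noise runs per pixel.
float4 _DeepColor;     // assumed material properties
float4 _ShallowColor;

half4 frag(Varyings IN) : SV_Target  // (assume Varyings carries a uv here)
{
    float n = SimpleNoise(IN.uv * 10.0 + _Time.y);               // recomputed per fragment!
    float3 col = lerp(_DeepColor.rgb, _ShallowColor.rgb, n.xxx); // the swizzle + lerps
    return half4(col, 1);
}
```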

So, how can we ‘hint’ to the shader to do this sort of thing?

I’m using 2019.3.0b7 with URP & ShaderGraph 7.1.2 (the latest of everything, AFAIK).
Thanks!

AFAIK the only vertex outputs you can write to are position, tangent and normal; all other calculations are done per fragment :frowning:

So, am I right that there is no way to gain this optimisation by moving calculations to the vertex stage?
Only by writing the shader manually?

I think there is enough information to determine what can be done in the vertex stage and what has to be calculated in the fragment stage. For example, this shader could be done entirely in the vertex stage; the fragment shader would only need to return the interpolated vertex colour.

Is there any plan for an algorithm that performs these optimisations? Or maybe we could have some Vertex node that marks all calculations before it as belonging to the vertex stage?

For now, all calculations happen in the fragment stage :frowning:
