(Bug?) WebGL (large scaled) objects are lit differently

Hi,

Perhaps there’s a missing normalization step or multiply somewhere: in a WebGL build vs. the editor, a cube floor stretched to, say, 50–100 units is lit quite differently.

I wondered if it was a limitation of WebGL, but it just seems like a simple oversight.

Hi!
My guess:
https://github.com/Unity-Technologies/ScriptableRenderPipeline/blob/master/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl
For WebGL there’s #define SHADER_HINT_NICE_QUALITY 0, which moves normalization of lighting vectors from per-pixel to per-vertex.
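To illustrate (my own sketch, not Unity’s code, with made-up numbers): when a direction vector is normalized in the vertex stage and then interpolated by the rasterizer across a large polygon, the interpolated vector is generally far from unit length, so any dot product used in lighting gets scaled down. On a 100-unit floor the per-vertex directions at opposite corners differ a lot, which amplifies the shrinkage:

```python
import math

def normalize(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def lerp(a, b, t):
    return tuple((1 - t) * ca + t * cb for ca, cb in zip(a, b))

# Hypothetical setup: camera at the origin, a floor quad stretched to
# +/-50 units on X, sitting at y = -1, z = 25.

# Per-pixel path: compute and normalize the view vector at the midpoint.
view_per_pixel = normalize((0.0, -1.0, 25.0))

# Per-vertex path: normalize at the two far-apart vertices first,
# then let interpolation blend them (what the rasterizer does).
v0 = normalize((-50.0, -1.0, 25.0))
v1 = normalize(( 50.0, -1.0, 25.0))
view_interp = lerp(v0, v1, 0.5)

# The interpolated vector is no longer unit length...
length = math.sqrt(sum(c * c for c in view_interp))
print(length)  # roughly 0.45, i.e. less than half of unit length

# ...so lighting terms computed from it (e.g. dot(N, V)) shrink by
# the same factor, dimming or distorting the shading on large faces.
```

A per-pixel normalize after interpolation restores unit length, which is exactly the step SHADER_HINT_NICE_QUALITY 0 skips.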

Aha! Thank you :slight_smile: Makes perfect sense now. But why do we do that, when GPU performance is generally quite decent on WebGL?

Also, it’s not accessible in Shader Graph, along with a bunch of other things like stencil ops. It would actually be pretty cool to be able to specify a header file a graph can use (like a custom node) to set those values, without having to edit the source each time.

Any plans?

That’s when you run on desktops, right? :slight_smile:

I’ll poke the Shader Graph team about that.

True! Desktop development here… And thanks for poking them :slight_smile: