Hello!
I recently created a system to generate signed distance field (SDF) textures for sprites in my game, in order to render outlines and other shape-based effects.
I’ve been experimenting with using DDX and DDY to convert the SDF into a vector field or flow map.
That all works well when testing on procedural shapes, but when using a baked SDF texture as input, I’m seeing a pattern of alternating black values that I don’t quite understand.
Check out this image to see what I mean:
Why does the texture input create these alternating pixel gaps rather than being a consistent (albeit pixelated) gradient as I was expecting?
The SDF texture is stored in Alpha8 format, which is single-channel and uncompressed (8 bits per texel).
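For reference, the gradient pass looks roughly like this (a minimal sketch; the names and the remap are illustrative, not my exact shader):

```hlsl
sampler2D _SDFTex; // baked Alpha8 SDF; the distance is read from the alpha channel

struct v2f
{
    float4 pos : SV_POSITION;
    float2 uv  : TEXCOORD0;
};

float4 frag(v2f i) : SV_Target
{
    // Sample the encoded distance.
    float d = tex2D(_SDFTex, i.uv).a;

    // Screen-space partial derivatives of the sampled distance.
    float2 grad = float2(ddx(d), ddy(d));

    // Normalize into a unit direction field (the vector field / flow map).
    float2 dir = normalize(grad);

    // Remap from [-1, 1] to [0, 1] so the direction can be displayed as color.
    return float4(dir * 0.5 + 0.5, 0, 1);
}
```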
Thanks for any insights.
Is texture filtering enabled?
My only real idea here is that either the UV/texture interpolation or the texture itself has lower precision than the procedural version does.
Does removing the normalize (or taking the abs of the normalized value) of the ddx/ddy result fix it? Maybe it's normalizing to negative values?
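Something like this, just to rule the normalize out (a sketch; the gain is arbitrary):

```hlsl
// Debug view: skip normalize entirely and scale the raw derivatives into
// a visible range. If the gaps disappear here, normalize was the culprit.
float d = tex2D(_SDFTex, i.uv).a;
float2 grad = float2(ddx(d), ddy(d));
return float4(abs(grad) * 50.0, 0, 1); // gain of 50 is arbitrary, tweak to taste
```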
Thanks for the reply.
Texture filtering is enabled, but disabling it only makes the issue worse.
I also tried skipping the normalize and just multiplying the raw DDX / DDY results up to a visible range, and the pattern is the same.
However, I think you may actually be right about the precision / filtering hunch, except that the issue is baked into the texture itself. The method I'm using to generate the textures might be rounding neighboring texels to the same value, perhaps because of the intermediate texture formats. That would definitely cause DDX and DDY to return 0, since there's no difference between adjacent samples. Seems promising; I'll look into it. Thanks again.
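If that's the cause, something like this should reproduce it by quantizing a known-good procedural distance to 8 bits in the shader (a sketch, not my actual bake code):

```hlsl
// Simulate Alpha8 storage on a smooth procedural SDF.
float d = length(i.uv - 0.5);             // procedural distance: derivatives are fine
float q = floor(d * 255.0 + 0.5) / 255.0; // snap to the nearest 8-bit step
float2 grad = float2(ddx(q), ddy(q));     // 0 where neighbors round to the same byte
return float4(abs(grad) * 50.0, 0, 1);    // should show the same alternating gaps
```

Once the per-pixel change in distance drops below one 8-bit step (1/255), neighboring pixels collapse to the same stored value and the derivatives go to zero.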