How to generate normal maps on the fly?

I have an application where I inherit normal maps from several layers, blurring the lower layers in a fairly complicated way. I've got the image editing itself down: I can produce a final result that is correctly stacked and blurred.

If I save the image to disk and then manually import it as a normal map, I get the correct result. But I can't do that for this application; I need a way to do it purely in code.

I can't figure out how to do it in code. When I assign this texture as the normal map on a material, the shading breaks. Even if the texture is entirely the standard flat blue (0.5, 0.5, 1, 1), it discards the mesh's normals and points everything off in some other direction.
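To illustrate, even a minimal repro shows the problem. This is a sketch (names are illustrative, and the `_BumpMap` property name assumes a shader that uses the standard normal map slot):

```csharp
// Sketch: build an all-"flat blue" texture at runtime and assign it
// as the normal map. This is the case that still breaks the shading.
Texture2D tex = new Texture2D(256, 256, TextureFormat.ARGB32, false);
Color flat = new Color(0.5f, 0.5f, 1f, 1f);   // the standard "flat normal" blue
Color[] pixels = new Color[256 * 256];
for (int i = 0; i < pixels.Length; i++) pixels[i] = flat;
tex.SetPixels(pixels);
tex.Apply();

// Assumes this runs in a MonoBehaviour on the object being textured.
Material mat = GetComponent<Renderer>().material;
mat.SetTexture("_BumpMap", tex);      // the usual normal map property
mat.EnableKeyword("_NORMALMAP");      // needed on the Standard shader so it samples the map
```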

I've tried a few variations. For example, I tried importing a white image as a "create from grayscale" normal map, then changing its pixels to my calculated height map at run time. That doesn't work. I tried importing a proper normal map and overwriting its pixels at run time with my calculated normal map, but evidently Unity's imported normal maps are stored in DXT5 format, so you can't write to them with the usual SetPixel/SetPixels calls…
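What I can do is skip the imported asset entirely and build the texture from scratch in an uncompressed format, where SetPixels is allowed (a sketch; `width`, `height`, and `myComputedPixels` stand in for my own blur pipeline's output):

```csharp
// SetPixels only works on uncompressed, readable textures, so create
// the texture in code rather than overwriting an imported (DXT5) one.
Texture2D tex = new Texture2D(width, height, TextureFormat.ARGB32, false);
tex.SetPixels(myComputedPixels);  // would throw on a DXT5-compressed texture
tex.Apply();
```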

So, what's the best way to do this? Do I need to convert the resulting image to DXT5, and if so, how? Or is there another way, some setting I can toggle or function I'm missing?

Ahh, I feel like an idiot. I was specifying the color as (xdelta, ydelta, normal, alpha), but if I specify it as (alpha, xdelta, ydelta, normal) instead, any texture can be set as a normal map and works fine.
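In other words, shift everything one channel to the right when constructing the Color. A sketch of the packing that works for me (the `PackNormal` name and the [-1,1] → [0,1] remap are my own; the remap is the usual normal map convention):

```csharp
// The reordering that worked: instead of Color(x, y, z, a),
// shift every component one channel right: Color(a, x, y, z).
Color PackNormal(Vector3 n, float alpha)
{
    n = n.normalized;
    return new Color(
        alpha,               // alpha goes first...
        n.x * 0.5f + 0.5f,   // ...then X, remapped from [-1,1] to [0,1]
        n.y * 0.5f + 0.5f,   // then Y
        n.z * 0.5f + 0.5f);  // then Z
}
```

Fill the pixel array with this, SetPixels/Apply it, and the material shades correctly.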

However, the result doesn't look anything like the normal maps Unity generates; mine come out pink. I think this is because Unity doesn't properly track the difference between the RGBA32 and ARGB32 formats, and is converting between them with a hard cast.

Either way, problem solved. Sorry to bother you.