I tried this by feeding a cubemap rendered by a camera into a shader, whose output I then fed to another camera with Graphics.Blit.
This worked perfectly, except that the way I got the fisheye effect was using a texture generated in Maya with a sphere of reflected normals, so I could get almost a full 360 degree view. (Basically a mathematically perfect reflective sphere with no Fresnel, reflecting another sphere that encapsulates it and is colored by its normalized coordinates in space.)
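The lookup pass boils down to something like this (simplified sketch, typed from memory, so untested; _NormalTex and _Cube are placeholder names):

Shader "Custom/FisheyeLookup" {
    Properties {
        _NormalTex ("Baked Sphere Normals", 2D) = "white" {}
        _Cube ("Camera Cubemap", CUBE) = "" {}
    }
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _NormalTex;
            samplerCUBE _Cube;

            fixed4 frag (v2f_img i) : SV_Target {
                // Unpack the baked normal from [0,1] back to [-1,1].
                // With an 8-bit texture this direction only has 256 steps
                // per component, which is where the banding comes from.
                float3 dir = tex2D(_NormalTex, i.uv).xyz * 2.0 - 1.0;
                // Use it as a lookup direction into the camera's cubemap.
                return texCUBE(_Cube, normalize(dir));
            }
            ENDCG
        }
    }
}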
It seems the precision of 8-bit normals isn’t enough (duh…), so my question is whether anyone has a suggestion on how to generate the normals within Unity, or a way of importing 16-bit (or even 32-bit) textures, maybe through hacks or something?
For generating, I have played with this formula:
x = sin(theta)cos(phi)
y = sin(theta)sin(phi)
z = cos(theta)
With theta being the V coordinate and phi being the U coordinate, each multiplied by either pi or 2x pi, or with the two switched around (spent 4-5 hours on that yesterday).
It turned out too angular though, and I don’t know if it’s just my poor math or if I need a different approach.
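In shader form, the bake I’ve been attempting amounts to roughly this (sketch from memory; the idea is to blit it once into an ARGBFloat RenderTexture so the normals never touch 8-bit storage):

Shader "Custom/BakeSphereNormals" {
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            float4 frag (v2f_img i) : SV_Target {
                // V as the polar angle, U as the azimuthal angle --
                // one of the pi / 2x pi combinations I tried.
                float theta = i.uv.y * UNITY_PI;
                float phi   = i.uv.x * 2.0 * UNITY_PI;
                float3 n = float3(sin(theta) * cos(phi),
                                  sin(theta) * sin(phi),
                                  cos(theta));
                // Written to a float RenderTexture, so no need to pack
                // into [0,1] and no 8-bit quantization.
                return float4(n, 1.0);
            }
            ENDCG
        }
    }
}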
Too angular? Do you mean that the normals aren’t smoothly distributed? (I can imagine something ‘angular’ happening if you calculate the normals per-vertex, not per-fragment.) Or something else?
Perhaps you should show an example of what goes wrong there too. Because I do think that calculating your normals in the shader probably gives the prettiest results.
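Something along these lines, i.e. skip the baked texture entirely and compute the direction per-fragment (untested sketch; _Cube is a placeholder name):

Shader "Custom/FisheyeDirect" {
    Properties {
        _Cube ("Camera Cubemap", CUBE) = "" {}
    }
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            samplerCUBE _Cube;

            fixed4 frag (v2f_img i) : SV_Target {
                // Same spherical mapping, but evaluated per-fragment at
                // full float precision -- no lookup texture in the middle.
                float theta = i.uv.y * UNITY_PI;
                float phi   = i.uv.x * 2.0 * UNITY_PI;
                float3 dir  = float3(sin(theta) * cos(phi),
                                     sin(theta) * sin(phi),
                                     cos(theta));
                return texCUBE(_Cube, dir);
            }
            ENDCG
        }
    }
}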
On a related note, you might be interested in Aras’s blog post about encoding/decoding normal maps for storage in an 8-bit/channel texture.
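For context, the baseline those encodings improve on is just packing each component into a byte, something like:

// Pack a unit normal into [0,1] for 8-bit storage, and back.
float3 EncodeNormal(float3 n) { return n * 0.5 + 0.5; }
float3 DecodeNormal(float3 t) { return t * 2.0 - 1.0; }
// At 8 bits per channel each component lands on one of 256 steps,
// so nearby directions collapse onto the same stored value.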
In all honesty I don’t really understand the formula. I get that it treats u and v as the azimuthal and polar angles used to find a position on the sphere, but I don’t understand how to tweak it, so it might be what I’m looking for, just set up wrong.
Also, reading through Aras’s blog post now; VERY interesting. Probably hard to use for this case, though.