Hi community,
I'm currently working on a GPU-based fluid simulation following the GPU Gems article "Fast Fluid Dynamics Simulation on the GPU": http://http.developer.nvidia.com/GPUGems/gpugems_ch38.html.
It seems that the RenderTextureFormat.ARGBHalf format is not accurate enough for at least some steps (I notice a bit of aliasing when I output the pressure texture, for example). The simulation works quite well, but I have problems maintaining a consistent viscosity across different render target sizes. Since I'm not completely sure whether it is a mistake somewhere in my code or a precision issue:
Is there a way to force higher-precision render targets (like D3DFMT_A32B32G32R32F; modern graphics cards are perfectly capable of it), at least to prove that the artifacts are caused by too low precision? Or does anybody know a way to pack a float into two halves to work around that limitation?
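To make the second question more concrete, this is roughly the packing I have in mind on the shader side. It's only a sketch, in Cg, and it assumes the value has already been normalized into [0,1) (e.g. by a known pressure range); the helper names and the 1024 split are mine, not from the article:

```
// Hypothetical helpers: pack one float into two half channels and back.
// A half has a 10-bit mantissa, so splitting at 2^-10 keeps the coarse part
// exactly representable in a half and stores the lost residual separately,
// giving roughly 20 bits of effective mantissa over [0,1).

float2 PackFloatToHalf2(float v)
{
    float coarse = floor(v * 1024.0) / 1024.0; // multiple of 2^-10, exact as a half
    float fine   = (v - coarse) * 1024.0;      // residual rescaled into [0,1)
    return float2(coarse, fine);
}

float UnpackHalf2ToFloat(float2 p)
{
    return p.x + p.y / 1024.0;
}
```

The idea would be to write the packed float2 into two channels of the ARGBHalf target and reassemble it in the next pass; I haven't verified how well this survives the Jacobi iterations.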
edit: Already tried RenderTexture.DefaultHDR, which causes “Unknown render texture format” errors.
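edit 2: For completeness, here's the format check I'd expect to need on the script side before creating the targets. A minimal sketch, assuming SystemInfo.SupportsRenderTextureFormat and RenderTextureFormat.ARGBFloat are available in this Unity version; the resolution is a placeholder:

```
using UnityEngine;

public class PressureTarget : MonoBehaviour
{
    RenderTexture pressure;

    void Start()
    {
        // Fall back to ARGBHalf if the driver doesn't expose a full
        // 32-bit-per-channel float target (the D3DFMT_A32B32G32R32F case).
        RenderTextureFormat format =
            SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.ARGBFloat)
                ? RenderTextureFormat.ARGBFloat
                : RenderTextureFormat.ARGBHalf;

        // 256x256 is a placeholder; no depth buffer needed for the pressure field.
        pressure = new RenderTexture(256, 256, 0, format);
        pressure.filterMode = FilterMode.Point; // no filtering between grid cells
        pressure.wrapMode = TextureWrapMode.Clamp;
        pressure.Create();
    }

    void OnDestroy()
    {
        if (pressure != null)
            pressure.Release();
    }
}
```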