I've been looking through some examples, mainly this one by INedelcu:
From what I understand, it works by dispatching a RayTracingShader that operates on a scene separate from what traditional rasterization renders. To get any result from ray intersections, I need to apply a custom, RT-only shader to every mesh in the scene.
The issue with this is … I can't replace every shader and material in the project with a custom-written shader that has both a rasterized and a ray-traced version.
What can I do to get a result from the rasterized scene? To get an accurate diffuse color from a ray-traced effect, I would need to not only rewrite the Standard shader, but wouldn't it also have to run twice for texturing, lighting and shadowing? Without those calculations I have no idea whether the color I am tracing is fully dark or fully bright.
I guess it makes sense that it needs to re-rasterize everything, since rays can hit points that weren't rasterized from the camera viewport, but it's still unfathomable that I can't use any of the existing shaders when the RT shader needs a rasterized fragment.
The built-in render pipeline and the Standard shader don't implement any ray tracing effects. You can check how a simple effect is implemented in HDRP, for example ray-traced ambient occlusion. You'll find a Shader Pass named VisibilityDXR in Lit.shader and other shaders; HDRenderPipeline.RaytracingAmbientOcclusion.cs uses it to generate the AO texture, which is later combined with the rasterized scene.
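The C# side of such an effect roughly looks like this. This is a hedged sketch, not HDRP's actual code: the asset, kernel and property names (`RaytraceEffect`, `MyRaygenShader`, `_AccelStruct`, `_Output`) are made up for illustration; only `SetShaderPass`, `SetAccelerationStructure`, `SetTexture` and `Dispatch` mirror the Unity `RayTracingShader` API.

```csharp
using UnityEngine;
using UnityEngine.Experimental.Rendering; // RayTracingShader, RayTracingAccelerationStructure

public class RaytraceEffect : MonoBehaviour
{
    public RayTracingShader rtShader;                    // the .raytrace asset
    public RayTracingAccelerationStructure accelStruct;  // built from the scene's renderers
    public RenderTexture output;                         // created with enableRandomWrite = true

    void Render()
    {
        // Select the Shader Pass the GPU will look up hit shaders in.
        // Materials whose shaders lack this pass never run a hit shader.
        rtShader.SetShaderPass("VisibilityDXR");
        rtShader.SetAccelerationStructure("_AccelStruct", accelStruct);
        rtShader.SetTexture("_Output", output);

        // Launch the ray generation shader once per pixel of the output.
        rtShader.Dispatch("MyRaygenShader", output.width, output.height, 1);
    }
}
```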
In general, the entry point of a ray tracing effect is the ray generation shader in the RayTracingShader (.raytrace file). This ray generation shader casts primary rays into the scene, typically from the camera. The rays intersect different geometries in the acceleration structure that have different Materials. When a hit is detected, the GPU executes the code (the hit shaders) in the Shader Pass that you previously specified in C#. If your Shaders don't have this Shader Pass then nothing happens. The ray generation shader and hit shaders communicate data through the ray payload. It's the closest hit or miss shader that writes the final result into the ray payload. When control returns to the ray generation shader (e.g. after a TraceRay call finishes execution) you can write the result from the ray payload into a render texture that can later be combined with the rasterized scene or other lighting computations.
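A minimal .raytrace file following that flow might look like the sketch below. All names are illustrative; `_AccelStruct`, `_Output`, `_InvViewProj` and `_CameraPos` are assumed to be set from C#, and the camera-ray reconstruction is a simplification.

```hlsl
#pragma max_recursion_depth 1

RaytracingAccelerationStructure _AccelStruct;
RWTexture2D<float4> _Output;
float4x4 _InvViewProj;   // inverse view-projection matrix, set from C#
float3   _CameraPos;     // camera position, set from C#

// Carries results from the hit/miss shaders back to ray generation.
struct RayPayload { float4 color; };

[shader("raygeneration")]
void MyRaygenShader()
{
    uint2 id  = DispatchRaysIndex().xy;
    uint2 dim = DispatchRaysDimensions().xy;

    // Build a primary camera ray through this pixel.
    float2 ndc = ((float2)id + 0.5) / (float2)dim * 2.0 - 1.0;
    float4 world = mul(_InvViewProj, float4(ndc, 0, 1));
    world.xyz /= world.w;

    RayDesc ray;
    ray.Origin    = _CameraPos;
    ray.Direction = normalize(world.xyz - _CameraPos);
    ray.TMin = 0.0;
    ray.TMax = 1e20;

    RayPayload payload;
    payload.color = float4(0, 0, 0, 0);

    // Runs the closest hit shader of whatever this ray hits,
    // or the miss shader below if it hits nothing.
    TraceRay(_AccelStruct, RAY_FLAG_NONE, 0xFF, 0, 1, 0, ray, payload);

    // Control is back here: store the payload into the render texture.
    _Output[id] = payload.color;
}

[shader("miss")]
void MissShader(inout RayPayload payload)
{
    payload.color = float4(0, 0, 0, 1); // nothing was hit
}
```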
When working with color, I want a hit to return the color of the surface that was hit. When a hit is detected, the hit shader executes and gives me that color value. But do I have to manually write my own color generation in that hit shader?
For example, if I am tracing a scene with a single cube that has a texture on it, do I need to replicate the UV mapping from the surface shader in the ray tracing hit shader, with tiling and everything? Or is there a way for me to generate that fragment as the surface shader would, as if it were rasterized?
Because if not, that would mean I cannot write any ray-traced effects that work with other people's shaders. All the shaders would need to include my ray tracing Shader Pass for me to get any color out of them.
You can't make a ray tracing effect compatible with arbitrary shaders. There's no way to automatically generate a hit shader out of a surface shader. Hit shaders need to communicate data back to the ray generation shader through the ray payload.
That kind of means I can't create anything that works with previously created Unity content though, right?
Like, the only way is to create every shader twice, once for the fragment pipeline and once for tracing, and to limit other developers to a set of predefined shaders? None of the Asset Store assets that contain custom shaders will work with ray tracing, since they don't include hit shaders, and if I want to use my own assets, I would need to write hit shaders for everything?
Yes, correct. In HDRP a lot of code is shared between hit shaders and fragment shaders, though some HLSL operations are not defined in ray tracing, such as ddx, ddy and automatic mip level calculation (you need to specify the mip level explicitly when sampling a texture in hit shaders), but this is how hit shaders work. You can check Microsoft's DXR specifications here if you are interested in how it works at a low level.
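For the texture question above, this means the hit shader has to reconstruct the interpolated UV itself and sample with an explicit mip level. A hedged sketch of such a closest hit shader, placed inside a named Pass of a regular shader: the vertex-fetch helpers come from Unity's UnityRaytracingMeshUtils.cginc, but the property names and the fixed mip level 0 are illustrative assumptions, and the payload struct must match the one in the .raytrace file.

```hlsl
#pragma raytracing my_hit   // marks this Pass as a ray tracing pass

#include "UnityRaytracingMeshUtils.cginc"

Texture2D _MainTex;
SamplerState sampler_MainTex;
float4 _MainTex_ST;   // tiling/offset, applied manually below

struct RayPayload { float4 color; };
struct AttributeData { float2 barycentrics; };

[shader("closesthit")]
void ClosestHit(inout RayPayload payload, AttributeData attribs)
{
    // Fetch the hit triangle's vertex UVs and interpolate them
    // with the barycentric coordinates of the hit point.
    uint3 tri = UnityRayTracingFetchTriangleIndices(PrimitiveIndex());
    float2 uv0 = UnityRayTracingFetchVertexAttribute2(tri.x, kVertexAttributeTexCoord0);
    float2 uv1 = UnityRayTracingFetchVertexAttribute2(tri.y, kVertexAttributeTexCoord0);
    float2 uv2 = UnityRayTracingFetchVertexAttribute2(tri.z, kVertexAttributeTexCoord0);
    float3 bary = float3(1 - attribs.barycentrics.x - attribs.barycentrics.y,
                         attribs.barycentrics.x,
                         attribs.barycentrics.y);
    float2 uv = uv0 * bary.x + uv1 * bary.y + uv2 * bary.z;
    uv = uv * _MainTex_ST.xy + _MainTex_ST.zw;   // apply tiling/offset by hand

    // No ddx/ddy in hit shaders, so no automatic mip selection:
    // sample an explicit mip level instead.
    payload.color = _MainTex.SampleLevel(sampler_MainTex, uv, 0);
}
```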