Raytracing API Out of Experimental in 2023.1

We are happy to announce that as of Unity 2023.1.0a13, the Ray Tracing API is officially out of experimental. This follows recent improvements to the Ray Tracing API, ranging from stability and performance to broader compatibility with the engine’s existing feature set.

Unity’s Ray Tracing API is used by the High Definition Render Pipeline (HDRP) to implement a collection of cutting-edge ray tracing effects such as:

  • Ambient Occlusion
  • Contact Shadows
  • Global Illumination
  • Reflections
  • Shadows
  • Subsurface Scattering
  • Recursive Ray Tracing
  • Path Tracing



“Enemies” - HDRP Ray Tracing in Unity 2022.2

To experiment with HDRP’s ray tracing effects, you can now use the HDRP Sample Scene, which has been updated with new ray tracing quality settings. The new settings assets were introduced for the 2022.2 beta in “3D Sample Scene (HDRP) v14.1.0”, and for the 2023.1 alpha in “3D Sample Scene (HDRP) v15.1.0”:



“HDRP Sample Scene” - Raytraced GI, Reflections, Shadows and AO enabled

As of 2022.2, we provide full desktop and console support for RT-capable hardware, including Xbox Series. In the same release, we also made the Terrain system’s heightmap-based terrain compatible with ray tracing.

Unity 2023.1 further advances Unity’s ray tracing support by fully integrating with VFX Graph, allowing you to author complex particle effects that are compatible with HDRP’s ray tracing effects.

This release also expands the Ray Tracing API to simplify the configuration of meshes added to the Ray Tracing Acceleration Structure (RTAS). This is achieved by introducing a new overload of RayTracingAccelerationStructure.AddInstance:

RayTracingAccelerationStructure.AddInstance(ref Rendering.RayTracingMeshInstanceConfig config, Matrix4x4 matrix, Nullable<Matrix4x4> prevMatrix = null, uint id = 4294967295);

The new API allows you to pass the new “RayTracingMeshInstanceConfig” struct to conveniently configure the mesh and material parameters of instances to be included in the RTAS. For example, using this new API it is now straightforward to process or animate the geometry of ray-traced meshes by retrieving the mesh vertex buffer using “Mesh.GetVertexBuffer” and binding it to a compute shader using “ComputeShader.SetBuffer”.
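As an illustration, here is a minimal sketch of that workflow. The component below, its deformation compute shader, and the “Deform”/“_VertexBuffer” names are placeholder assumptions; HDRP normally manages the RTAS itself, so a manual setup like this is closest to custom-pipeline use:

using UnityEngine;
using UnityEngine.Rendering;

// Minimal sketch: deform a ray-traced mesh on the GPU every frame.
// "deformShader", its "Deform" kernel and the "_VertexBuffer"/"_Time"
// properties are hypothetical placeholders.
public class RayTracedDeformer : MonoBehaviour
{
    public Mesh mesh;
    public Material material;
    public ComputeShader deformShader;

    RayTracingAccelerationStructure rtas;

    void Start()
    {
        // Expose the vertex buffer as a raw (ByteAddressBuffer) target
        // so a compute shader can write to it.
        mesh.vertexBufferTarget |= GraphicsBuffer.Target.Raw;

        rtas = new RayTracingAccelerationStructure();
        var config = new RayTracingMeshInstanceConfig(mesh, 0, material);
        config.dynamicGeometry = true; // rebuild this mesh's geometry on RTAS builds
        rtas.AddInstance(ref config, transform.localToWorldMatrix);
    }

    void Update()
    {
        // Compute pre-pass: animate the vertices in place, then rebuild
        // the acceleration structure so rays see the deformed geometry.
        using (GraphicsBuffer vb = mesh.GetVertexBuffer(0))
        {
            int kernel = deformShader.FindKernel("Deform");
            deformShader.SetBuffer(kernel, "_VertexBuffer", vb);
            deformShader.SetFloat("_Time", Time.time);
            deformShader.Dispatch(kernel, Mathf.CeilToInt(mesh.vertexCount / 64f), 1, 1);
        }
        rtas.Build();
    }

    void OnDestroy() => rtas?.Release();
}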

Update:

RayTracingAccelerationStructure.AddInstances is also introduced in 2023.1.0a18, and provides full instancing support in the Ray Tracing API, allowing you to add large numbers of mesh instances to the RTAS and to use the instance ID in hit shaders to access per-instance data.
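As a rough sketch (the grass mesh, material, and grid layout here are placeholders, and “rtas” is an already-created acceleration structure), adding many copies of one mesh in a single call looks like this:

// Sketch: add 10,000 copies of one mesh to the RTAS in a single call.
// grassMesh/grassMaterial and the grid layout are placeholder assumptions.
var config = new RayTracingMeshInstanceConfig(grassMesh, 0, grassMaterial);

var matrices = new Matrix4x4[10000];
for (int i = 0; i < matrices.Length; i++)
    matrices[i] = Matrix4x4.Translate(new Vector3(i % 100, 0f, i / 100));

// All instances share one mesh and material; hit shaders can use
// InstanceID() to fetch per-instance data from a custom buffer.
rtas.AddInstances(config, matrices);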

Using instancing, it is now possible to ray trace large, dense scenes containing high-frequency repeating meshes and detail more efficiently. For more information on ray tracing instancing, including performance testing figures and a reference sample project, please check out the following slides.

With the Ray Tracing API out of experimental in 2023.1, we can’t wait to see the amazing results you achieve using HDRP’s comprehensive ray tracing effects! Your feedback is crucial, so please let us know if you encounter any issues, and share any features and changes you would like to see. You can contact us directly in this thread, or by submitting a feature/change request using the official Roadmap Request Form.

12 Likes

Great to hear that!

Tree and foliage wind/touch bending is most often achieved using vertex animation in shaders. Do you think it will be possible to feed the RTAS with vertex animation from shaders in the future? Especially when it comes to instancing, where one source mesh is instanced multiple times.

We don’t think that is achievable. Each tree instance must write its animated result into GPU memory and build its acceleration structure on the GPU every frame, assuming the wind animation doesn’t look the same for all trees. This means that instead of keeping only one Mesh for a tree prototype in GPU memory, we must keep one Mesh per tree instance, plus its associated acceleration structure, on top of the GPU cost of writing these animated meshes to GPU memory and building their acceleration structures.

The ray tracing pipeline is different from the rasterization pipeline, and they use different types of shaders (vertex and pixel shaders versus ray generation and hit shaders). If a vertex shader animates vertices, that animation is not automatically captured in GPU memory unless you write it out manually from the vertex shader or a compute shader. The SkinnedMeshRenderer component employs this type of pipeline, for example, when GPU skinning is enabled.

3 Likes

Thank you for the detailed response.

So, if this is a general ray tracing limitation, and we can’t simply and cheaply feed the RTAS with the results of a vertex shader, I’m wondering what the future is (if there is any) for making many instances of animated foliage and ray tracing work together?

To achieve this, a compute pre-pass that processes the deformed geometry would currently be needed. With the recent API changes mentioned in the post, you can now conveniently deform the vertices of a mesh added to the RTAS via RTAS.AddInstance(Mesh), for example using a compute shader.

Before enabling wind animation for the existing Trees/Detail painting system when using ray tracing, we are still exploring sensible approaches to avoid excessive impact on performance and stability (e.g., due to excessively high memory usage), so RT support is currently limited to static trees/foliage.

1 Like

Honestly, we’re artists, not programmers.
We want a simple solution to replace traditional offline rendering with Unity offline rendering.

All this talk of RTAS.AddInstances and this code and that code makes our heads spin. Why can’t we as artists get something that just works?

2 Likes

So to get/use this in 2023.1.0a14, all I have to do is start a new project using the HDRP demo scene, then convert the scene to Raytracing using the HDRP wizard?

When the 2023.1 tech release launches, can we please get a new demo scene that already starts up in DX12 with ray tracing/path tracing enabled? It makes sense to me…

Better yet, can we start getting the Unity demos so we can see how things were created and play with them ourselves? For example, that atrium in the Enemies demo looks ace!

1 Like

Please add support for ComputeShader.SetBufferArray. As mentioned, we have Mesh.GetVertexBuffer and can use that directly in a compute shader, but a mesh may contain multiple vertex buffers. Multiple vertex streams are already supported internally by Unity’s ray tracing shaders:

ByteAddressBuffer unity_MeshVertexBuffers_RT[kMaxVertexStreams];
ByteAddressBuffer unity_MeshIndexBuffer_RT;

If we had the same capability in compute shaders, we could write code that works for both compute and ray tracing, which would simplify things.
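For now we work around it by binding each stream’s buffer under its own name, something like this sketch (the kernel and buffer names are our own convention):

// Workaround sketch until SetBufferArray exists: bind each vertex stream
// under its own name ("_MeshVertexBuffer{N}", declared as ByteAddressBuffer
// in the compute shader; the "ProcessMesh" kernel is hypothetical).
int kernel = shader.FindKernel("ProcessMesh");
var buffers = new GraphicsBuffer[mesh.vertexBufferCount];
for (int stream = 0; stream < mesh.vertexBufferCount; stream++)
{
    buffers[stream] = mesh.GetVertexBuffer(stream);
    shader.SetBuffer(kernel, "_MeshVertexBuffer" + stream, buffers[stream]);
}
shader.Dispatch(kernel, Mathf.CeilToInt(mesh.vertexCount / 64f), 1, 1);
foreach (var b in buffers) b.Dispose();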

2 Likes

In order to get started with HDRP Raytracing, you can now:

  • Create a project using HDRP sample scene
  • Under “Project Settings”/“Quality” set RenderPipelineAsset to either ‘HDRPRaytracingQuality.asset’ or ‘HDRPRaytracingRTGIQuality.asset’
  • Control the desired RT effect in the inspector via Volume Override settings. In the scene hierarchy, look for “Lighting”/“Volumes”/“Volume Global”/“Volume Ray Tracing”.
  • You can also add your own volume overrides. For example, you can add a Screenspace Reflections override and set the Tracing method to “Raytracing”.

Thank you for the feedback! I agree that a C# API for ComputeShader/Material.SetBufferArray would be useful.
I believe this is currently unsupported because it would be limited to more modern platforms. I have logged a request with our team, so I will follow up on this and keep you updated.

I see no such settings or assets.
Are you sure it’s in 2023.1.0a14?


The new Raytracing Quality Assets were added for the latest 2022.2 beta in “HDRP Sample Scene v14.1.0”, and for the 2023.1 alpha in “HDRP Sample Scene v15.1.0”.

It seems like there was a short delay with the latter (template update for 2023.1a), while these are already available for 2022.2b.

Edit: HDRP Sample Scene v15.1.0 should now be available!

1 Like

I wish there were a chance to run the ray tracing API/shaders on Metal/M1 Macs. It’s nice to have been able to use Unity with Metal from day one, but not getting Ray Tracing API support (despite Metal having its own) is sad and affects our workflow between team members who use different devices (one having Mac/Metal and the other Win/DX12).

At least having ray tracing shader support, even if the HDRP ray tracing pipeline doesn’t work, would be a good start.

Any chance this will work with DOTS, or is it a gameobjects feature only?

You might need to manage the instances manually instead of letting CullInstances() do it, but the new API linked in the OP doesn’t require a Renderer, just a Mesh and a Material. You could ping hybrid renderer team to see if they want to integrate it, or just do it yourself.

1 Like

DOTS does not officially support Raytracing at the moment, but the team will consider supporting this in the future.

As @burningmime mentioned above, you cannot currently use RayTracingAccelerationStructure.CullInstances() to add such instances to the Ray Tracing Acceleration Structure efficiently. It may be possible to get a custom implementation working via RayTracingAccelerationStructure.AddInstance now, though we have not fully verified this and would not necessarily recommend it due to performance concerns.

Hi everyone!!!
I’m playing a bit with Unity 2023 HDRP and raytracing and I’m very confused.

I’m trying to make a room dark because there’s no sunlight coming in…
But I can’t manage it. The ambient lighting from the PBS (Physically Based Sky) always remains, as you can see in the following video:

I have tweaked all the parameters of “SSGI” but I cannot eliminate this annoying problem. I also have a reflection probe updating in real time… but it doesn’t do anything either.

I’m confused. I have tried to follow this guide: HDRP DXR/Realtime Ray Tracing Lighting Troubleshooting Guide
and other Unity videos like this
https://www.youtube.com/watch?v=F3hIculYFwM
.
But I don’t know how to solve these problems.

Is Unity not able to handle this type of scenario without having to change all kinds of settings in the PBS, the intensity of the directional light, or patching things up with the “Indirect Lighting Controller” or “Exposure”?

How can I handle this type of situation? Any info, help, etc??

Cheers

Hey, you should probably have made your own post, since this looks like a specific setup issue, but anyway.
Could you make the video “not private” and/or provide a repro project so that we can have a look?

Recently we introduced an ambient light dimmer in Ray Traced GI, Ray Traced Reflections, and Recursive Rendering to address exactly this issue (basically, ambient lighting falling back to the ambient probe after the last bounce). Did you try that? (It’s an advanced parameter, so you will need to click the ellipsis, the three vertical dots in the top-right corner of the override, and click “Show Additional Properties”.)

Agreed, this is a Unity Editor user experience/tooling issue, not a core technical feature issue. It’s already working, just without an artist-friendly or rapid-prototyping UI/UX on top (very valuable to developers, not just artists). So it’s less a question for @INedelcu and more a question for @LaurentGibert, or at least he’ll know which team is, or will be, managing this.

I suspect this hasn’t gotten a lot of attention because many artists don’t know exactly what to ask for, other than for their existing workflows to keep working. That is reasonable to ask, and I don’t blame the artists for this problem we engineers and programmers made, but the direct fix in this case (making vertex offset work for ray tracing) would limit accessibility and flexibility for future features. So, as a developer/programmer who works with artists, I’ll tell you what would be the best of both worlds:

A “compute” (shader) graph, and perhaps a domain-specific “mesh” graph variant, that may take advantage of mesh shader GPU hardware when available and fall back to pure GPU compute when it is not, or to DOTS-style CPU compute as another fallback or as desired.

In practice this would work similarly in many ways to Blender’s or Houdini’s geometry node graph tools or “modifiers”.

This would separate the mesh abstraction from the material and lighting abstractions, back to where they belong (in most cases; I know many exceptions exist).

You have a mesh object; you deform it, manipulate it, or generate it from scratch, whatever you want, using “mesh” compute nodes/functions in the compute graph. This can either update dynamically every frame at runtime or be baked to static.

And then you can apply whatever material you want to that mesh to make the surface appear or be lit however you desire.

We would no longer have this collapsed abstraction of “gotta use a vertex shader,” where the only way to performantly deform a mesh is to hack it into the vertex shader, which can do very little (even less in the ray tracing pipeline). And we would eventually be heading toward a future where vertex shaders are deprecated altogether in favor of mesh/compute shaders.

Is stream-out with a null GS an option? This would allow you to run a user-defined legacy vertex shader and get the modified positions to put into the RTAS. I don’t know how performant it would be with 10,000 plants waving in the wind with alpha-tested materials, but it would be a seamless workflow instead of requiring compute shaders for mesh deformation.

Does this also mean VR is supported with ray tracing?