Early access to the new WebGPU backend

Greetings from the Unity Graphics team.

We’re excited to announce that starting with Unity 2023.3, the Unity Web platform now provides early access support for the new WebGPU graphics API.


The introduction of support for the new WebGPU API marks a significant milestone for web-based graphics acceleration, and paves the way for unprecedented leaps in graphics rendering fidelity for Unity web applications.

In this post, we share some useful information on how to enable WebGPU, along with various guidelines and limitations that should be taken into account.

WebGPU: The future of graphics and compute on the Web

Widely adopted and supported by all major browsers, the WebGL 2.0 API is the industry standard for web-based real-time graphics applications. However, as WebGL shares much of the feature set of OpenGL ES 3.0, it isn’t able to leverage all of the capabilities present in modern GPUs.

WebGL 2.0’s most notable limitation is its lack of support for compute shaders, which are crucial for the mass parallelization and acceleration of general computation on the GPU. This shortcoming manifests as a lack of support for newer (compute-based) Unity features such as GPU Skinning and VFX Graph, as well as the more advanced GPU-driven rendering techniques currently in development. In turn, this significantly limits the scope and fidelity of web-based Unity applications.

Furthermore, WebGL is based on the stateful “bind-and-draw” design of the OpenGL API. Modern graphics APIs all provide a lower-level interface that exposes greater control over the graphics context and reveals new opportunities for accelerating CPU and GPU performance.

To address the limitations of WebGL, and enable the future of web-based graphics acceleration, the World Wide Web Consortium (W3C), in collaboration with industry leaders such as Google, Apple, Mozilla, Microsoft, and Unity, recently published the specification of the new WebGPU standard.

WebGPU was designed with the goal of harnessing and exposing modern GPU capabilities to the web. It achieves this by providing a modern graphics acceleration interface that is implemented internally via native GPU APIs such as Direct3D 12, Vulkan, and Metal. The native implementation in use depends on the browser’s platform and the available graphics drivers.

The new capabilities exposed by the WebGPU backend will unlock support for new and exciting rendering features, and enable a level of graphics fidelity previously unseen on the web. As with other modern graphics APIs, lower-level control over the rendering setup and GPU execution unlocks new optimization opportunities, which could lead to reduced CPU and GPU overhead and improved throughput and latency.

A good example is Compute Skinning support (previously unavailable on web platforms), which can significantly improve the rendering performance of complex and detailed skinned mesh renderers by offloading vertex transformations to the GPU. In the following test scenario, the WebGPU backend was observed to run significantly faster than the WebGL backend when rendering a large number of animated character models:


Compute Skinning in WebGPU (Click here to open a Web Player)

Note: WebGPU support is currently limited to compatible desktop browsers. See the “Platform Support” section below for more information.


WebGPU vs WebGL - Skinned Mesh Rendering Comparison

Another example is the introduction of Indirect Drawing support, which allows geometry and draw-parameter data to be generated procedurally, directly within compute shaders. This can be used to massively parallelize the transformation, culling, and rendering of complex geometry, and is often employed when implementing advanced particle systems such as Unity’s own VFX Graph. The following demo uses VFX Graph to efficiently simulate and render over a million particles:


VFX Graph on WebGPU (Click here to open a Web Player) - Development in progress
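
As an aside, here is a minimal sketch of the indirect-drawing pattern described above. This is illustrative only: the buffer layout follows the standard four-uint indirect-args convention, and the compute shader that would fill the args buffer is assumed, not shown.

    using UnityEngine;

    public class IndirectDrawExample : MonoBehaviour
    {
        public Material material;       // shader that fetches the procedurally generated geometry
        public ComputeShader generator; // hypothetical kernel that writes the draw arguments
        GraphicsBuffer argsBuffer;

        void Start()
        {
            // Four uints: vertex count, instance count, start vertex, start instance.
            argsBuffer = new GraphicsBuffer(GraphicsBuffer.Target.IndirectArguments, 1, 4 * sizeof(uint));
            generator.SetBuffer(0, "_Args", argsBuffer);
            generator.Dispatch(0, 1, 1, 1); // the GPU decides how much geometry to draw
        }

        void Update()
        {
            // The draw parameters are read from argsBuffer on the GPU;
            // the CPU never needs to know how much was generated.
            Graphics.DrawProceduralIndirect(material, new Bounds(Vector3.zero, Vector3.one * 100f),
                MeshTopology.Triangles, argsBuffer);
        }

        void OnDestroy() => argsBuffer.Release();
    }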


Enabling the new WebGPU graphics backend
Update: 19.12.2024

As of Unity 6000.1, WebGPU support is available in “experimental” state, and can be accessed through the Player Settings. For more information, please refer to the latest post: WebGPU Support in Unity 6.1

In versions 2023.3 to 6000.0, WebGPU is available in limited early access, and requires additional configuration of the project settings:

  1. Navigate to an existing Unity project and open the ProjectSettings.asset file (located at <project folder>/ProjectSettings/ProjectSettings.asset) in a text editor.
  2. Search for a setting titled webGLEnableWebGPU.
  3. If found, change the line to webGLEnableWebGPU: 1.
  4. If not found, add the line webGLEnableWebGPU: 1. (This line can be added anywhere in the file, preferably next to the other web-related settings; see the fragment below.)
  5. Reopen the project in the Unity Editor, then add “WebGPU” to the Graphics API list in the Player Settings and set it as the first priority, above WebGL.
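
For reference, after the edit the relevant fragment of ProjectSettings.asset (a YAML text file when Asset Serialization is set to Force Text) should contain a line like this, with the surrounding settings elided here:

    # <project folder>/ProjectSettings/ProjectSettings.asset
    webGLEnableWebGPU: 1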

Platform support

WebGPU’s specification has recently reached version 1.0, and active development is still ongoing. WebGPU is enabled by default today in Google Chrome on ChromeOS, macOS, and Windows, and in Chrome Canary on Android.

Browser support is still in development, and we only recommend using WebGPU with Chromium version 119 or later, on supported desktop platforms; driver-related issues were identified on older browser releases.

Additional browsers (Firefox, Safari) may enable WebGPU via a developer flag. Please refer to the relevant browser’s documentation in order to determine support for the new graphics API.


Guidelines and limitations

Unity’s WebGPU graphics backend is still in development, and we do not recommend using it for production use cases. Support is currently limited to the Universal Render Pipeline:


URP BoatAttack in WebGPU (Click here to open a Web Player)

When experimenting with the new backend, please bear in mind that the following key packages and features are not yet supported:

  • VFX Graph (In development)
  • Forward+ Rendering Path
  • Graphics Jobs
  • Dynamic Resolution
  • Entities and Entities Graphics
  • High Definition Render Pipeline (and its various features)
  • Async Compute API

Additional features (not listed above) may be unsupported or unstable. Moving forward, we are working to stabilize and improve the performance of the new graphics backend. We eventually aim to provide compatibility for all features that could be supported by the new (and evolving) WebGPU standard.

You can follow our progress via the public roadmap. If you cannot find the feature you are looking for, feel free to submit a feature request, or contact the team directly in the graphics forum.

Please give the new WebGPU backend a try and let us know what you think!


I don’t seem to get the option

  • Add “WebGPU” to the Graphics API list, and set it as the first priority above WebGL.

I followed all the previous steps, and indeed it changes WebGL to Web, but I don’t have this option. Perhaps rather than “latest” 2023.3 it needs a more specific number? I have a13, which is the latest available to me.

@bugfinders After the project upgrades to 2023.3, edit the text file ProjectSettings/ProjectSettings.asset. Search for webGLEnableWebGPU. If it’s there, change the value to 1. If it’s not there, add the line webGLEnableWebGPU: 1 in the section with the other WebGL* settings. With that setting set to 1, WebGPU should be available to you in the Graphics APIs list. It’s a hassle, but a temporary hurdle while the WebGPU driver is in active development. The recommendation to use the latest builds of 2023.3 is because we’re frequently pushing fixes and improvements.


As I said, I did that… and NO, it does not give me that option, hence posting.

@bugfinders, sorry, I didn’t mean to imply you didn’t - just trying to clarify what should be needed. I’m guessing things got into a weird state during the upgrade.

To verify, can you try this with a new project? a13 is the latest public alpha and it will have WebGPU in it; a14 will also be out soon. You can DM me to iterate on figuring out how to get you going.

Just tried it on 2023.3.0a13, following your instructions, and I was able to enable WebGPU without a hitch. I even changed an existing project to use WebGPU (rebuilt addressables + shaders) and it was plug & play.

The bolded comment is not required, as the settings file will be automatically re-created and the line will be placed next to the other web settings… so it’s moot.

Overall, good job team!


Thanks for the offer, coming your way, cos this hates me… (it’s OK, I am used to it, but I’m keen to see it work)

Thanks for verifying, @KamilCSPS! And you’re right about the moot point for the placement of the setting, I’m just paranoid :wink:

Initial feedback - first, ProjectSettings/ProjectSettings.asset looked like this…

(Screenshot: the file contents appeared as unreadable binary data)

…I realized it was probably Project Settings->Asset Serialization->Mode set to Force Binary, so I changed that to Mixed, but that didn’t fix it. Then I switched to Force Text, let it reserialize and reimport everything, and then I was able to edit the settings. I changed the property to 1, reopened the editor, and the WebGPU option was then available. I switched back to Force Binary and waited for everything to reserialize and reimport. I reopened the editor, added WebGPU, removed WebGL 2, and did a build (this being an 11.8 GB project - not including the size of the Library - with 5 games to test). I opened the build in Chrome and… remarkably, only a few minor issues.

Games are all playable, but:

  1. When going full screen, some textures that are smooth in WebGL are more pixelated in WebGPU.
  2. Some textures that are read from and written to others using NativeArray end up grey instead of textured.
  3. renderScale doesn’t seem to work properly when adjusted at runtime (pixels shimmer a bit while using an in-game slider between values of 0.5 and 2.0, but the resolution doesn’t look like it’s changing).
  4. Loading times are longer, and there’s some occasional hitching when new content comes into view that doesn’t happen in WebGL (it appears related to shader compilation, based on previous experience with similar problems), but performance overall seems similar.

No exceptions in the console, but a few interesting warnings, which might be connected to the above issues (this was just a quick initial test to see how things would go - I’m not planning to distribute anything supporting WebGPU for another couple of years, or to look into the issues further for now, mostly because I’m not doing anything that would benefit from WebGPU features at the moment).
(Screenshot: console warnings)

Anyway, amazing that it’s functional and all 5 games I test my primary framework with are actually playable! Great to see this so far along! :slight_smile:


Thanks for the feedback, @adamgolden!

  1. Getting all of the texture filtering settings right is still something we’re working through. Also, some texture formats that are filterable in WebGL aren’t yet filterable in WebGPU (per the spec). These things will improve.
  2. I’m not sure what you mean by read from, but if it’s GPU readback to CPU memory, like Texture2D.ReadPixels, then the WebGPU spec does not support synchronous readback and you have to use AsyncGPUReadback (see the sketch after this list). I’m still trying to figure out the best way to make the regular readback functions produce an error or warning rather than silently doing nothing.
  3. I’ll make a note to look into renderScale.
  4. Google is working on optimizing the WebGPU shader compiler inside of Chrome. These runtime hitches will magically get better over time as Chrome updates. We’ve been working very closely with Google on optimizing pipeline creation for content the size of Unity assets; Unity shaders are quite large compared to hand-written WebGPU shaders.
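
For reference, a minimal sketch of the AsyncGPUReadback pattern mentioned in point 2 - the texture, format, and callback names here are illustrative, not from the posts above:

    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.Rendering;

    public class ReadbackExample : MonoBehaviour
    {
        public Texture2D source; // any GPU-readable texture, assumed RGBA32 here

        void Start()
        {
            // Queue an asynchronous GPU-to-CPU copy; the callback fires a few
            // frames later, once the data is available. There is no synchronous
            // equivalent on WebGPU.
            AsyncGPUReadback.Request(source, 0, OnReadback);
        }

        void OnReadback(AsyncGPUReadbackRequest request)
        {
            if (request.hasError)
            {
                Debug.LogWarning("GPU readback failed");
                return;
            }
            // The NativeArray is only valid for the duration of this callback.
            NativeArray<Color32> pixels = request.GetData<Color32>();
            Debug.Log($"Read back {pixels.Length} pixels");
        }
    }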

For example, I allocate rectangles within an empty “texA”, later populating those rects from other rectangles within “texB”, “texC”, etc., using NativeArrays like this:

NativeArray<Color32> dataA = texA.GetRawTextureData<Color32>();
NativeArray<Color32> dataB = texB.GetRawTextureData<Color32>();

…then iterating through the pixels within a given rect and setting the corresponding pixel in the other texture’s rect. In WebGL it works perfectly, but for some reason in WebGPU it’s either not writing the pixels to the rect as expected, or [more likely] not reading the pixels from the rect in the other texture. I say the issue is more likely with reading because in one of my games I’m only writing pixels through the same process and that does work in WebGPU. So, somewhere there is an issue; I could dig and maybe figure out what, but I will just wait a while and see if it’s better in the future. Again, not urgent / just wanted to take a quick run at this and see how everything went :slight_smile:
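
For context, a hypothetical reconstruction of the copy loop being described - the helper name, rect handling, and RGBA32 assumption are mine, not the poster’s:

    using Unity.Collections;
    using UnityEngine;

    static class TextureRectCopy
    {
        // Copies a same-sized rect of pixels from src to dst via the raw
        // CPU-side buffers. Assumes both textures use the RGBA32 format.
        public static void CopyRect(Texture2D src, RectInt srcRect,
                                    Texture2D dst, RectInt dstRect)
        {
            NativeArray<Color32> srcData = src.GetRawTextureData<Color32>();
            NativeArray<Color32> dstData = dst.GetRawTextureData<Color32>();
            for (int y = 0; y < srcRect.height; y++)
            {
                for (int x = 0; x < srcRect.width; x++)
                {
                    int srcIndex = (srcRect.y + y) * src.width + (srcRect.x + x);
                    int dstIndex = (dstRect.y + y) * dst.width + (dstRect.x + x);
                    dstData[dstIndex] = srcData[srcIndex];
                }
            }
            dst.Apply(); // upload the modified pixels back to the GPU
        }
    }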

The texture filtering being at different levels of completion depending on format explains why the appearance of some textures was different between WebGPU/WebGL when in full screen. Also, thanks for the other info and for looking into renderScale… and I say “that’s all the testing I’m going to do!” …but actually I will try again on occasion to see how it’s going. Long term this is all very important stuff, but at the moment / in the early days of WebGPU it’s still mostly a curiosity (to me, anyway).

I just created a minimal repro for that, but it’s actually working fine - not sure why it doesn’t work in my own projects (it must be some combination of settings, or the texture filtering thing threw me off). So don’t bother, thanks anyway though!


I would like to ask: will the VFX Graph on the web only work with WebGPU?

Or would it also work in WebGL mode?

@adamgolden If you can DM me a repro with your use of GetRawTextureData not working, I can look into it. I just did a quick test, copied the example from the GetRawTextureData docs, and it worked for me.

WebGPU is still in the early days, but it will have a very important role in the future of graphics and computation on the web, so pushing its buttons and finding its quirks is useful.


@Thaina VFX Graph will require WebGPU because it requires compute shaders. WebGL doesn’t support compute shaders, so VFX Graph can’t run on it.
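
As an aside, a minimal runtime guard for compute-dependent features - this check is a generic Unity pattern, not something prescribed in this thread:

    using UnityEngine;

    public class ComputeCheck : MonoBehaviour
    {
        void Start()
        {
            // WebGL reports false here; WebGPU-capable targets report true.
            if (!SystemInfo.supportsComputeShaders)
                Debug.LogWarning("Compute shaders unavailable; disable VFX Graph content.");
        }
    }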


Actually working on a repro for that right now :slight_smile: The images I read are generated from base64-encoded strings, so the difference between WebGL and WebGPU may be somewhere else, leading up to the point where it’s actually reading the pixels. I’m in the process of taking the minimum code into a fresh project and will post again when I work it out (or if I can’t, and the repro works fine through every step).

So far I’m at the point in the repro where I am able to decode the image and assign it to a RawImage in WebGPU without issue. The next step is reading and then writing to a new texture - if that works too, I think maybe what I will need to do is delete the Library folder of my project, reimport and rebuild, and hope it just works lol… wouldn’t be the first time I was tracking down bugs that turned out to just be something wrong in Library after changing an engine version or some other significant thing :stuck_out_tongue:

The repro was fine, but I found the cause while putting it together - as you originally mentioned, it’s a ReadPixels call at the root of this issue! I have a #if UNITY_WEBGL && !UNITY_EDITOR block used for WebGL only - I wasn’t doing things the same way for other platforms, which I hadn’t noticed earlier. I can do the same thing without using ReadPixels though (just NativeArray, like the other platforms). I’ll try unifying this part of my framework; hopefully no surprises (because I must have had a reason for that platform-specific approach originally).
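
To illustrate the kind of split being described - a hypothetical reconstruction, not the poster’s actual code:

    using Unity.Collections;
    using UnityEngine;

    public static class PixelGrab
    {
        // Hypothetical reconstruction of the platform split described above.
        public static void Grab(Texture2D tex)
        {
    #if UNITY_WEBGL && !UNITY_EDITOR
            // WebGL-only path: synchronously copy from the active render target.
            // On WebGPU this silently does nothing (no synchronous readback).
            tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
            tex.Apply();
    #else
            // Other platforms: write pixels through the raw CPU-side buffer.
            NativeArray<Color32> data = tex.GetRawTextureData<Color32>();
            // ... fill data, then tex.Apply() ...
    #endif
        }
    }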


Yes!! Okay, so everything is working now in WebGPU - at least they’re not grey rectangles anymore. The reason I had switched to ReadPixels from only doing a NativeArray copy was a texture format issue, where I could avoid what looks like an ARGB32->RGBA32 issue (a texture channel mismatch). That’s unrelated and something I will just have to solve properly.

Also, it turns out [embarrassingly, given my previous post] I had also made the same change in the non-WebGL part, and my assumption that it had something to do with reading then writing the texture was just me not remembering what I had done :frowning:

So, thank you for all the responses and help @brendanduncan_u3d - this is a huge achievement for you guys, I can’t believe it’s actually all working, this is really impressive to see. Congrats and thanks again!

Edit: Also solved the source image format thing: rgba32.SetPixels(argb32.GetPixels()) :smile:
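
Spelled out, the one-liner above amounts to something like this (the variable names are assumed from the post):

    using UnityEngine;

    static class TextureFormatFix
    {
        // Hypothetical expansion of the one-liner; argb32 is the source texture.
        public static Texture2D ConvertToRgba32(Texture2D argb32)
        {
            var rgba32 = new Texture2D(argb32.width, argb32.height, TextureFormat.RGBA32, false);
            rgba32.SetPixels(argb32.GetPixels()); // GetPixels/SetPixels resolve the channel order
            rgba32.Apply();
            return rgba32;
        }
    }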


Thanks @brendanduncan_u3d, who helped me with a couple of sticking points…

So on my dungeons game I got a mixed result - it loaded, but while it said the fps was about the same as WebGL, the movements etc. were like it was really unhappy, with a really low-fps feel… I then wandered over to see my skele in a cage and the screen went black, and well, that was the end of that, it stayed black… I will have to check, but as far as I know it only uses simple shaders, so not sure what’s going on… ah - had to turn some batching on, OK, now it randomly goes black but you can move and often get it back… so something odd, but… I wouldn’t say the fps is higher… but then I’m using the built-in renderer due to some issues with URP light leaking…

OK, so when it goes black I get 80 bajillion messages of:
Validation: [Invalid BindGroup] is invalid.

  • While encoding [ComputePassEncoder].SetBindGroup(0, [Invalid BindGroup], 0, …).

For my next game (which is a gems game) - it won’t compile; the shader is a Shader Graph one, very basic.

Shader error in ‘Shader Graphs/graduating’: error: struct member hlslcc_mtx4x4unity_ObjectToWorldArray not found (on webgpu)
Compiling Subshader: 0, Pass: MotionVectors, Vertex program with INSTANCING_ON
Platform defines: SHADER_API_DESKTOP UNITY_ENABLE_DETAIL_NORMALMAP UNITY_ENABLE_REFLECTION_BUFFERS UNITY_LIGHTMAP_FULL_HDR UNITY_LIGHT_PROBE_PROXY_VOLUME UNITY_PASS_MOTIONVECTORS UNITY_PBS_USE_BRDF1 UNITY_SPECCUBE_BLENDING UNITY_SPECCUBE_BOX_PROJECTION UNITY_USE_DITHER_MASK_FOR_ALPHABLENDED_SHADOWS
Disabled keywords: DOTS_INSTANCING_ON SHADER_API_GLES30 UNITY_ASTC_NORMALMAP_ENCODING UNITY_COLORSPACE_GAMMA UNITY_FRAMEBUFFER_FETCH_AVAILABLE UNITY_HARDWARE_TIER1 UNITY_HARDWARE_TIER2 UNITY_HARDWARE_TIER3 UNITY_LIGHTMAP_DLDR_ENCODING UNITY_LIGHTMAP_RGBM_ENCODING UNITY_METAL_SHADOWS_USE_POINT_FILTERING UNITY_NO_DXT5nm UNITY_NO_SCREENSPACE_SHADOWS UNITY_PBS_USE_BRDF2 UNITY_PBS_USE_BRDF3 UNITY_PRETRANSFORM_TO_DISPLAY_ORIENTATION UNITY_UNIFIED_SHADER_PRECISION_MODEL UNITY_VIRTUAL_TEXTURING

(See, I’m cursed!)

So, having removed my graduating background, it compiled, but first swipe… nope - hung… nope, not hung, just not updating the tiles on the grid… oooh, it’s like solving a Rubik’s cube without sight, only you don’t know what the new gems are… awesome

poke poke
Error building Player: Shader error in ‘Universal Render Pipeline/Lit’: error: struct member hlslcc_mtx4x4unity_ObjectToWorldArray not found (on webgpu)
Compiling Subshader: 0, Pass: MotionVectors, Vertex program with INSTANCING_ON
Platform defines: SHADER_API_DESKTOP UNITY_ENABLE_DETAIL_NORMALMAP UNITY_ENABLE_REFLECTION_BUFFERS UNITY_LIGHTMAP_FULL_HDR UNITY_LIGHT_PROBE_PROXY_VOLUME UNITY_PASS_MOTIONVECTORS UNITY_PBS_USE_BRDF1 UNITY_SPECCUBE_BLENDING UNITY_SPECCUBE_BOX_PROJECTION UNITY_USE_DITHER_MASK_FOR_ALPHABLENDED_SHADOWS
Disabled keywords: DOTS_INSTANCING_ON LOD_FADE_CROSSFADE SHADER_API_GLES30 UNITY_ASTC_NORMALMAP_ENCODING UNITY_COLORSPACE_GAMMA UNITY_FRAMEBUFFER_FETCH_AVAILABLE UNITY_HARDWARE_TIER1 UNITY_HARDWARE_TIER2 UNITY_HARDWARE_TIER3 UNITY_LIGHTMAP_DLDR_ENCODING UNITY_LIGHTMAP_RGBM_ENCODING UNITY_METAL_SHADOWS_USE_POINT_FILTERING UNITY_NO_DXT5nm UNITY_NO_SCREENSPACE_SHADOWS UNITY_PBS_USE_BRDF2 UNITY_PBS_USE_BRDF3 UNITY_PRETRANSFORM_TO_DISPLAY_ORIENTATION UNITY_UNIFIED_SHADER_PRECISION_MODEL UNITY_VIRTUAL_TEXTURING _ADD_PRECOMPUTED_VELOCITY _ALPHATEST_ON

perhaps not…

Can’t say I have had a lot of success… but I do look forward to seeing how it does

PS - loaded the URP version of dungeons: 100% the same behavior (curious)

I expect Safari will have WebGPU enabled without the developer flag for the very first time no earlier than the release of iOS 22 (~5 years from now). :roll_eyes:

Apart from that, the next five years are going to be exciting for web developers! :slight_smile:
