Hi Everybody
We’re back with our next release, Unity 6 Beta (2023.3.0b1).
As announced in the Unite 2023 Keynote, we are bringing back the clarity of our original release naming by changing the name of Unity 2023 LTS to Unity 6. This means the 2023.1, 2023.2, and Unity 6 Beta (2023.3.0b1) releases will be rolled up into the new Unity 6.
Also, in case you missed it, check out our Unite 2023 roadmap session that covers the new features below, and more of what you can expect in Unity 6.
This new Beta release brings you significant performance enhancements to Unity’s render pipelines to speed up production, plus more tooling and possibilities to boost visual quality across platforms.
In the Unite keynote, we featured the Fantasy Kingdom Unity 6 demo, which showcases the performance boosts unlocked by the GPU Resident Drawer, GPU Occlusion Culling, and Spatial-Temporal Post-Processing (STP). This demo is a modified and expanded version of “Fantasy Kingdom” by Synty Studios.
GPU Resident Drawer
Efficiently render larger, richer worlds. Optimize your games with up to a 50% reduction in CPU frame time for GameObjects when rendering large, complex scenes, without the need for complicated manual optimizations, across all platforms including high-end mobile, PC, and consoles. You can find more information on the GPU Resident Drawer in the forum post.
The GPU Resident Drawer reducing batches from 137K to 13K in the Fantasy Kingdom Unity 6 demo.
GPU Occlusion Culling
Working alongside the GPU Resident Drawer, GPU Occlusion Culling boosts the performance of GameObjects by reducing the amount of overdraw in each frame, which means the renderer doesn’t waste resources drawing things that can’t be seen. This happens dynamically, and it is especially effective when the scene makes heavy use of instancing.
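If you prefer to flip these features on from a script rather than in the URP Asset inspector, a minimal editor sketch might look like the following. Note that the property names used here (gpuResidentDrawerMode and the occlusion toggle) are assumptions based on the beta and may differ in the final API; the checkboxes on the URP Asset are the supported path.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public static class GpuDrivenRenderingSetup
{
    [MenuItem("Tools/Enable GPU-Driven Rendering")]
    static void Enable()
    {
        var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urpAsset == null)
        {
            Debug.LogWarning("The active render pipeline is not URP.");
            return;
        }

        // Assumed property name: route GameObject rendering through the
        // GPU Resident Drawer's persistent instanced draw path.
        urpAsset.gpuResidentDrawerMode = GPUResidentDrawerMode.InstancedDrawing;

        // Assumed property name: cull occluded instances on the GPU so the
        // renderer skips draws that would be fully overdrawn.
        urpAsset.gpuResidentDrawerEnableOcclusionCullingInCameras = true;

        EditorUtility.SetDirty(urpAsset);
    }
}
#endif
```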
Spatial-Temporal Post-Processing (STP)
Optimize GPU performance and significantly enhance visual quality at runtime with this new state-of-the-art upscaler. STP is designed to take frames rendered at a lower resolution and upscale them without any loss of fidelity, delivering high-quality content to platforms with varying levels of performance capability and screen resolution.
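In URP, STP plugs into the existing render scale and upscaling filter settings. As a hedged sketch (assuming the beta adds an STP member to the existing UpscalingFilterSelection enum), selecting it from script might look like this:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class EnableStp : MonoBehaviour
{
    void Start()
    {
        var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urpAsset == null) return;

        // Render at roughly 2/3 resolution and let the upscaler
        // reconstruct the output at native resolution.
        urpAsset.renderScale = 0.67f;

        // Assumed enum member for the new upscaler; Linear, Point, and FSR
        // were already available in earlier URP versions.
        urpAsset.upscalingFilter = UpscalingFilterSelection.STP;
    }
}
```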
Render graph - URP
The new RenderGraph system for the Universal Render Pipeline automates runtime resource optimization, simplifying memory usage and enhancing performance, especially on mobile GPUs. Its stricter API guidelines minimize rendering errors and performance issues, and the integration of the NativeRenderPass API and a comprehensive debug viewer aid in efficient troubleshooting and resource management. Additionally, the new ContextContainer class offers streamlined access to rendering resources, improving ease of use and control in complex projects.
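To give a feel for the new API, here is a minimal ScriptableRenderPass written against the RenderGraph pattern documented for the beta: pass data is declared up front, frame resources come from the ContextContainer, and reads and writes are registered with the builder so the graph can optimize them. Details (and the RenderGraphModule namespace, which may still be marked experimental in this beta) could shift before release, and the tint material is an assumed placeholder.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.RenderGraphModule;
using UnityEngine.Rendering.Universal;

public class TintPass : ScriptableRenderPass
{
    class PassData
    {
        public TextureHandle source;
        public Material material;
    }

    Material m_Material; // assumed to be assigned by the owning ScriptableRendererFeature

    public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    {
        // ContextContainer provides typed access to the frame's shared resources.
        var resourceData = frameData.Get<UniversalResourceData>();
        var cameraData = frameData.Get<UniversalCameraData>();

        // Create a transient destination texture matching the camera target.
        var desc = cameraData.cameraTargetDescriptor;
        desc.depthBufferBits = 0;
        TextureHandle destination = UniversalRenderer.CreateRenderGraphTexture(
            renderGraph, desc, "_TintTarget", false);

        using (var builder = renderGraph.AddRasterRenderPass<PassData>("Tint", out var passData))
        {
            passData.source = resourceData.activeColorTexture;
            passData.material = m_Material;

            // Declaring reads and writes up front is what lets the graph
            // alias memory, cull unused passes, and merge compatible passes
            // into native render passes on mobile GPUs.
            builder.UseTexture(passData.source);
            builder.SetRenderAttachment(destination, 0);
            builder.SetRenderFunc((PassData data, RasterGraphContext context) =>
                Blitter.BlitTexture(context.cmd, data.source, new Vector4(1, 1, 0, 0), data.material, 0));
        }

        // Subsequent passes now read the tinted texture as the camera color.
        resourceData.cameraColor = destination;
    }
}
```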
Foveated rendering for VR platforms - URP
The new Foveated Rendering API is fully supported by the Universal Render Pipeline, allowing you to configure the foveation level and improve GPU performance at the cost of reduced fidelity in the user’s mid-to-far peripheral vision. Two foveation modes are available, depending on the XR device’s hardware capabilities: with Fixed Foveated Rendering, regions at the center of screen space benefit from higher quality; with Gaze Foveated Rendering, eye tracking is used to determine which regions of screen space benefit from higher quality.
The Foveated Rendering API is compatible with the PS VR2 plug-in and with Meta Quest through the Oculus XR plug-in, with support for the OpenXR plug-in coming soon.
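From script, foveation is driven through XRDisplaySubsystem. Here is a minimal sketch, assuming the foveatedRenderingLevel and foveatedRenderingFlags properties introduced with this API:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class FoveationSetup : MonoBehaviour
{
    void Start()
    {
        var displays = new List<XRDisplaySubsystem>();
        SubsystemManager.GetSubsystems(displays);

        foreach (var display in displays)
        {
            // 0 disables foveation; 1 requests the strongest level the runtime offers.
            display.foveatedRenderingLevel = 1.0f;

            // Opt in to gaze foveated rendering on devices with eye tracking;
            // devices without it fall back to fixed foveated rendering.
            display.foveatedRenderingFlags = XRDisplaySubsystem.FoveatedRenderingFlags.GazeAllowed;
        }
    }
}
```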
Volume framework enhancements with Custom Post Processing
We are making the Volume framework a first-class citizen in both URP and HDRP, optimizing CPU performance on all platforms to make it viable even on low-end hardware. You can now set global and per-quality-level volumes in URP, similar to what was already possible in HDRP, with improved UX across the board. Additionally, it is now easier to leverage the Volume framework with custom post-processing effects in URP to build your own effects, like a custom fog (check out this demo from our December live stream to learn more).
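As a small sketch of the custom post-processing workflow, here is a hypothetical settings class built on the real VolumeComponent API. The effect itself (“EdgeTint”) is invented for illustration, and you would still pair it with a renderer feature that does the actual drawing:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Exposes effect settings to Volume profiles, with full support for
// blending between global and local volumes.
[VolumeComponentMenu("Post-processing/Custom/Edge Tint")]
public class EdgeTintVolume : VolumeComponent
{
    public ClampedFloatParameter intensity = new ClampedFloatParameter(0f, 0f, 1f);
    public ColorParameter tint = new ColorParameter(Color.black);
}

// Elsewhere (e.g., inside your render pass), read the blended values per frame:
//   var settings = VolumeManager.instance.stack.GetComponent<EdgeTintVolume>();
//   if (settings != null && settings.intensity.value > 0f) { /* render the effect */ }
```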
C# Light Probe Baking API
With Unity 6 we introduce a new API for baking Light Probes. Baking no longer relies on the Lightmapping delegates, where the user gets callbacks during the baking process. Instead, the process is explicit and free of side effects, letting you control how many probes to bake at a time to balance execution time against memory usage. We also expose a GPU backend focused on high performance, which uses OpenCL internally.
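To make that control flow concrete, here is a deliberately simplified sketch of the batching pattern the explicit API enables. The IProbeBaker interface below is invented for illustration and is not the shipped API; the point is that the caller, not a callback, decides the batch size.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical stand-in for the real baking backend.
public interface IProbeBaker
{
    // Integrate lighting for positions[start..start+count) into output.
    void BakeBatch(Vector3[] positions, int start, int count, SphericalHarmonicsL2[] output);
}

public static class BatchedProbeBake
{
    public static SphericalHarmonicsL2[] BakeAll(IProbeBaker baker, Vector3[] positions, int batchSize)
    {
        var results = new SphericalHarmonicsL2[positions.Length];

        // Smaller batches keep peak memory low; larger batches amortize
        // dispatch overhead. The explicit API leaves this trade-off to you.
        for (int start = 0; start < positions.Length; start += batchSize)
        {
            int count = Mathf.Min(batchSize, positions.Length - start);
            baker.BakeBatch(positions, start, count, results);
        }
        return results;
    }
}
```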
More enhancements to Adaptive Probe Volumes
Sky Occlusion enables you to apply dynamic time-of-day lighting to your virtual environments, achieving more color variation in static indirect lighting from the sky compared to APV scenario blending.
We have also expanded APV scenario blending to URP, enabling a wider range of platform support for you to easily blend between baked probe volume data for day/night transitions or switching lights on and off in rooms.
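Assuming URP adopts the same ProbeReferenceVolume scenario API already documented for HDRP, a day/night blend driven from script might look like this (the scenario names are placeholders for whatever you baked):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class DayNightBlend : MonoBehaviour
{
    [Range(0f, 1f)] public float nightFactor;

    void Start()
    {
        // Select the baked scenario to use as the blend source.
        ProbeReferenceVolume.instance.lightingScenario = "Day";
    }

    void Update()
    {
        // Interpolate probe data between the active scenario and "Night".
        ProbeReferenceVolume.instance.BlendLightingScenario("Night", nightFactor);
    }
}
```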
To better enable time-of-day scenarios, we improved sky rendering, particularly at sunset and sunrise, by adding ozone layer support and atmospheric scattering to complement fog at long distances. We also added support for transparent surfaces with mixed tracing mode, which mixes ray-traced and screen-space effects when rendering surfaces like water together with terrain and vegetation.
Because performance is key when rendering large dynamic worlds, we also optimized SpeedTree vegetation rendering by leveraging the new GPU Resident Drawer mentioned above, and improved water interactions by adding an option to read back the simulation from the GPU with a few frames of delay instead of replicating the simulation on the CPU.
Improved tooling and URP support for VFX Graph
In our mission to make VFX Graph easier to use on a wider range of platforms, we improved its tooling and URP support. While it was already possible to profile the cost of an effect, the new VFX Graph profiling tools let a VFX artist find what can be optimized within a graph. One of the strengths of VFX Graph is creating reusable uber effects, and a key part of that is building a good dashboard to control them. This is now much more accessible with easier creation of built-in and custom attributes via the blackboard: browse and create attributes from the dashboard with a quick description, drag and drop attributes from the blackboard into the graph or blocks, and see attribute usage highlighted in the graph. You can also more easily build uber VFX shaders with support for Shader Graph keywords, and build more complex effects in URP thanks to new access to the URP depth and color buffers, for fast collisions or for spawning particles from the world.
Improved quality of life for Shader Graph
Shader Graph is used in most productions, so we wanted to address some of the top user pains when using or learning the tool on a daily basis: improved shortcuts, a heatmap color mode to quickly identify the most GPU-intensive nodes in your graphs, and faster Undo/Redo. And because it is sometimes time-consuming to understand what a node does and how to use it, we added a Node Reference Sample for experts and beginners alike: a set of Shader Graph assets where each graph describes one node, with breakdowns of how the math works under the hood and examples of how the node can be used (you can learn more in this video).
Quality of life improvements to the Unity Build window, plus all-new Build Profiles
Many of you rely on custom build scripts to customize the way that builds are managed in the Editor. This is mostly because many of Unity’s build settings are global to the project and cannot be changed per build. Our new Build Profiles system is the first of a series of updates, designed to address this common feedback.
With these new Build Profiles, you can have multiple profiles for a single target, such as one for a vertical slice, one for your demo, and one for the final build, each containing different settings and scene lists. Managing builds will be more efficient, with a higher degree of flexibility than ever before.
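For context, this is the kind of hand-rolled build script that Build Profiles is designed to replace: per-configuration scene lists and settings maintained in code (the scene paths and output path below are placeholders).

```csharp
#if UNITY_EDITOR
using UnityEditor;

public static class DemoBuilds
{
    [MenuItem("Build/Demo Build")]
    public static void BuildDemo()
    {
        var options = new BuildPlayerOptions
        {
            // Each configuration needs its own scene list and settings,
            // duplicated in code rather than stored as a profile asset.
            scenes = new[] { "Assets/Scenes/Menu.unity", "Assets/Scenes/DemoLevel.unity" },
            locationPathName = "Builds/Demo/Game.exe",
            target = BuildTarget.StandaloneWindows64,
            options = BuildOptions.Development,
        };
        BuildPipeline.BuildPlayer(options);
    }
}
#endif
```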
We are also improving the Build Window to enhance platform discovery inside the Editor. The platform browser is now a place where you can discover all the platforms that Unity supports, get information on how to deploy to them, find documentation relevant to each platform, and learn how to apply for access to closed platforms. When you add a build profile from the new platform browser, we will ensure that all mandatory packages and settings are correctly configured for that platform, getting you up and running quicker than ever before.
Incremental build pipeline available on all supported consoles
The Unity incremental build pipeline allows you to iterate more quickly between builds. Once a scene has been built, subsequent builds focus only on the delta, meaning build times drop drastically in most situations.
When we tested the Boat Attack demo, build times after a script change dropped almost 50% in Unity 6 compared to 2021 LTS, and rebuilding a scene without any changes dropped from 40 seconds to just over 5. That improvement now delivers significant benefits to your iteration speed during development across all major consoles.
Unlocking more mobile gaming
With Unity 6 Beta (2023.3), Android and iOS browser support has arrived. You can now run your Unity games anywhere on the web, including in iOS or Android browsers. You can also embed your games in a web view in a native app or use our progressive web app template to make your game behave more like a native app, with its own shortcut and offline functionality.
Here you can see our recent 2D sample project Happy Harvest running in Safari on an iPhone 15 Pro.
We’ve also introduced support for Android’s Predictive Back Gesture, so your users can avoid unintentionally quitting your application by getting a preview when your game is about to close.
Early access to the WebGPU backend
The introduction of a new WebGPU backend marks a significant milestone for web-based graphics acceleration, paving the way for unprecedented leaps in graphics rendering fidelity for Unity web games.
WebGPU is designed with the goal of harnessing and exposing modern GPU capabilities to the web. This new web API achieves this by providing a modern graphics acceleration interface that’s implemented internally via native GPU APIs such as DirectX 12, Vulkan, or Metal, depending on the desktop device you use.
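One way to opt a web project into the new backend from an editor script might look like the sketch below, assuming the beta exposes a GraphicsDeviceType.WebGPU entry; you can achieve the same thing in Player Settings by disabling Auto Graphics API and reordering the list.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine.Rendering;

public static class WebGpuSetup
{
    [MenuItem("Tools/Use WebGPU for Web Builds")]
    public static void Enable()
    {
        // Take manual control of the graphics API list for web builds.
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.WebGL, false);

        // Prefer WebGPU, falling back to WebGL 2 (OpenGLES3) where the
        // browser does not support it.
        PlayerSettings.SetGraphicsAPIs(BuildTarget.WebGL,
            new[] { GraphicsDeviceType.WebGPU, GraphicsDeviceType.OpenGLES3 });
    }
}
#endif
```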
The demo available here takes advantage of GPU skinning to bind the skin of these robots to the skeleton underneath, while maintaining a relatively high framerate.
The WebGPU graphics backend is still in development, and we do not recommend using it for production use cases. However, it’s available now in early access. Details on how to get started, as well as additional WebGPU demos, can be found in the graphics forum.
Unity Editor support for Arm-based Windows devices
Unity delivered support for Arm-based Windows devices in 2023.1, enabling you to bring your titles to this new hardware platform and achieve rock-solid performance and stability natively on ARM64 processors. Building on that, we are excited to announce native Unity Editor support for Arm-based Windows devices in Unity 6. This means you can work on your Unity games on even more Windows devices, taking advantage of the performance and flexibility that Arm-powered devices can offer.
DirectX 12
Unity’s DirectX 12 graphics backend is now fully production-ready and recommended for use when targeting DX12-capable Windows platforms. This milestone follows a comprehensive array of improvements to both rendering stability and performance.
Using DX12, the Unity Editor and Players can benefit from significant improvements to CPU performance via the use of Graphics Jobs. Performance gains are expected to scale with scene complexity and the number of draw calls submitted.
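A minimal editor sketch for targeting DX12 with Graphics Jobs enabled (both are existing Player Settings APIs, with DX11 kept as a fallback):

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine.Rendering;

public static class Dx12Setup
{
    [MenuItem("Tools/Target DirectX 12")]
    public static void Enable()
    {
        // Take manual control of the graphics API order for Windows builds.
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.StandaloneWindows64, false);
        PlayerSettings.SetGraphicsAPIs(BuildTarget.StandaloneWindows64,
            new[] { GraphicsDeviceType.Direct3D12, GraphicsDeviceType.Direct3D11 });

        // Spread render command submission across worker threads.
        PlayerSettings.graphicsJobs = true;
    }
}
#endif
```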
Most notably, the DX12 graphics API unlocks support for a wide range of modern graphics capabilities that enable the next generation of rendering techniques, such as Unity’s ray tracing pipeline. Upcoming features will make use of DX12’s advanced capabilities, ranging from graphics to machine learning, to enable an unprecedented level of fidelity and performance.
XR Experiences
We continue to support a number of leading XR platforms, including Apple iOS and visionOS, Android, Meta Quest, PlayStation VR, and Microsoft Mixed Reality, plus more platforms through our support of OpenXR. In addition to platform support, we are working on better templates to get you started, tools to add interactions, and tools for working with environments.
Get started with templates
New VR, MR, and Mobile AR templates in the Unity Hub include example scenes that use AR Foundation and the XR Interaction Toolkit to demonstrate controller and hand tracking, locomotion, interactive objects, UI, and world tracking features, jumpstarting XR development for OpenXR, Meta Quest, Windows Mixed Reality, ARKit, and ARCore supported devices.
XR interactions
XR Interaction Toolkit 3.0 brings multiplayer support that synchronizes movement and interactions. Updates to climb locomotion add ladder teleport anchors and multi-anchor gaze assistance to make climbing mechanics more natural, and a new virtual keyboard sample enables XR text input across platforms.
XR Hands 1.4 comes with a set of prebuilt gestures to help you get started, and you can create new custom gestures by combining hand shapes and relative orientations. Updated debugging tools let you tweak and verify set values against live tracking values to test gesture detection.
Multiplayer
The latest Unity 6 Beta (2023.3) brings enhancements for creators who build multiplayer games.
Multiplayer Play Mode version 1.0-pre
Multiplayer Play Mode (MPPM) enables you to test multiplayer functionality without leaving the Unity Editor. You can simulate up to four Players (the Main Editor Player plus three Virtual Players) simultaneously on the same development device while using the same source assets on disk. You can leverage MPPM to create multiplayer development workflows that reduce the time it takes to build a project, run it locally, and test the server-client relationship.
MPPM is compatible with Netcode for GameObjects and Netcode for Entities, and focuses on local testing scenarios.
Dedicated Server package version 1.0-pre
Use the Dedicated Server package to switch a project between the server and client role without the need to create another project. To do this, use Multiplayer roles to distribute GameObjects and components across the client and server.
This package contains optimizations and workflow improvements for developing for dedicated server platforms. For example, you can use the Dedicated Server package to mark all render components of a scene so they’re present only in standalone builds and removed in dedicated server ones.
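Conceptually, Multiplayer Roles automate the manual pattern of stripping client-only components from server builds, which previously looked something like this (using the UNITY_SERVER define that dedicated server builds set):

```csharp
using UnityEngine;

public class ClientOnlyVisuals : MonoBehaviour
{
    void Awake()
    {
#if UNITY_SERVER
        // A dedicated server never renders, so drop the visual components
        // to save memory and update time.
        foreach (var r in GetComponentsInChildren<Renderer>())
            Destroy(r);
        Destroy(this);
#endif
    }
}
```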
Multiplayer Tools version 2.1
Network Scene Visualization (NetSceneVis) is a powerful tool included in the Multiplayer Tools package that helps you visualize and debug network communication on a per-object basis in the Unity Editor Scene view of your project, with visualizations such as mesh shading and text overlays.
The NetSceneVis tool listens to network messages about the game state to visually display bandwidth and ownership on a per-object basis in the Unity Editor. The visualizations update in real time as network messages are sent and received, showing how game state synchronizes across the network.
This tool can help you optimize and debug your network code to ensure that your game runs smoothly for all players.