Feedback Wanted: Streaming Virtual Texturing

Hi!

Streaming Virtual Texturing is in preview for Unity 2020.1 from beta 14 and up. You can download the sample project here. The sample uses the HDRP 9-preview.33 package, which you can find in the Package Manager when you enable preview packages in the Project Settings.

Streaming Virtual Texturing is a texture streaming feature that reduces GPU memory usage and Texture loading times when you have many high resolution Textures in your scene. It works by splitting Textures into tiles, and progressively uploading these tiles to GPU memory when they are needed.

You’ll need Shader Graph to add Streaming VT to a shader. The online user guide here goes into the details of how to set up your project for VT and how to author your content.

The sample project shows a basic scene with roughly 1GB of compressed 16K, 8K and 4K streaming textures. Streaming is enabled by adding the textures to a Sample VT Stack Node in ShaderGraph. Memory is optimized by setting the textures to “Virtual Texturing Only” in the importer. The system allocates 384MB of GPU caches to stream in the texture tiles.
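If you want to experiment with different cache sizes from script, here is a minimal sketch, assuming the experimental UnityEngine.Rendering.VirtualTexturing streaming API from the 2020.1 scripting reference (the sizes and the BC7 format are placeholders for this sample, so verify the exact entry points against the docs):

```csharp
using UnityEngine;
using UnityEngine.Experimental.Rendering; // GraphicsFormat
using UnityEngine.Rendering.VirtualTexturing;

// Hedged sketch: resize the Streaming VT caches at startup.
// Assumes the Streaming.SetCpuCacheSize / SetGpuCacheSettings entry points
// from the experimental VirtualTexturing namespace. This may need to run
// early, before any VT textures are loaded (assumption).
public class VTCacheSetup : MonoBehaviour
{
    void Start()
    {
        // CPU-side cache that holds tiles read from disk, in megabytes.
        Streaming.SetCpuCacheSize(256);

        // GPU caches are configured per graphics format, sizes in megabytes.
        Streaming.SetGpuCacheSettings(new[]
        {
            new GpuCacheSetting { format = GraphicsFormat.RGBA_BC7_SRGB, sizeInMegaBytes = 384 },
        });
    }
}
```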

Try out our sample project and let us know what you think by replying on this post.
We look forward to hearing your feedback!

Aljosha

17 Likes

What are the cons for using virtual texturing? When wouldn’t I want to use it?

2 Likes

You'd want to use it for open-world type games, or corridor games with highly detailed environments and non-repeating materials; mostly first- and third-person games.

So if you don't fall into this category, you can skip it.

In general, virtual texturing means a higher load on the art department in the first place, since you have to create a lot more unique textures and materials. Obviously it's not the first choice for mobile platforms, mostly because of the storage requirements and the latency of loading the required texture tiles from storage, which breaks gameplay.

There's an additional cost related to computing which parts of the giant texture atlas the engine should currently load from storage and make available to the shaders. Shaders are a little more complicated, and a sea of storage is required to store full PBR materials.

On consoles and PCs this cost is negligible, however, which is why the technique has been in use for at least a decade already.

It took Unity some time to integrate it from the company they acquired a few years back.

Anyway, it's great that it's coming to Unity. Hopefully together with camera stacking…

3 Likes

It's good to understand the technical limitations and what you can gain from it, so you can draw your own conclusions about when to use it.

The main gain is:

  • you can use higher resolution texturing with very little GPU RAM, meaning sharper texturing on GPUs that have less physical VRAM.

The potential downsides are:

  • since VT streams in texture data based on what's currently rendered on screen, quick scene changes or fast camera rotation can momentarily display lower resolution data in your game world until the streaming catches up. This also happens with traditional texture streaming. I think the old Granite integration used to have an API call you could use to give the VT system a hint that you'll be switching to a different view soon; not sure if this new system has something like that yet (it's especially useful for things you know will happen in advance, like camera cuts for cinematics, etc.)
  • since you can use higher resolution textures, the obvious downside is that they can take up a lot more disk space. This will be a way bigger issue in, say, an open world game than in a game with a more limited environment.

I'm struggling with texture streaming in Unity 2019.3. For distant objects it's fine, but for nearby objects a freshly spawned object first shows the lowest mip level and then streams up to the high mips over the next frames. The user can see the texture pop during this time. I think the same problem can occur with virtual texturing. Will Unity provide APIs for fine-tuning and handling these problems?

1 Like

I mentioned this in the post above. For regular texture streaming they let you preload the textures for a new camera position; see the "Camera cuts" section of this doc: https://docs.unity3d.com/Manual/TextureStreaming-API.html#ControlCameras
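For reference, a minimal sketch of what that looks like in code, using the StreamingController component from that manual page (the 2-second timeout is an arbitrary choice for this example):

```csharp
using UnityEngine;

// Sketch: preload mipmaps for a disabled target camera before a cut,
// so the first frames after the cut don't show low-res mips.
public class CameraCutPreload : MonoBehaviour
{
    [SerializeField] StreamingController targetCamera; // on the (still disabled) camera we will cut to

    public void PrepareCut()
    {
        // Start computing and loading mipmaps for the target camera while it
        // is still disabled; activate it when loading finishes or times out.
        targetCamera.SetPreloading(2.0f, true);
    }
}
```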

2 Likes

Streaming Virtual Texturing works for any game type. It makes most sense when you have dense scenes. These could be large worlds that are densely populated, or compact scenes. By dense I mean many objects with high res textures close to the camera at the same time. Mipmap streaming would struggle with this because it tries to load entire texture mipmaps based on the distance from the objects to the camera. Virtual Texturing only loads the areas (tiles) of the texture mipmaps that are actually visible. In most frames textures are only partially visible (the front of a character, for example), so much less texture data needs to be in video memory than when using mipmap streaming.
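To make that concrete with some illustrative numbers (assuming a 128×128 tile size here purely for illustration): an 8K×8K texture has 64 × 64 = 4096 tiles at mip 0. If only a tenth of its surface is visible at full resolution, roughly 400 mip 0 tiles plus the matching tiles of the coarser mips need to be resident, rather than the entire mip level that distance-based mipmap streaming would load.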

You can see in the image below that only a fraction of the 8k PBR textures (+displacement) are visible in the shot. The white lines represent the individual texture tiles that are loaded by the VT system. You can toggle this debug view in the sample scene I shared in the first post and see how it works exactly.

It is certainly correct that SVT can handle many high res textures, which obviously take artists time to create. It's been used for photogrammetry content because there you automatically have a lot of unique high res content. However, what I'd like to point out here is that even if you have a limited set of high res textures, you can use them more freely everywhere in your scene. You can have 20 objects with four 8K textures each right in front of the camera and still use very little video memory without any texture quality loss.

The downside of using the current VT system is that it requires CPU and GPU cycles. Roughly speaking you need to allocate 1.5ms per frame to using SVT. You get memory or texture quality at the expense of these 1.5ms (it can be lower, it depends on your hardware). The best thing to do is budget this from the start of your production if your game is designed around using high res textures. It's also pretty easy to convert your project to using SVT so you can quickly experiment and see what the performance impact is versus the memory gains you have. In Conan Exiles by Funcom they discovered that they could have their ultra high texture quality on devices with limited video memory. The compact user guide explains in more detail how the VT sampling works so you can understand where the performance impact comes from.

The automatic requesting works really well, but you can indeed request texture tiles to be streamed from script. This allows you to prefetch texture mips, or even subareas of mips, before they become visible. Camera jumps, etc. can be anticipated this way. The sample I shared earlier actually has an experimental script (on the camera, it's disabled by default) that prefetches some higher mipmaps (low res) of objects based on camera distance. This way, the VT system always has some texture data to sample. Texture tiles from mip 0 or 1 are then exclusively requested automatically by analysing the actual visible texture tiles in screen space.
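For anyone wanting to try this, a rough sketch of what a manual prefetch could look like, assuming the experimental UnityEngine.Rendering.VirtualTexturing.Streaming.RequestRegion entry point (the material field and stack name are placeholders; the name must match the Sample VT Stack node in your Shader Graph):

```csharp
using UnityEngine;
using UnityEngine.Rendering.VirtualTexturing;

// Hedged sketch: before a known camera jump, request coarse mips of a
// material's VT stack so the system has data to sample on the first frame.
public class VTPrefetch : MonoBehaviour
{
    [SerializeField] Material material;              // a material that samples a VT stack
    [SerializeField] string stackName = "MyVTStack"; // placeholder; must match the stack name in Shader Graph

    public void PrefetchBeforeJump()
    {
        int stackId = Shader.PropertyToID(stackName);
        // Request 3 mip levels starting at mip 4, over the full 0..1 UV range.
        // Finer tiles (mip 0/1) are still requested automatically from the
        // screen-space analysis.
        Streaming.RequestRegion(material, stackId, new Rect(0f, 0f, 1f, 1f), 4, 3);
    }
}
```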

17 Likes

Exciting! I wonder if lightmaps will take advantage of this, even if only in the future, and if stuff like impostors could take advantage too.

1 Like

For sure. At the moment I use 8K lightmaps.
So I have around twelve 8K custom lightmaps, one per interior PBR material.

The problem is that PLM can never calculate this quality and these sizes.
You have to use external lightmappers.

Check the Unreal implementation. Lightmap streaming done right.

It would be nice if you could provide an experimental Fontainebleau branch with a Virtual Texturing implementation.
Then we wouldn't have to do all the testing alone. :)

Please use Addressables from the beginning in your development.
For projects with virtual texture streaming this is quite obvious.

Sorry, I am still stressed out because of all the Granite projects we had to stop last year.
You were also losing some good customers with reference projects to Epic because of this more-than-a-year information blockade.

However, please try to get it into preview quickly in a package that fits the latest Unity 2020.1.0bx + HDRP 8.x preview. You are referencing an HDRP 9.x preview?
We could not wait any longer either, and because of this I had to mirror our complete pipeline in Unreal for a test.

1 Like

@AljoshaD How do we enable the debug tiles on this (which are enabled in all your screenshots)?
I saw GRA_DEBUG_TILES in GraniteShaderLib3 but it's actually commented out, so it will not do anything. It's kinda hard for users to tell what virtual texturing is doing without the ability to visualize the tiles.

1 Like

Hi.
How can I enable the old Granite debug view, or the debug view shown in the user guide?

I found this one but it's hard to interpret.

Added:

the old DebugTiles are here:
VirtualTexturing/Debug Tiles
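If you'd rather toggle it from script, something like this might work; this assumes a debugTilesEnabled property on the experimental VirtualTexturing.Debugging class, so double-check it against the scripting reference:

```csharp
using UnityEngine.Rendering.VirtualTexturing;

// Hedged sketch: toggle the VT debug tile overlay from code.
// Debugging.debugTilesEnabled is an assumption based on the experimental
// VirtualTexturing scripting API; verify the property exists in your version.
public static class VTDebug
{
    public static void SetDebugTiles(bool on)
    {
        Debugging.debugTilesEnabled = on; // draws the white tile borders over VT textures
    }
}
```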

2 Likes

keeponshading, sorry to hear that the wait for the new integration caused problems for you. Unfortunately it was impossible to estimate when we were going to be ready. I hope it now works well for you and I look forward to hearing your feedback!

Please also see this as an experimental/preview feature for now. It is built on the Granite SDK runtime, so it is well tested in big shipped game titles. However, we re-did the whole texture import workflow to remove any build steps and redundant files. So the production workflow is much nicer than before.

We don't support AssetBundles/Addressables for now, but it is our highest-priority feature. We really want to get this in for 2020.2. Please take a careful look at the bottom of our user guide, where we mention the current limitations.

So this is a game changer for stereo equirectangular VR applications, because you can now do UV reprojection passes in R32G32, or two in R32G32B32A32. They need this precision and a minimum size of 8K, better 16K.

Edit:
Oh, I saw under Important Notes that they are not supported.
So please support them. This would be a real game changer.

Are UDIMs going to be supported eventually or is it a limitation that will hold due to the approach taken on the texture import workflow?

Hi. I did a very quick test in the example scene with Addressables and it seemed to work in a build.

[Screenshot attachment: Adressables.JPG]

However, this needs some further testing.

As a use case example, here are some points on why Addressables are important.

We have large data:

  • a car interior is up to 4GB
  • a car exterior is up to 3GB
  • car tracks are up to 20GB, split across multiple scenes and Addressables.

Everything is loaded/unloaded via Addressables and the build is now very comfortable (4GB limit).
Now we can use even larger scan textures for cars and tracks, which were on hold for 1.5 years:
combined custom lightmaps up to 64K,
and scanned track and car textures up to 16K for primary scene views.

So working, solid Addressables compatibility would be very important.

Also, the custom lightmaps are always in the process of being improved, even after the project ends.
When there is free GPU/CPU time they are automatically improved up to full-GI ground truth for different lighting situations.
So they are marked as Addressable and often renewed.

The old WETA demo, which is a really perfect VT example on mobile phones, was using UDIMs:

https://graphinesoftware.com/blog/2017-12-12-augmented-reality-ARHorse-razor-sharp-glance-into-future-of-AR

We are starting to switch to UDIM now because Blender got a nice implementation.
So I hope UDIMs will be on the list too.

I also tried this briefly on DX12 and DXR. Some of the raytracing effects seem to work with VT, some throw errors, and raytraced reflections, for example, just render pitch black.

1 Like

I just imported the packages into our project on 2020.1b2. Virtual texturing is activated in the Package Manager, but a console message says it's disabled, and I get the option neither in the Player Settings nor in the pipeline asset.
I already tried disabling and re-enabling the package (the sample works in 2020.1b2, by the way).
Does anyone have an idea what could have gone wrong?

Use the packages from the sample in your project:
\StreamingVirtualTexturing\StreamingSample_2020_0_0b2\Packages

and it should work.
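If the checkbox still doesn't show up, you may be able to flip the underlying player setting directly from an editor script. This sketch assumes a PlayerSettings.SetVirtualTexturingSupportEnabled API, so treat it as an assumption and check the editor scripting reference:

```csharp
using UnityEditor;

// Hedged sketch (editor script, place it in an Editor folder): enable the
// Virtual Texturing player setting from code. The editor likely needs a
// restart for the change to take effect (assumption).
public static class ForceEnableVT
{
    [MenuItem("Tools/Enable Virtual Texturing Support")]
    static void Enable()
    {
        PlayerSettings.SetVirtualTexturingSupportEnabled(true);
    }
}
```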

That's exactly what I did. I can also see the Virtual Texturing menu in the toolbar, but the option in the Player Settings is just not there…