Are you sure you looked in the right place? It’s right at the top of Player Settings, in the same place where you set the company and product name, default icon, etc.
Thanks, that was it. I was looking in “Other Settings”, where these things usually are (and also in the compact guide).
Yes, there is some UI we’d still like to change. We’ve documented those changes in the boxes in the user guide.
In one of the next betas, the VT module will be hidden and you will need to remove it from the package manifest. The feature will always be on; you’ll only need to toggle the project setting.
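If you want to flip that setting from script, here’s a minimal editor sketch (assuming the `PlayerSettings.SetVirtualTexturingSupportEnabled` / `GetVirtualTexturingSupportEnabled` API from the 2020.1 scripting reference):

```csharp
using UnityEditor;

public static class VirtualTexturingToggle
{
    [MenuItem("Tools/Toggle Virtual Texturing Support")]
    static void Toggle()
    {
        // Flips the "Virtual Texturing" checkbox at the top of Player Settings
        // (next to company/product name and default icon).
        bool enabled = PlayerSettings.GetVirtualTexturingSupportEnabled();
        PlayerSettings.SetVirtualTexturingSupportEnabled(!enabled);
        // Note: Unity may prompt for an editor restart when this changes.
    }
}
```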
On UDIM: it is not currently supported. It’s on our list, but with a lower priority. That might change, but looking at the current roadmap it is unlikely to be natively supported in 2020. The same goes for ray tracing.
Feature request: Universal Render Pipeline support.
Without UDIM support, this will be basically pointless for our project. If it had UDIM support, we would be using it for our mesh terrain.
Building with IL2CPP doesn’t work; Unity tells me that the C++ tools and Windows SDK have to be installed, even though I have everything installed. Is this a known bug, or did I miss something?
Update: never mind, it seems something was broken with my VS installation.
First of all, I wanted to say it’s amazing how easy to use this is for such an early state (even more so when comparing it to earlier versions of Granite and Amplify Texture). You guys did a great job of integrating it thoroughly into the familiar workflows.
I just built a scene with almost 50 GB of texture data (most of the textures 8K or 4K), and loading times are almost non-existent; everything looks great and is very performant.
There are some situations where textures just don’t appear, though. Here is a screenshot which shows the problem and the Granite error in the console. One thing the incorrectly textured characters have in common is that they are spawned at runtime. Might this be the reason? Do you know any way to fix this? Besides that, it’s very usable already.
Thanks for the feedback, Onat! That’s great to read!
On your issue, I can’t think of a reason you would be seeing this. Could you file a bug so we can reproduce it? If you share the link here, we can jump on that immediately.
Hi dreamerflyer, on the first screenshot the errors tell you that you need to assign a texture to each texture slot on your Sample VT Stack node, or create a node with fewer slots.
On the memory: you might need to make your caches smaller. Also, if you don’t enable “Virtual Texturing Only” on the texture importer, Unity will keep the full texture in memory on top of the texture tiles that are streamed with VT.
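In case it helps, here is a hedged sketch of shrinking the streaming caches from script, assuming the `UnityEngine.Rendering.VirtualTexturing.Streaming` API from the 2020.1 scripting docs (the exact graphics formats to size depend on what your VT stacks actually stream):

```csharp
using UnityEngine;
using UnityEngine.Experimental.Rendering;
using UnityEngine.Rendering.VirtualTexturing;

public class ShrinkVTCaches : MonoBehaviour
{
    void Start()
    {
        // CPU-side tile cache, in megabytes.
        Streaming.SetCpuCacheSize(64);

        // GPU tile cache per graphics format; DXT1 here is just an example.
        Streaming.SetGpuCacheSettings(new[]
        {
            new GPUCacheSetting { format = GraphicsFormat.RGBA_DXT1_SRGB, sizeInMegaBytes = 128 }
        });
    }
}
```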
We are looking into the last screenshot.
I did not change anything about this example, and the textures’ “Virtual Texturing Only” is enabled.
I couldn’t find the “support for VT” option in Player Settings, even after enabling the built-in “Virtual Texturing” module in the Package Manager. Unity 2020.1b4.
I found it finally.^^
It’s under the Icon setting.
Is 16K the maximum for a single object? I’d like to use at least a 32K texture, since I’m making solar system planets and I need as much resolution as I can get for sharp views from low orbit.
Is there any guide on how to use it with terrain or the HDRP Lit shader?
Currently 16K is the limit, as we are dependent on the engine’s maximum texture size. Most hardware only supports textures up to 16K, and in order to stay compatible with non-VT rendering we have to abide by this limit. Ideally, what you need is UDIM texture support, so you can use multiple large textures on the same mesh. We had this in the old plugin and intend to bring it to the new solution in the future, but for now it’s not concretely planned on our roadmap.
There is no built-in support for the bundled HDRP shaders. It makes more sense to quickly roll your own in Shader Graph based on your specific needs.
Then I don’t get the point. 16K is not that much. I’m already using a planet shader that combines four images to get a 16K texture, and it runs smoothly even on mobile. Today’s hardware is very powerful; efficiency is nice, but breaking the limits is the exciting part. The old plugin should still be available for those who need that feature.
Apple A9 GPUs (iPhone 6S and 6S Plus) introduced texture support for up to 16384 px. Earlier iOS devices support up to 8192 px only.
There is currently no iOS GPU that supports textures larger than 16384 px, see https://developer.apple.com/metal/Metal-Feature-Set-Tables.pdf for details.
Unfortunately, Android devices are so diverse that no single feature set table like iOS’s exists. I would assume similar or slightly worse limits on recent Android devices.
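If you’d rather check on the device than rely on feature set tables, Unity reports the limit at runtime; a small sketch:

```csharp
using UnityEngine;

public class TextureLimitProbe : MonoBehaviour
{
    void Start()
    {
        // SystemInfo.maxTextureSize reports the largest texture dimension
        // the current GPU supports (e.g. 8192 or 16384 on mobile).
        Debug.Log($"Max texture size: {SystemInfo.maxTextureSize} px");
    }
}
```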
Here is a brief overview of how much memory different texture sizes cost at 8 bits per pixel:
- 8192x8192 @ 8bpp = 64 MB
- 16384x16384 @ 8bpp = 256 MB
- 32768x32768 @ 8bpp = 1 GB
- 65536x65536 @ 8bpp = 4 GB
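For anyone who wants to verify the arithmetic (uncompressed bytes = width × height × bits per pixel ÷ 8), a quick sketch:

```csharp
using System;

class TextureMemory
{
    // Uncompressed size in bytes: width * height * bitsPerPixel / 8.
    static long SizeInBytes(long width, long height, int bitsPerPixel)
        => width * height * bitsPerPixel / 8;

    static void Main()
    {
        foreach (var size in new long[] { 8192, 16384, 32768, 65536 })
        {
            double mb = SizeInBytes(size, size, 8) / (1024.0 * 1024.0);
            Console.WriteLine($"{size}x{size} @ 8bpp = {mb} MB"); // 64, 256, 1024, 4096
        }
    }
}
```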
I guess that even just storage-wise, it gets impractical on the majority of mobile devices to ship 32K/64K textures.