A number of you have been asking about the Occlusion Probe system mentioned in the 2018 release blog. Our messaging here seems to have generated some confusion, so I hope I can offer some clarification.
Presently Occlusion Probes are not ‘a feature’ and do not exist officially on our roadmaps. Rather they are an example of how the Custom Bake API in 2018.1 can be used to extend the Progressive Lightmapper.
While an experimental implementation of the Occlusion Probe system exists, the 3D texture data generated by the probes is not handled natively by any standard Unity shaders. To use the data, you will need to write your own shaders and, more to the point, be brave enough to modify Unity’s lighting functions.
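To sketch what that shader work involves conceptually: the probe data is a 3D texture spanning a world-space volume, so your shader first has to map a world position into the texture’s normalized coordinates. Here is a minimal Python sketch of just that mapping (the function name and volume bounds are hypothetical illustration, not Unity API):

```python
def world_to_probe_uvw(world_pos, volume_min, volume_max):
    """Map a world-space position into normalized [0, 1] coordinates
    of a 3D occlusion texture spanning the probe volume.
    (Hypothetical helper for illustration; not part of Unity.)"""
    return tuple(
        (p - lo) / (hi - lo)
        for p, lo, hi in zip(world_pos, volume_min, volume_max)
    )

# A point at the centre of a volume from (0, 0, 0) to (10, 10, 10)
# maps to the centre of the texture.
uvw = world_to_probe_uvw((5.0, 5.0, 5.0), (0.0, 0.0, 0.0), (10.0, 10.0, 10.0))
```

In an actual shader this would be a matrix multiply or an affine transform fed in from script; the sketch only shows the coordinate convention involved.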
We hope to have a blog post and an example project available for public release soon, which should shed some more light on how to implement the Occlusion Probe system in your own project. Again, though, this is an entirely experimental feature and is not offered with official support.
There is early discussion underway about how and when we might officially include the feature in HDRP, but it is too early to say more or to offer dates. More news as it becomes available.
Not going to lie, this feels like a big kick in the guts. I’m an artist who wants to demo outdoor environments with this fidelity, but I need a programmer to decipher it all… This is not what we were led to believe from the blogs and keynote.
Yup. The blogs etc. have also been very clear that this is future stuff, work in progress and so on, so be careful about assuming otherwise; it’s unfair on you and unfair on Unity.
So what I am doing is collecting a list of information that I can share when these kinds of posts pop up, which will help fill the documentation void. As a lot of this stuff is subject to change, you can probably sympathise a little with the Unity staff. In the meantime there are WIP docs on GitHub (various wikis) you can check out: https://github.com/Unity-Technologies/ScriptableRenderPipeline/wiki. Not perfect, but also not too shabby. HDRP has until the end of 2018.3 before it can be called anywhere near beta, so I guess we’re all just showing our excitement. Can’t blame anyone.
Thanks, but at the same time please think about artists who work without programmers in Unity. Don’t assume we can just code all the things that could otherwise be enabled with a checkbox and a shader.
I’m pretty sure they will never make that assumption, especially since people are not afraid to speak up when a feature they want is omitted!
Honestly, I don’t think non-graphics-programmers should be afraid for the future just because Unity will be promoting the various systems being opened up to programmers. They want to cover all the bases, but it is natural that in some cases Unity can deliver the API for programmers far quicker than they can deliver the full feature in one of their own render pipelines, especially given the early stage that the LW and HD pipelines are in. But I’m sure Unity know that the render pipeline makes up a notable part of a game engine, and that their own pipelines will have to deliver, out of the box, over the years to come. That won’t stop them mentioning that programmers could do something themselves in the meantime. This doesn’t help everyone, but it does help some people, and it’s not something Unity can really use as an excuse to never provide most key features themselves.
I do think Unity may have made some marketing/presentation/blog mistakes this year, though, in that some of their material may have made the pipelines and certain features sound more ready than they really are. It’s hard for me to be sure, because I’ve been following the technical reality via GitHub more than the hype, and some perceptions will end up out of sync with reality even when Unity deliver their messages with complete accuracy.
Here is a good post by Robert Cupisz about occlusion probes. x.com
You can see how much difference occlusion probes make; without them you won’t come even close to the visuals of the Book of the Dead demo.
Reading his feed, you get the idea that occlusion probes are experimental and it may take a while until they become an official feature in Unity. But the problem is that no one knows that (apart from those of us who hang around this forum or follow certain people on Twitter).
We are starting to get these silly Unreal vs Unity graphics YouTube videos, and on the Unity side the video is a bit misrepresentative, since the showcase is built on custom stuff that may (or may not) end up in the final engine in the near (or far) future.
I love Unity. It’s basically my entire source of income at this point. However, let’s be very clear in our understanding … Unity runs on marketing; it’s a company. When they release a new high-tech video, its intent is not to show off what the vast majority of their user base will ever be able to accomplish, nor what their engine actually supports without being modified. It’s there to show off their shiny new features, as well as what is possible to do with Unity, in the most hype-building way possible. Anyone who tries to pretend otherwise is speaking above their pay grade. That doesn’t take away from the work of the devs (which is awesome), nor from what we’re capable of doing with Unity, but it does mean we have to remember that not everything the company does is geared towards any individual user, but rather towards producing numbers, like any other business. Thus it goes without saying that for any given announcement, we won’t actually know what it ends up being until we’re using it.
It was based on this that I made a post yesterday asking for a tech demo that is not built on this kind of stuff, but it was immediately locked (and I totally get why; it wasn’t really constructive or going to yield anything).
I really just want a decent idea of what is possible without modifying the engine.
I happen to agree with you. It would be nice if we also got a demo of how to best solve the issues these tech demos faced without relying on extended scripts we may never get. I would love to know how an accomplished artist would build a scene using the tools we will have available, since that’s the only thing I can actually try to learn from. But … I don’t think that’s the purpose of those demos.
Well, at least the occlusion probes are supposed to work (do they really?) on the engine version we got. It’s not like the first Adam tech demo, which had custom tech we only partly got, years later.
That being said, it would be cool if the tech demos were runnable on the same engine versions we get; otherwise they are pointless, as they don’t represent the engine they are supposed to represent.
I’m guessing we’ll get some example of how to hook the occlusion probes up once Unity releases the example map using the Book of the Dead assets. It was supposed to be out in April, then last month, so I guess we might get it this summer?
I did look at the occlusion probes GitHub code earlier, and there wasn’t that much to set up; there just wasn’t any example or instructions. Having a sample project (even a super-simplified one) where everything is configured to use the occlusion probes would help people get started with them.
The above link only contains shaders, not the whole implementation. You can get the additional scripts from the GitHub branch, but I haven’t tried combining them. I’d really love to use this feature on the legacy renderer, so it’s nice to see that the shaders have been implemented already.
Exactly. And that is the frustration with the current state of Unity. These demos are just marketing tools for the engine and nothing more, even the older demos that have been out for years. They were also marketing fluff, since most of their tech, like the atmospheric scattering and that undocumented complex shader they used on their mesh terrain, is constantly breaking and not officially part of the engine. I would not be surprised if the same happens with the Book of the Dead demo, whether or not they release the source project. Chances are that extracting useful knowledge from it will be limited.
If you want to learn and grow as an artist who specializes in real-time stuff, then Unity is a hard choice to make. Educational material is so limited, and you have to waste tremendous amounts of time (and money, since you are likely to rely on custom shaders and tools from the Asset Store) on trial and error to figure out the workflow.
Just today Quixel made a quick 5-minute tutorial on how they created a gorgeous snow material and how it was set up in UE. It all seemed so easy, it just worked, and the results were amazing. Anyone could recreate it easily and learn from it. But try to find something similar for Unity and you will not find anything. Quixel had a similar video for Unity, but it was a complete hack.
Hopefully, when the time is right and HDRP is officially released, there will be “monkey see, monkey do” type tutorials where we all have the same tools and can follow along and get visually amazing results. Something that could serve as a good starting point to learn and grow.
I am currently learning by watching artists work on their UE scenes and trying to apply their techniques in Unity where I can. I am using Alloy, Uber, and ASE, and sometimes I can get pretty close to their results. But as a Unity user I feel kind of silly educating myself this way. A constant uphill battle.
Stuff like occlusion probes just isn’t for designer hobbyists. It’s for companies that can afford a tech artist. Then again, Unity isn’t for lone hobbyists making their own huge photoreal sandboxes anyway. Maybe not-so-big, not-so-photoreal sandboxes…
Or you just have to be a generalist. Or maybe you’ll collaborate with some tech artist to make your common dreams come true.
But even those hobbyists would be able to use it if there were organized samples and documentation (we still have hope for that sample).
The thing is, Unity itself promoted occlusion probes for 2018, so people got their expectations higher than they should have. You can’t really blame users for getting excited, or claim people should hire tech artists to pull something off, when Unity has said you can do this with the upcoming version.
I’d just be happy to see a fully set-up example scene myself. I don’t care if it’s not fully polished or not implemented as a stock Unity component, as long as it works (I prefer having more of the source code around, so it’s even better this way IMO).
“Each probe is just a scalar of how much the sky is occluded at a given point. It’s stored in a 3D tex spanning the entire forest and used to occlude direct sky contribution. We do some tricks on the lightmapper and the script side to avoid self-occlusion and other artifacts.”
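In other words, the runtime side of what he describes boils down to trilinearly sampling a scalar grid and multiplying the result into the direct sky term. Here’s a minimal Python sketch of that idea (the grid layout and function names are my own, purely for illustration; this is not Unity’s implementation):

```python
import math

def sample_occlusion(grid, u, v, w):
    """Trilinearly sample a scalar occlusion grid.
    grid[z][y][x] holds sky visibility in [0, 1]; (u, v, w) are
    normalized [0, 1] coordinates inside the probe volume.
    (Illustrative stand-in for a hardware 3D texture fetch.)"""
    nx, ny, nz = len(grid[0][0]), len(grid[0]), len(grid)
    # Continuous voxel coordinates, clamped to the grid edges.
    x = min(max(u * (nx - 1), 0.0), nx - 1)
    y = min(max(v * (ny - 1), 0.0), ny - 1)
    z = min(max(w * (nz - 1), 0.0), nz - 1)
    x0, y0, z0 = int(math.floor(x)), int(math.floor(y)), int(math.floor(z))
    x1, y1, z1 = min(x0 + 1, nx - 1), min(y0 + 1, ny - 1), min(z0 + 1, nz - 1)
    fx, fy, fz = x - x0, y - y0, z - z0

    def lerp(a, b, t):
        return a + (b - a) * t

    # Interpolate along x, then y, then z.
    c00 = lerp(grid[z0][y0][x0], grid[z0][y0][x1], fx)
    c10 = lerp(grid[z0][y1][x0], grid[z0][y1][x1], fx)
    c01 = lerp(grid[z1][y0][x0], grid[z1][y0][x1], fx)
    c11 = lerp(grid[z1][y1][x0], grid[z1][y1][x1], fx)
    return lerp(lerp(c00, c10, fy), lerp(c01, c11, fy), fz)

def occluded_sky(sky_color, occlusion):
    """Attenuate the direct sky contribution by the probe value."""
    return tuple(c * occlusion for c in sky_color)

# 2x2x2 grid: fully open sky on one side, fully occluded on the other.
grid = [[[1.0, 0.0], [1.0, 0.0]], [[1.0, 0.0], [1.0, 0.0]]]
mid = sample_occlusion(grid, 0.5, 0.0, 0.0)  # halfway between the two
sky = occluded_sky((0.4, 0.6, 1.0), mid)
```

On the GPU the trilinear filtering comes for free from the 3D texture sampler; the “tricks on the lightmapper and the script side” he mentions (avoiding self-occlusion and other artifacts) are the part this sketch doesn’t cover.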