Nano Tech - Something similar to Unreal Nanite

I have no idea about the tech behind it, but it looks very similar to Unreal Nanite.
Any thoughts?

Also, the author (Chris Kahler) replied to one of the commenters:
Q: Will this be on the Asset Store?
A: Maybe next year; it also depends on how many Unity users want it. I was thinking more about a Patreon campaign with GitHub access.

8 Likes

Yes, this does look incredibly similar to Nanite. No idea what the differences are under the hood, but the overall approach seems the same.

Which raises the question: when does Unity's internal version of the same thing come out? :stuck_out_tongue: If one guy is making this, Unity must at least be looking into it after all the buzz Nanite has created, right?

Nanite might be UE-specific, but the overall approach will become a standard across all engines, and I would be excited to hear/see something about what that may look like in Unity.

6 Likes

My uninformed guess is this can be extracted from the WETA pipeline.

1 Like

No, real-time rendering and cinematic render-farm-based rendering are two different beasts altogether. The culture around assets is different: in render-farm culture you are not looking to optimize performance (money does that), you are looking to optimize visuals and workflow speed. Which is why the paradigm shift of real-time cinematography is happening.

Apparently you are not taking into account that they have a real-time renderer to preview with before putting it through their cinematic render pipe. You are trying to educate someone who has been studying and building SFX since the manual days of the '80s and has been involved with game engine tech since 2009. I made my first stop-motion 16mm film in 1971. To believe that what Unity purchased from WETA for 1.6B USD will never make it into the real-time pipeline is naive, to say the least.

1 Like

For the answer to this question, just remember when SEGI was all the rage and everyone was wondering when Unity's internal version of fully dynamic GI would come out.

2 Likes

I'm not trying to educate you, I'm pointing at something you should have known given your experience, since you also didn't consider that real time in movies is different from real time in games.

That is a good point! I think this goes above the level of hype that SEGI generated, though, as this time it's specifically their competitor's tech (and currently only commercially available there to the masses).

I am hoping that gives Unity a good kick up the behind to get into gear on this issue, but yes, I suppose it's best not to hold my breath :smile:

Interesting, those Nanite examples are amazing. It would be pretty cool in VR, where normal maps don't work. I think if I were going to use something similar in Unity, it would really need to be officially created by Unity.

I feel like creating something that looks similar wouldn't be too difficult. Break a high-resolution mesh into LOD clusters with instanced materials. Then maybe add my own LOD logic to swap these objects so the cluster size changes.

Obviously, getting something that performs at the level of Nanite is a different matter. For example, Nanite only loads the data required to render the scene. It also compresses the data, e.g. 1 million triangles compressed to 14 MB. In Unity a similar mesh would probably be around 75 MB, and that's before creating clusters and LODs. I'm sure there are also a bunch of details I'm missing, like how it prevents gaps when transitioning between a high- and low-resolution cluster. I wonder if the source code for Nanite will be included in UE5.
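For what it's worth, the "tens of MB for a million triangles" figure is easy to sanity-check with back-of-envelope arithmetic. This sketch assumes a typical uncompressed vertex layout (position, normal, tangent, one UV set) and 32-bit indices; the exact attribute set and vertex sharing vary per mesh, so the 75 MB figure above would just correspond to a heavier layout or less vertex reuse:

```python
BYTES_PER_FLOAT = 4
VERTEX_ATTRS = 3 + 3 + 4 + 2                       # position + normal + tangent + uv
BYTES_PER_VERTEX = VERTEX_ATTRS * BYTES_PER_FLOAT  # 48 bytes under these assumptions

def mesh_size_mb(triangles, verts_per_tri=0.5, index_bytes=4):
    """Rough uncompressed size of an indexed triangle mesh in MB.

    verts_per_tri ~0.5 assumes good vertex sharing on a closed mesh;
    index_bytes=4 assumes 32-bit indices (needed above 65k vertices).
    """
    vertex_bytes = triangles * verts_per_tri * BYTES_PER_VERTEX
    index_bytes_total = triangles * 3 * index_bytes
    return (vertex_bytes + index_bytes_total) / (1024 * 1024)

print(round(mesh_size_mb(1_000_000)))  # roughly 34 MB for 1M triangles
```

Even the optimistic case lands in the tens of megabytes, which is why a ~14 MB compressed representation for the same triangle count is a big deal for streaming.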

Personally, I don't think Unity will implement this. Unity seems to focus more on mobile games and fast prototyping. Although I guess this might change since Unity purchased Weta, as ippdev mentioned.

1 Like

you can look at it here:

https://www.youtube.com/watch?v=eviSykqSUUw

2 Likes

It's the toolchain, not the technique, that's the problem. You want to be able to generate these clusters really quickly, and I'm fairly sure there will be a ton of edge cases to deal with and research needed on that. The brute-force way would take too long.

2 Likes

Thanks for the video, GimmyDev. Looks like it is way more complex than I thought. If that Nano Tech is really using a similar technique, then I'm really impressed. Generating the required data sounds really involved, as hippocoder mentioned. I wonder if the person that made the Nano Tech demo used the UE5 toolchain, exported the data, and then imported it into Unity. Then they'd only need to implement the rendering techniques.

There is always the option of simplifying the problem by enforcing modeling guidelines, making the problem domain easier. Nanite is a kind of "optimized" brute force that decouples concerns between artist and tech, which leads to an over-engineered solution that is very generic. Over-engineered, over-generic solutions are what big companies do, because at their scale of resources it's the most competitive thing to do: it means less training for artists, and it also helps with sources like photogrammetry or film meshes (i.e. big polygon-soup messes) by essentially automating the conversion workflow. It's also clever because it can be seen as a form of potential lossy compression (you could simply cull the small leaves), and its byte size is favorable to streaming.

But the same idea in a less generic version, say one that requires strict quad-mesh modeling, could be an option too. The consistency at modeling time would simplify the algorithm (clear boundaries) and make it more predictable.
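The "cull the small leaves" idea above can be sketched as a tree of mesh clusters, where each parent is a coarser merged version of its children carrying a simplification error. This is only an illustrative toy (the `Cluster` structure and error values are made up, not Nanite's actual format): traversal stops refining once a cluster's error fits the budget, and dropping subtrees below that budget is exactly the lossy-compression angle:

```python
from dataclasses import dataclass, field

@dataclass
class Cluster:
    """A node in a hypothetical cluster hierarchy: parents are coarser,
    merged versions of their children, as in Nanite-style LOD trees."""
    triangles: int
    error: float                      # simplification error, arbitrary units
    children: list = field(default_factory=list)

def select_lod(node, error_budget, out):
    """Walk the tree; stop refining once a cluster's error fits the budget."""
    if node.error <= error_budget or not node.children:
        out.append(node)              # coarse enough (or a leaf): render it
    else:
        for child in node.children:
            select_lod(child, error_budget, out)
    return out

# Toy hierarchy: a 32-tri root, two 64-tri children, four 128-tri leaves.
root = Cluster(32, 2.0, children=[
    Cluster(64, 0.5, [Cluster(128, 0.0), Cluster(128, 0.0)]),
    Cluster(64, 0.4, [Cluster(128, 0.0), Cluster(128, 0.0)]),
])
picked = select_lod(root, error_budget=0.6, out=[])
print(sum(c.triangles for c in picked))  # prints 128: both mid-level clusters chosen
```

A stricter modeling convention (e.g. all-quad meshes) would mainly show up here as cleaner, more predictable cluster boundaries, so the error metric and stitching between clusters get simpler.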

But this is a world where Nanite already exists and the method is documented… Another, less generic implementation that pushes issues down the workflow isn't competitive, unless we are talking about scrappy, specific small projects willing to take a risk tailored to the nature of their project.

I can see Unity evolving this into their own Nanite solution:

which kinda looks like the old ROAM algorithm

and this is similar to the less generic solution I was talking about, even though it's about decimation:

2 Likes

The thing is, the fidelity being solved here is not within the typical reach of AAA. When you listen to what The Coalition said about assets, you know they're already forced to reduce polys in order to sustain dev times.

So given that you need to reduce polys, yet still have enough to justify using resolution-independent tech, the real problem is your own budget to author enough high-quality source art to make it worthwhile.

Right now, every "pushing the envelope" tech requires more, not less, work to make the most of it, and we can't even come close to saturating this tech because we can't source the assets for it.

And if we did source the assets for it, we would still need to source everything else that sustains this level of detail. I can't help but think this is an interim polygon-chasing fancy, and that for indies at least, some form of deep-trained image enhancement is more effective. And eventually for AAA too, once the quality is sufficient.

All this polygon hunting (which is essentially what it really is) isn't the future.

2 Likes

Who thought traveling around the world with an expensive DSLR and a color card was easier than sculpting in ZBrush in mom's basement?
/joke :stuck_out_tongue:

Nanite also won't get you the Lion King movie; they haven't solved fur yet :smile:

1 Like

New tech always means more work. It increases productivity so then the boss expects more productivity.

Like in the army: all the modern gear is way lighter than anything ever before, but soldiers now carry more weight than at any point in history. That's just the stupidity of humankind. We keep creating more problems. All the actual problems were solved ages ago; now every problem there is is of our own making.

Anyway, I don't care too much about new tech coming out, but I'm thinking I'll probably start my next project in around another 6-12 months, and if UE5 is production-ready around then, I might be able to really save some time using Nanite.

I want to make a game that takes place in a city, because there is tons of art for generic cities already available. I don't plan on hiring any environment artists to help, so the big task of creating tons of LODs for every model would be a real headache. It looks like Nanite will pretty much negate that mountain of work: I can more or less just drop models in and LODs are completely automated.

I haven't actually looked into it beyond watching that one promo video, but it looks like this may be a case where I can use new tech to actually save myself work. We'll see, of course.

4 Likes

Is that video confirmed to be real?

This guy definitely got some Unity stuff working:

https://www.youtube.com/watch?v=0Zy9URHLbFw

Also, his previous videos show his GPU-based dynamic cloth system.
So this definitely isn't something pulled from thin air.

2 Likes

Source:

R.I.P Nanite, you’ll be remembered. (2021-2018)

2 Likes

Fixed that for u with the year Unity will add it in a workable state (?) :stuck_out_tongue:

2 Likes