Why can't Unity HDRP correctly render AAA looking character models?

There’s a good, working Daz to Unity bridge, which also allows exporting with subdivision, already includes HDRP shaders, etc.
I guess DAZ is currently the best option in terms of cost-to-quality ratio.

1 Like

I was at a Unity Developer Day recently where Mark Schoennagel gave a presentation very similar to the one in this video. However, there was another part of the presentation that I don’t see when quickly scrubbing through this version I just found on YouTube. He also had Maya open (I believe). He was making changes to the face rig in Maya, and the changes were all updating live in the Unity editor. I’m not 100% sure if this is the same kind of thing you’re looking for, but maybe it’s a clue that can point you in the right direction.

https://www.youtube.com/watch?v=QaeHwCnmQ60

I think it might also have something to do with Ziva and a MetaHuman face rig in Maya. I’m sorry that my knowledge is fuzzy in this area. However, Mark Schoennagel knows how to make live changes to the MetaHuman face rig in Maya that show up immediately in the Unity Editor (if I’m remembering his recent presentation correctly).

https://www.youtube.com/watch?v=Su7b89V3wm0

Sorry if any of this was already addressed previously in this thread. I just noticed the comment about face rigging tools within Unity, and I remembered seeing something sort of like that. It doesn’t seem like any of the current YouTube versions of Mark’s talk include the part that I saw, though; I’m not sure why.

Unity did make a tool called Live Capture a few years ago, which has a Take Recorder system. However, the face and head have to be rigged to use it. There are also some tutorials for it on YouTube:
https://docs.unity3d.com/Packages/com.unity.live-capture@3.0/manual/index.html
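For anyone wondering what “rigged to use it” means in practice: the Live Capture face workflow drives ARKit-style blend shapes, so a quick first check is whether your head mesh actually exposes blend shapes your face mapping can target. A minimal sketch of that check, assuming a standard SkinnedMeshRenderer head (the component here is just illustrative, not part of the Live Capture package):

```csharp
using UnityEngine;

// Hypothetical sanity check: list the blend shapes a head mesh exposes so you
// can compare them against the naming your capture mapping expects
// (e.g. ARKit-style names like "jawOpen"; the exact names on your mesh
// depend on how it was exported).
public class BlendShapeLister : MonoBehaviour
{
    public SkinnedMeshRenderer face;

    [ContextMenu("List Blend Shapes")]
    void ListBlendShapes()
    {
        Mesh mesh = face.sharedMesh;
        for (int i = 0; i < mesh.blendShapeCount; i++)
            Debug.Log($"{i}: {mesh.GetBlendShapeName(i)}");
    }
}
```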

Overall, Digital Human and Ziva have already been mentioned in the thread, but no worries about bringing them up again. The issue with both of those is that the workflow pipeline is too involved for the larger community to use. Unity seems to position Digital Human as a proof of concept rather than a system that’s extensible enough to work with the model formats most of the community uses, like FBX. Ziva is something that very, very few people are using, and it also costs $1800 per face to train. There’s definitely a lack of a community step-by-step tutorial for taking a model through Digital Human or Ziva training all the way into Unity. When I pressed Unity on that (before the layoffs), they were aware and acknowledged that DH was very much “garage shop work” and that Ziva was geared more toward larger film studios like Warner Bros.

They also had a Pulse feedback survey and were hinting at building a “library of ready-to-use characters” with some of the same features as MetaHumans, without mentioning the word MetaHumans.

2 Likes

I’ve been watching the Enemies demo EXTENSIVELY and trying to update my pipeline to allow this kind of visual fidelity, especially for facial performance capture (I’m building my own HMC for this very reason). Luckily, I’ve been able to reverse engineer some advanced facial rigs and break them down into easy-to-build components (using tools such as FaceIt in Blender, which is AMAZING). It’s just sad that we’re left to build it ourselves, while UE5 has a built-in tool for this.

It’s a shame that Unity stopped working on Live Capture, because they could easily keep updating it and use ML to REALLY push it and make it easy to use. The best I can do to get even close to clean facial animation is hand-authoring my character faces.

These look pretty damn cool. The eyes look a little funky on the Braveheart dude, but everything else looks great. I spent many weeks building shaders for skin, hair, and eyes using Amplify, and the results are CRAZY!!

2 Likes

Does anyone know how MetaHuman generates their facial models? Does an AI generate them? I mean, true AI generation from scratch?

I have a feeling we will see some amazing AI generated full body, full rig, full texture, full everything for any game engine soon enough.

I came across the idea that AI-generated textures for assets, on their own, are pretty nice today, and I will be looking for an AI service to generate most or all of my asset textures for my game moving forward.

MetaHuman faces are based on real-life scans of people, and those scans can be blended together to create different faces. And now you can bring in your own character heads and it will smoothly integrate them.

One big theme I noticed from Developer Day was a push toward AI integrations in the Unity editor. They are looking to add the option to just type a request into a text field and have Unity create assets for you with AI. As a non-artist, I was particularly interested in some of the AI-generated textures and models in the demos. It would be nice to be able to just request a cobblestone texture right in Unity and not have to search elsewhere.

You could also create such textures in Substance Designer :smile:

2 Likes

This was what I was seeing in a video, where they generated a brick texture using ChatGPT. I wouldn’t mind if the artwork looked more AI-stylized than photorealistic, because it would be unique in and of itself. As long as all the textures are in the same style, it would work.

Wow, that is very nice

Thank you. I’m still using Live Capture, as DAZ has an export option for the ARKit blend shapes that works pretty well. I do all my own face acting and get the voices from a voice-generation company. Here’s something I’m currently working on, running in the Unity Editor; the talking starts about 3 seconds in (this is on an old NVIDIA 1060 machine and it still runs at nearly 60 fps in a 1920x1080 build, and at about 40 fps in the editor at 1920x1080)…

https://www.youtube.com/watch?v=GBWpbk4i8dU
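For anyone trying the same route: the DAZ-exported ARKit shapes end up as ordinary blend shapes on the head’s SkinnedMeshRenderer, so outside of Live Capture you can also drive them from script. A rough sketch, assuming an ARKit-style shape name (DAZ/bridge exports sometimes prefix the names, which is why it’s looked up by name rather than hard-coded by index):

```csharp
using UnityEngine;

// Hypothetical helper: drive a single ARKit-style blend shape (e.g. "jawOpen")
// on an exported head mesh by name. The shape name is an assumption; adjust it
// to whatever your export actually produces.
public class BlendShapeDriver : MonoBehaviour
{
    public SkinnedMeshRenderer face;
    public string shapeName = "jawOpen";    // assumed ARKit-style name
    [Range(0f, 100f)] public float weight;  // Unity blend shape weights are 0-100

    int shapeIndex = -1;

    void Awake()
    {
        shapeIndex = face.sharedMesh.GetBlendShapeIndex(shapeName);
        if (shapeIndex < 0)
            Debug.LogWarning($"Blend shape '{shapeName}' not found on {face.name}");
    }

    void LateUpdate()
    {
        if (shapeIndex >= 0)
            face.SetBlendShapeWeight(shapeIndex, weight);
    }
}
```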

3 Likes

I hope to take this even further with motion capture and a full facial performance

Unity just put this out today; it looks like generative AI characters in a new tool called Unity Sentis. There are very few details available, but it looks like very similar tech to D-ID. It could be of use in certain gameplay situations:

There’s a forum for it, but it’s got permission errors today:
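From what little documentation is out, Sentis itself looks like a general neural-network inference runtime inside the engine, so the character/avatar part would be something built on top of it. For context, running an imported ONNX model looks roughly like the sketch below; the class names follow the Sentis 1.x docs and should be treated as assumptions, since the API may change between versions:

```csharp
using Unity.Sentis;
using UnityEngine;

// Rough sketch of running an imported ONNX model with Sentis (1.x-style API).
// The model asset, input shape, and what you do with the output are placeholders.
public class SentisSketch : MonoBehaviour
{
    public ModelAsset modelAsset;   // ONNX model imported into the project
    IWorker worker;

    void Start()
    {
        Model model = ModelLoader.Load(modelAsset);
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, model);
    }

    public float[] Run(float[] inputData, TensorShape shape)
    {
        using var input = new TensorFloat(shape, inputData);
        worker.Execute(input);
        var output = worker.PeekOutput() as TensorFloat;
        output.MakeReadable();              // bring the result back to the CPU
        return output.ToReadOnlyArray();
    }

    void OnDestroy() => worker?.Dispose();
}
```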

A few months ago I was experimenting with Midjourney, D-ID, and Replica for voice:
https://i.imgur.com/7NoNx9g.mp4

Speaking of AI… it’s also worth mentioning that Replica (voice AI) has a plugin for Unity, and also a plugin for iClone 7 (v8 wasn’t working a few months ago, but they may have updated it) which does the lip sync with the generated audio. The Unity plugin is audio only and doesn’t do any of the 3D work.

2 Likes

An update on the long journey to AAA-looking characters: it now includes Unity’s special AO solution for teeth/mouth, partly simulated lashes and brows, and some self-made AO using SDFs. I really hope Unity manages to fix the new Line Renderer in 2023.1 in order to get really good-looking hair; that will be a huge step toward more realistic-looking characters:

3 Likes

One of the interesting things is that once you get past making characters light and look better, it’s then harder still to get the lighting in the whole environment to work with the character lighting, so that the character still looks good while the sun and other lighting on the rest of the scene and other characters also look good together.

1 Like

Very true. Because Unity split the pipelines, they lost years of effort that could have gone into significant improvements to realtime global illumination. Ideally they’d move to SDF GI, which would make lighting much more realistic. It would also drastically reduce the complexity of Unity’s current mish-mash of configurations for balancing lighting, exposure, shadows, and ambient.

1 Like

This right here is why I’ve come to the conclusion that if I want to make a photorealistic game experience, I should just use Unreal. Unity is better suited to games that rely on the gameplay rather than the graphics. I’m not saying everyone should adopt my view, but this is my view.

One solution to that would be per-character lighting, or hero lighting like what’s used in the Horizon game. It’ll make your character stand out from the environment, but at least you know their lighting looks good all the time.
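In HDRP terms, one way to sketch that is with rendering layers (what HDRP calls Light Layers): put the hero’s renderers and a dedicated key light on the same extra layer so that light only affects the character. This is normally set up in the Light and Mesh Renderer inspectors (and the layer has to be enabled in the HDRP asset; older HDRP versions expose the light’s mask on HDAdditionalLightData instead), but the same idea in code looks something like this, with the layer choice being an assumption:

```csharp
using UnityEngine;

// Hypothetical hero-light setup: a dedicated key light that only affects the
// hero character, done with rendering layer masks.
public class HeroLightSetup : MonoBehaviour
{
    public Light heroKeyLight;          // the character-only key light
    public Renderer[] heroRenderers;    // the character's skinned meshes

    const uint HeroLayer = 1u << 1;     // e.g. "Light Layer 1", assumed unused elsewhere

    void Start()
    {
        // The key light touches only the hero layer...
        heroKeyLight.renderingLayerMask = (int)HeroLayer;

        // ...while the hero keeps its default layers and additionally joins
        // the hero layer, so it still receives the normal scene lighting.
        foreach (var r in heroRenderers)
            r.renderingLayerMask |= HeroLayer;
    }
}
```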

1 Like

Now let’s see if this works on my own custom characters

Thank you. Yes, you can just darken/fade the background during character talking/close-ups and concentrate on the character lighting. When you’re not up close on characters, in full scenes it takes a lot of trial and error to get everything looking reasonable. I’m almost there, but never quite satisfied :wink:
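As a trivial illustration of the darken/fade idea, something like the sketch below can ease a set of environment lights down during a close-up and restore them afterwards (the dim factor and fade time are arbitrary placeholders):

```csharp
using UnityEngine;

// Hypothetical helper: fade environment lights down during dialogue close-ups
// so the character's key light dominates, then restore them afterwards.
public class CloseUpDimmer : MonoBehaviour
{
    public Light[] environmentLights;
    [Range(0f, 1f)] public float dimFactor = 0.25f; // arbitrary
    public float fadeTime = 0.5f;                   // seconds, arbitrary

    float[] baseIntensities;
    float blend;            // 0 = normal lighting, 1 = fully dimmed
    bool closeUpActive;

    void Start()
    {
        baseIntensities = new float[environmentLights.Length];
        for (int i = 0; i < environmentLights.Length; i++)
            baseIntensities[i] = environmentLights[i].intensity;
    }

    public void SetCloseUp(bool active) => closeUpActive = active;

    void Update()
    {
        float target = closeUpActive ? 1f : 0f;
        blend = Mathf.MoveTowards(blend, target, Time.deltaTime / fadeTime);
        for (int i = 0; i < environmentLights.Length; i++)
            environmentLights[i].intensity =
                Mathf.Lerp(baseIntensities[i], baseIntensities[i] * dimFactor, blend);
    }
}
```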

1 Like