65535 vertices limit

I’m trying to bring some DAZ Studio characters into Unity, and I ran into an issue. Apparently, there is a 65535 vertices limit for a mesh:

Meshes may not have more than 65535 vertices at the moment. Mesh ‘FullTop_42769-skinInstance’ will be split into 2 parts: ‘FullTop_42769-skinInstance_MeshPart0’, ‘FullTop_42769-skinInstance_MeshPart1’.
UnityEditorInternal.InternalEditorUtility:ProjectWindowDrag(HierarchyProperty, Boolean)
UnityEditor.DockArea:OnGUI()

Anyone know if this limit will be going away anytime soon?

The character I’m importing doesn’t look right, and I suspect it may be because it is being broken into multiple meshes.

Thanks.

I know nothing of DAZ, but I have encountered the vert limit and experienced the mesh splitting before :slight_smile: Perhaps you could reduce the details a bit on the model? A quick Google gives me this, but as stated, I know nothing of modeling :wink:

The limit has been in place for years.

1 Like

There is still an issue in Unity's mesh importer: it will split meshes with fewer than 65535 vertices. For some odd reason that no one at Unity seems to understand, it splits meshes based on the number of tris as well, as can be seen in the Unity post at BUG? Unity splits meshes on import it does not need to! - Unity Engine - Unity Discussions. I would love to know why Unity has to split meshes when they have more than 65535 tris.

2 Likes

You need to use the Decimator tool from Daz to lower the poly count before bringing it into Unity. You’ll most likely want to use the Texture Atlas tool they offer as well, to help cut down on the material count. There are a few other things you’ll want to do if you plan on using Daz characters in your game. They’re not really made for straight up importing.

3 Likes

I could have sworn I heard somewhere that this limit would be gone in Unity 5, but I know I'm still getting it (and I know this forum isn't for Unity 5). No idea when that will actually happen, lol.

I hear there’s also a limit on texture sizes as well.

"65535 occurs frequently in the field of computing because it is the highest number which can be represented by an unsigned 16-bit binary number. Some computer programming environments may have predefined constant values representing 65535"
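The quoted fact is easy to check; as a quick illustration (Python, purely for demonstration):

```python
import struct

# 65535 is the largest value an unsigned 16-bit integer can hold,
# which is exactly why it shows up as a mesh index limit.
packed = struct.pack('<H', 65535)          # '<H' = little-endian uint16
print(struct.unpack('<H', packed)[0])      # 65535
print(2**16 - 1)                           # 65535
```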

It sucks because everybody knows that the only difference between amateur and professional games are things like poly counts, texture sizes, frames per second, input response delays and, of course, anti-aliasing algorithms.

1 Like

Heh, it looks like their mesh importer is still stuck in the 16-bit era for some reason… when 64-bit is the standard nowadays…

Maybe Unity could consider replacing the old importer with, let's say, Assimp? It is written in C++, has C# wrapper support, handles various formats and covers all the mesh features like blend shapes. In addition, it has an export feature, which would be cool for some plugin developers.
The only problem is that the mesh representation in code is completely different, which may cause backward-compatibility issues, but this could be solved via a 'bridge' that converts the Assimp representation to the Unity one…

Uh huh. Graphics capabilities are certainly a difference. But not the only one. And possibly not even the most significant one.

2 Likes

Um, so Unity can represent vertex indices for a mesh that has fewer than 65536 vertices using 16-bit ints. If you have a mesh with 65536 verts, then the mesh indices need to be stored as 32-bit ints, doubling the memory used for the mesh indices. That may not bother any of you, but keeping the memory size down was a design goal. If you do submit a mesh with more than 65535 verts, Unity splits it into a number of meshes. This happens automagically, so there is nothing in Unity that means you cannot have high-poly models. (There is a similar limit on the number of triangles, but I forget the value, and how it's imposed.)
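To make that memory trade-off concrete, here is a small sketch (Python, just for the arithmetic, assuming the usual 2 bytes per 16-bit index and 4 bytes per 32-bit index) of how the index buffer doubles once a mesh crosses 65535 vertices:

```python
# Sketch: index buffer memory for a mesh, 16-bit vs 32-bit indices.
def index_buffer_bytes(triangle_count: int, vertex_count: int) -> int:
    """Each triangle stores 3 vertex indices; index width depends on vertex count."""
    bytes_per_index = 2 if vertex_count <= 65535 else 4   # uint16 vs uint32
    return triangle_count * 3 * bytes_per_index

# A 60k-vertex mesh with 100k triangles fits in 16-bit indices:
print(index_buffer_bytes(100_000, 60_000))   # 600,000 bytes
# Past 65,535 vertices, the same triangle count needs twice the index memory:
print(index_buffer_bytes(100_000, 70_000))   # 1,200,000 bytes
```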

6 Likes

@RJ-MacReady - You won’t impress any professional developer with a high vertex count; rather the opposite. The true mark of a AAA artist is someone who can push realistic models with the lowest vertex count, texture size etc. possible.

Here is an example:

Anyone who is curious can also Google the main character from The Witcher 2… I think his name was Geralt (?). I remember they made an image breakdown of the model showing the quads/triangles, bump, normal and diffuse maps etc.… It looks great and is also far away from breaking the vertex limit in Unity.

(Hope the quote tag thingy is working)

9 Likes

I was thinking the same when I was reading Misterelmo’s comment earlier, but then while replying to it the whole sentence started to sound like troll bait and I canceled. With so many “AAA” games being locked to 30 fps, or high poly counts being treated as AAA…

As a programmer, even I can model decent models quickly via sculpting, but then the models are really high poly and unusable in “real” games unless heavily optimized down, which is something I haven’t mastered.

Here is some of the older AC characters in both views:

http://www.zbrushcentral.com/showthread.php?97394-Assassin-s-Creed-Brotherhood-Characters

That was what I thought too, but I seldom work with models, so I wasn’t sure. I’d expect a bunch of models over this limit to start causing performance issues at some point.

There are already characters in games with up to 50k tris. The old Unity demo soldier had 22k tris already. It’s just a matter of time until you hit the 65k limit and need to split the stuff. Nasty for a terrain mesh, for example. Those lovely gaps…

The bottleneck is not the tri count anymore. Modern graphics cards can handle megapolys in the blink of an eye. Shaders and code have way more influence.

That said, wouldn’t it be nice to have 32-bit ints when you really want and need them? It would make certain tasks much easier, especially those where you try to load, modify and save a mesh at runtime. Yes, I know, Unity is a game engine. But even here, see above :slight_smile:

It’s not only a game engine! Of course that is the main goal and everything else is a bonus, but Unity is used for other things too (archviz, or customizing a product via a website, for example), and there you more often need to exceed the 65k limit because of procedural mesh creation at runtime, which isn’t handled by Unity’s automatic splitting.

For example, this limit is problematic for a road/river tool (I tried to create one to better understand Unity): if you use a relatively high poly density for smoother edges and have a very uneven and long shape, this limit becomes a problem very fast…
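As a rough illustration of how quickly a road mesh eats through the budget (hypothetical numbers, Python just for the arithmetic):

```python
# Sketch: vertex count of a road mesh built as a strip of cross-sections,
# with `cross_verts` vertices across the road at each sample point.
def road_vertex_count(length_m: float, samples_per_m: float, cross_verts: int) -> int:
    rows = int(length_m * samples_per_m) + 1   # sample points along the road
    return rows * cross_verts

# A 2 km road sampled every 0.25 m with 8 vertices across each cross-section:
print(road_vertex_count(2000, 4, 8))   # 64,008 -- just under the 65,535 limit
# One extra vertex per cross-section blows past it:
print(road_vertex_count(2000, 4, 9))   # 72,009 -- already over
```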

In addition, as someone mentioned above, the splitting isn’t perfect. I think the perfect solution would be a new property/enum for the mesh, like the one for bones where you can specify how many bones one vertex can have. That would let us choose between lower memory consumption and more polys plus convenience in the workflow.

1 Like

Doesn’t Poser open Daz Models? I have poser game dev and it reduces polys while keeping the morph proportions and such. I’d look into it.

I think a bigger problem is why do you want to use models with more than 65k vertices? Is it a high end PC game or the feature in a complex visualization? If not, you need to reduce the poly count. There are tools in various 3d modeling programs for doing it. There are tools in Unity for doing it as well but it’ll split the original mesh beforehand when you import it, so you need to do it before it gets to Unity.

1 Like

@Graham-Dunnett - Can you explain why there is a split on tri counts? It makes no sense to me. I can make a procedural mesh with 65535 vertices and, say, 500,000 tris, and Unity will render it fine, so why does an imported mesh with only 40,000 vertices and 90,000 tris need to be split into multiple objects?
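For what it’s worth, a regular triangulated grid shows how a mesh can legitimately carry far more tris than verts through vertex sharing (a quick Python sketch with assumed grid dimensions):

```python
# Sketch: vertex vs triangle count for a regular grid mesh where each quad
# is split into two triangles and vertices are shared between quads.
def grid_counts(nx: int, ny: int):
    verts = nx * ny
    tris = 2 * (nx - 1) * (ny - 1)   # two triangles per quad
    return verts, tris

# A 200x200 grid stays well under the vertex limit while nearly
# doubling it in triangles:
print(grid_counts(200, 200))   # (40000, 79202)
```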

To build on this, not everyone uses Unity for gaming… I use it for engineering visualisation and removing the restriction makes a lot of sense when I’m dealing with survey data that generally gets broken up into 10 or more pieces due to this limit.

2 Likes

I already gave one example. My last game used terrain meshes with up to 150k tris, and there were visible gaps between the split parts. Gaps that wouldn’t be there if it were one geometry. Floating-point errors…

And even with characters we are getting closer and closer to the 65k limit in modern games.

Another example is anything where you import and export at runtime. I made a little tool to convert a greyscale image to a mesh, and I had to use chunks, because I can easily produce megapoly meshes with it, which are a pain in the ass to handle. And that was already the stopping point for the tool I initially wanted to develop. This image-to-mesh tool was just a test case to see how far it could go.
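For anyone curious, the chunking can be sketched roughly like this (Python, hypothetical helper names; note that real chunks would also need to duplicate their shared border vertices to stay watertight, which is exactly where the floating-point gaps mentioned earlier creep in):

```python
# Sketch: split a greyscale heightmap into square chunks so each chunk's
# mesh stays under the 65,535-vertex limit.
MAX_VERTS = 65_535

def chunk_size(max_verts: int = MAX_VERTS) -> int:
    """Largest n such that an n x n grid of quads ((n+1)^2 vertices) fits."""
    n = 1
    while (n + 2) ** 2 <= max_verts:
        n += 1
    return n

def chunk_grid(width: int, height: int):
    """Yield (x0, y0, x1, y1) pixel rectangles, one mesh chunk each."""
    n = chunk_size()
    for y in range(0, height, n):
        for x in range(0, width, n):
            yield (x, y, min(x + n, width), min(y + n, height))

print(chunk_size())                        # 254 -> 255*255 = 65,025 verts per chunk
print(len(list(chunk_grid(1024, 1024))))   # 25 chunks for a 1024x1024 heightmap
```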

A special 32-bit mesh filter would be a fine thing. That’s 4 billion vertices already. Or maybe even a 64-bit one. That’s definitely future-proof for the next two years then :smile:

@Ostwind - You are almost there if you have a high-poly mesh done (sculpting or any other tool combination); the next step would be to “retopologize” the mesh so you get a low-poly version of it. That is followed by using both meshes to “bake” your “normal map” and any other textures that you want or need (an ambient occlusion map is a really good idea). Now, just paint the diffuse texture(s), connect all the pieces in Unity and you are ready to go…
So if you can do what you claim, I would highly recommend that you learn those last steps, because you are probably 80% of the way to understanding and being able to produce HQ work :slight_smile:

@Tiles - Yes, there are games with bigger meshes, but that doesn’t make the artist good… The only case where I see such models as acceptable during production is when you need to model all the clothes and hair separately so that you can hide data in the mesh for physics and advanced shading techniques. Those kinds of cases make it unavoidable that the body, for example, may have a chest, a t-shirt and a jacket mesh covering the same area, thus increasing the overall vertex count.

Terrain meshes shouldn’t have gaps between the chunks… I don’t know if it was your own or a pre-made system, but it wasn’t properly done in this case. This is even a perfect example of when you don’t want a gigantic mesh but instead several meshes… you don’t want to waste resources on details that you can’t see anyway… it is a lot easier to reduce the geometry in the mesh when it is separated into several objects: you just poll for all meshes that are at a certain distance and then run a quick algorithm to reduce them 2, 4, 8 times in size… it is now also possible to have a greater view distance :smile:
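That 2/4/8-times reduction for a regular grid chunk can be sketched naively like this (Python, hypothetical data layout, keeping every 2nd/4th/8th sample in each direction):

```python
# Sketch: naive LOD reduction for a regular grid of height samples.
def decimate(heights, step: int):
    """Keep every `step`-th sample in both directions (step = 2, 4, 8...)."""
    return [row[::step] for row in heights[::step]]

grid = [[x + 10 * y for x in range(9)] for y in range(9)]   # 9x9 samples
lod1 = decimate(grid, 2)   # 5x5 samples -> roughly a quarter of the quads
lod2 = decimate(grid, 4)   # 3x3 samples
print(len(lod1), len(lod1[0]))   # 5 5
print(len(lod2), len(lod2[0]))   # 3 3
```

(A real implementation would keep the chunk borders at full resolution or stitch them, otherwise the cracks discussed above reappear between LOD levels.)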

Longer vertex arrays also have a pitfall: they are longer and thus take more time for the graphics engine to search and traverse… the extra milliseconds that are lost every now and then will add up sooner or later.

Also, the future of computer graphics isn’t more vertices in the way you think… it is more vertices through tessellation, but that is handled by the graphics card and is nothing that has a huge impact on 3D artists or their current vertex limitations.

The best example I could find during a quick search; know that there are better examples if you dig around:

@SpookyCat - It really sounds like some author of the importer treated all arrays like the vertex array and split them all up if they hit 65535, or maybe there is something funny going on with vertex groups?

1 Like