Hi,
I have a really big 3D model (around 50 GB) and I want to import it into Unity to work with.
When I try to, Unity crashes. Is there a way to import a huge amount of data like this into a Unity project?
If not, do you know a way to split/partition my 3D model into parts? (I tried with Blender and MeshLab, but I couldn’t even import my file into those programs…)
My version of Unity is 2021.3.6f1 and my computer has a GeForce RTX 3060. Could a solution be to work on a much more powerful machine, or would that not change anything?
What format is it, roughly how many polygons (if you know), and what application is it coming from?
You could try Pixyz, but I’d guess it might also hang…
It would be best if you could export it in smaller pieces from the original software.
Even with a better machine (and if Unity doesn’t crash), the import time for such a model could be tens of hours, if not several days. (And using such a large single model inside Unity would be too slow anyway.)
It’s an MNS file (geo data). I don’t know the number of polygons, but I also have to work with a point cloud representing the same type of 3D model, and it’s about 3 billion vertices.
For now I’m not in control of exporting this model in smaller pieces, but that may be a solution.
Yes, when I try to import it, it’s in FBX format.
I’ve recently tried with another, smaller file (but still 1.5 GB!) and it makes Unity crash too.
Does anyone know if there is a file size limit for importing files into Unity?
Unity is a game engine and it needs to load the files onto the GPU; if your file is 50 GB it will never “fit” inside the GPU memory.
What you need is to find, or write yourself, some functionality to load and unload data on the fly: basically having only small pieces of that huge data asset loaded at any time, similar to how various modern games work (GTA V, Far Cry, and others like that), or how Google Earth works.
If the camera inside Unity is looking at the entire “object”, it needs to be a lower-detail representation of that object. Once you get really close you can see that part of the object in high detail, but the rest of the object will not be loaded and in memory.
The application you got that data from does exactly what is described above, streaming pieces of information depending on how you look at it.
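To make that concrete, here is a minimal sketch of the idea in Python (inside Unity the real thing would be C#). It assumes the dataset has already been pre-split into one mesh file per grid cell; the tile size, file naming, and the load_mesh/unload_mesh callbacks are all hypothetical stand-ins.

```
# Minimal sketch of camera-driven tile streaming.
# Hypothetical layout: one mesh file per grid cell, "tiles/tile_{x}_{z}.bin".
import math

TILE_SIZE = 100.0    # world units per tile (assumption)
LOAD_RADIUS = 3      # keep a (2R+1) x (2R+1) block of tiles around the camera

loaded = {}          # (tx, tz) -> whatever handle the engine gives back

def tile_of(x, z):
    return (math.floor(x / TILE_SIZE), math.floor(z / TILE_SIZE))

def update_streaming(cam_x, cam_z, load_mesh, unload_mesh):
    """Call every frame or so with the camera position; load_mesh and
    unload_mesh stand in for the engine calls that create/destroy meshes."""
    cx, cz = tile_of(cam_x, cam_z)
    wanted = {(cx + dx, cz + dz)
              for dx in range(-LOAD_RADIUS, LOAD_RADIUS + 1)
              for dz in range(-LOAD_RADIUS, LOAD_RADIUS + 1)}

    # Drop tiles that fell out of range, freeing CPU/GPU memory.
    for key in list(loaded):
        if key not in wanted:
            unload_mesh(loaded.pop(key))

    # Bring in tiles that just came into range.
    for key in wanted - loaded.keys():
        loaded[key] = load_mesh(f"tiles/tile_{key[0]}_{key[1]}.bin")
```

A real version would also keep low-detail stand-ins for the distant tiles (LOD) instead of dropping them entirely, which is the lower-detail-representation idea above.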
I suggest looking at Unreal 5; they have streaming tech that deals with huge levels with a lot of terrain and data. But, same as Unity, it will not work out of the box; it needs some custom code behind it.
Maybe you’ll get lucky and someone has written something like that on GitHub, for Unreal or Unity.
Let’s run the round, back-of-envelope numbers on what you’re describing here.
Assuming single-precision floats, which Unity uses, each vertex position is three floats, so 12 bytes. Therefore 3 billion verts is going to be 36 gigabytes of RAM just to store the vertex position data.
Does this data have UVs? Then add another 8 bytes per vert, or 24 more gigabytes of RAM.
Does this data have normals? Then add another 36 gigabytes for normals.
Does this data have color at the verts? That’s another 4 bytes per vert as Color32 (or 16 as full-float Color), so 12 to 48 more gigabytes.
Let’s assume each trio of verts is used to make one triangle. Unity uses 32-bit vertex indices, but they are signed, so you can only reference about 2 billion verts (2³¹). Let’s move up to 64-bit vertex indices; I don’t think Unity even handles that format.
That is 1 billion triangles, at 3 × 8 bytes (24 bytes) per triangle, so 24 gigabytes of RAM just to hold the triangle index data.
That is what it will consume after the file is successfully opened and parsed.
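To keep those numbers honest, here is the same back-of-envelope arithmetic as a small Python script. The 3-billion-vertex count is the figure quoted for the point cloud above; everything else is just standard type sizes.

```
# Back-of-envelope runtime memory for ~3 billion vertices.
VERTS = 3_000_000_000
TRIS = VERTS // 3            # assume each trio of verts makes one triangle

positions = VERTS * 3 * 4    # 3 single-precision floats (x, y, z)
uvs       = VERTS * 2 * 4    # 2 floats (u, v)
normals   = VERTS * 3 * 4    # 3 floats per normal
indices   = TRIS * 3 * 8     # 64-bit indices; signed 32-bit tops out
                             # at 2**31 (~2.1 billion)

GB = 1_000_000_000
for name, size in [("positions", positions), ("uvs", uvs),
                   ("normals", normals), ("indices", indices)]:
    print(f"{name:9} {size / GB:5.0f} GB")
print(f"{'total':9} {(positions + uvs + normals + indices) / GB:5.0f} GB")
```

That prints 36 + 24 + 36 + 24 = 120 GB resident, before vertex colors, tangents, or a second UV set even enter the picture.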
Let’s pretend the original file is stored as a plain-text .OBJ file. Here is a simple example of a single-triangle OBJ file:
```
# www.blender.org
mtllib onetri.mtl
o Cube
v 0.733116 -0.279275 -1.315588
v -0.482921 1.695729 -0.996758
v -1.065531 0.789610 1.439165
vt 0.000000 0.000000
vt 0.000000 0.000000
vt 0.000000 0.000000
vn -0.8188 -0.4458 -0.3617
usemtl Material
s off
f 2/1/1 1/2/1 3/3/1
```
The above stores each vertex position as about 30 bytes of text, which means about 90 GB of RAM to hold just the raw vertex lines while you are parsing them down into the 36 GB of usable vertex position data.
The OBJ stores the UV data as about 20 bytes per vertex, and the normal is shared in the above case (the vn line), but once loaded it will have to be copied three times, once for each vert it applies to.
I lost track but we're already talking about hundreds and hundreds of gigabytes of storage here.
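For what it’s worth, here is a rough tally of the text side, measuring the sample lines above as typical lengths (the face-line figure is a lower bound, since indices in the billions take about ten digits each):

```
# Rough size of a 3-billion-vertex model stored as a text .OBJ,
# using the sample lines above as typical line lengths.
VERTS = 3_000_000_000
TRIS = VERTS // 3

v_line  = len("v 0.733116 -0.279275 -1.315588\n")  # 31 bytes per position
vt_line = len("vt 0.000000 0.000000\n")            # 21 bytes per UV
f_line  = len("f 2/1/1 1/2/1 3/3/1\n")             # 20 bytes per face,
                                                   # far more with 10-digit
                                                   # indices

GB = 1_000_000_000
print(f"v  lines: {VERTS * v_line / GB:4.0f} GB")
print(f"vt lines: {VERTS * vt_line / GB:4.0f} GB")
print(f"f  lines: {TRIS * f_line / GB:4.0f} GB (lower bound)")
```

That’s roughly 175 GB of .OBJ text on top of the ~120 GB of parsed data from earlier, and the real file would be bigger still.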