If I'm not mistaken, Unity's vertex arrays use floats. Is there any way to use a different data type?
Some of my meshes store vertex data that could easily be represented with shorts instead. The meshes take up a huge amount of memory, as they have many vertices. While I could use a mesh optimization algorithm, it seems like using smaller data types would be the first thing to check.
(Actually, they might even be representable as bytes, but I'm not sure whether OpenGL/Direct3D supports that.)
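To get a feel for the potential savings, here's a rough back-of-envelope sketch (the vertex count and attribute layout are hypothetical, not taken from any real mesh; the layout assumed is position + normal + UV):

```python
VERTS = 1_000_000            # hypothetical vertex count
FLOATS_PER_VERT = 3 + 3 + 2  # position (3) + normal (3) + UV (2)

# Typical layout: every attribute stored as 32-bit float (4 bytes each).
float_bytes = VERTS * FLOATS_PER_VERT * 4

# Same attributes quantized to 16-bit values (2 bytes each).
short_bytes = VERTS * FLOATS_PER_VERT * 2

print(float_bytes // 2**20, "MiB as float32")  # 30 MiB as float32
print(short_bytes // 2**20, "MiB as 16-bit")   # 15 MiB as 16-bit
```

So halving the attribute width halves the vertex buffer, which is why the question is worth asking before reaching for mesh decimation.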
I believe most 3D programs produce Vector3D coordinates as floats or larger types.
Perhaps you could rethink the meshes with low-poly versions and level of detail (LOD)?
Is it only a memory problem, or also a performance problem while rendering a frame? How big are the meshes? How many polys/vertices are we talking about?
Unity already does this: you can set mesh compression to Low, Medium, or High in the model import settings. The best thing to do, though, is to reduce the number of vertices used in the first place.
What about using instanced geometry instead of an explicit mesh? You could go beyond a single cube and identify a few patterns that repeat (2x2, 3x3, 4x4 as easy examples, but there are no doubt many, many repeating patterns to be found). You'd probably trade some CPU time for the transform/material switches on these, but if memory is what you're after, this route would make sense to investigate.
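To illustrate why instancing pays off so dramatically, here's a hypothetical comparison (the cube count and layout are made-up numbers, assuming 24 vertices per cube and a position + normal + UV layout, with instances needing only a 3-float offset each):

```python
CUBES = 10_000               # hypothetical number of identical cubes
VERTS_PER_CUBE = 24          # 4 verts per face * 6 faces
FLOATS_PER_VERT = 8          # position (3) + normal (3) + UV (2)
BYTES_PER_FLOAT = 4

# Explicit mesh: every cube's vertices stored out flat.
explicit = CUBES * VERTS_PER_CUBE * FLOATS_PER_VERT * BYTES_PER_FLOAT

# Instanced: one cube's vertex data, plus a 3-float offset per instance.
instanced = (VERTS_PER_CUBE * FLOATS_PER_VERT + CUBES * 3) * BYTES_PER_FLOAT

print(f"explicit:  {explicit / 2**20:.2f} MiB")
print(f"instanced: {instanced / 2**20:.2f} MiB")
```

The per-instance cost is only the transform data, so memory scales with the instance count rather than with instance count times vertex count.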