I’m having a hard time getting lightmaps to look even remotely decent. The lighting is completely scattered but seems to repeat where the mesh repeats.
I assume it has something to do with the generated lightmap UVs. I can’t use the normal “checkbox” way of generating the UVs because the meshes are generated and saved directly to file, so I use Unwrapping.GenerateSecondaryUVSet.
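For reference, this is roughly how I’m calling it; a minimal editor-only sketch, where the asset path and menu item are just placeholders for illustration:

```csharp
using UnityEditor;
using UnityEngine;

public static class LightmapUVTool
{
    [MenuItem("Tools/Generate Lightmap UVs")]
    static void Generate()
    {
        // Hypothetical path; in practice this is the generated mesh asset
        Mesh mesh = AssetDatabase.LoadAssetAtPath<Mesh>("Assets/GeneratedMeshes/Tower.asset");
        Unwrapping.GenerateSecondaryUVSet(mesh); // writes the result into mesh.uv2
        EditorUtility.SetDirty(mesh);            // mark the asset dirty so the change persists
        AssetDatabase.SaveAssets();
    }
}
```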
I can’t seem to interpret the UV Charts to know if it looks wrong. It would help if I knew what a correct UV Chart looked like.
I tried clearing uv2, hoping that they would be auto-generated, and got similar results.
I’m really out of ideas here. All I want is direct lighting without the cost of 200 real-time lights; I don’t care at all about bounces, AO, or anything else.
Is it really impossible to use lightmaps with a basic Unity Mesh? Do I have to use some external model format to get lightmaps?
I think you have to save the asset after you assign the UVs. The best thing to do is create another script to check the first 20 UV2s and see if they are set before and after baking. I have a feeling that because you are not saving the asset, it’s not keeping them.
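Something like this would do for the check; a quick sketch (attach it to the combined object and trigger it from the context menu before and after baking):

```csharp
using UnityEngine;

public class Uv2Probe : MonoBehaviour
{
    [ContextMenu("Dump UV2")]
    void DumpUv2()
    {
        Mesh mesh = GetComponent<MeshFilter>().sharedMesh;
        Vector2[] uv2 = mesh.uv2;
        Debug.Log($"uv2 count: {uv2.Length}");
        // Log the first 20 entries (or fewer if the array is shorter)
        for (int i = 0; i < Mathf.Min(20, uv2.Length); i++)
            Debug.Log($"uv2[{i}] = {uv2[i]}");
    }
}
```

If the count is 0 after baking, the UVs never made it into the saved asset.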
I’m not sure what you mean by checking through logs that the lightmap exists. The snapshot looks like this, though; it seems to be taking up too little space in the atlas, but maybe I’m wrong (baked at 5 texels per unit; the tower is about 3x6 units on one side).
// Get the sub filters to combine
MeshFilter[] meshFilters = GetComponentsInChildren<MeshFilter>(false);
CombineInstance[] combine = new CombineInstance[meshFilters.Length];
Material sharedMat = null;
for (int i = 0; i < meshFilters.Length; i++)
{
    combine[i].mesh = meshFilters[i].sharedMesh;
    // Build the transform from the local TRS so the combined mesh
    // stays in the parent's local space
    combine[i].transform = Matrix4x4.TRS(meshFilters[i].transform.localPosition,
                                         meshFilters[i].transform.localRotation,
                                         meshFilters[i].transform.localScale);
    //combine[i].transform = meshFilters[i].transform.localToWorldMatrix;
    //meshFilters[i].GetComponent<Renderer>().enabled = false;
    // Remember the first material we find so the combined mesh can reuse it
    Renderer renderer = meshFilters[i].GetComponent<Renderer>();
    if (sharedMat == null && renderer != null)
        sharedMat = renderer.sharedMaterial;
}
// Combine the meshes
Mesh mesh = new Mesh();
mesh.CombineMeshes(combine, true, true);
// Write the mesh to file
string szPath = "Assets/GeneratedMeshes/" + name + ".asset";
AssetDatabase.CreateAsset(mesh, szPath);
AssetDatabase.SaveAssets();
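For completeness, a sketch of the variant where the UV generation happens while the combined mesh is still brand new, before the asset is created on disk (same `combine` array and `name` as above; editor-only code):

```csharp
// Combine, generate lightmap UVs, then save - all in one pass
Mesh mesh = new Mesh();
mesh.CombineMeshes(combine, true, true);
Unwrapping.GenerateSecondaryUVSet(mesh);   // fill uv2 before the asset exists on disk
AssetDatabase.CreateAsset(mesh, "Assets/GeneratedMeshes/" + name + ".asset");
AssetDatabase.SaveAssets();
```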
I tried generating the UVs with and without AssetDatabase.SaveAssets(); and it doesn’t make a difference. The preview window of the mesh shows the “uv2” text either way. I’ve also removed the uv2 entirely, with similar results.
That was more of an oversight. Once all of the LODs were generated and performance still wasn’t fast enough, I needed a quick way to add the lightmap UVs.
I just did a quick test with generate immediately after combine… and it works, with the same lightmap texture even.
Ehh… OK, is there something wrong with the CreateLightMapUVs function?
EDIT:
Instead of saving to the existing model, I created a new one, but it still breaks. It’s like the GenerateSecondaryUVSet function only works when the mesh is new.
This still presents a problem. The LODs are generated by a closed source application. I don’t have a choice but to add the UVs after the mesh file has been created for those.
Not in my experience. But we don’t know exactly what you are doing and in which order. I didn’t have issues with the lightmap computation in Unity so far, but I may be using it differently.
I didn’t mean Unity’s UV Generator. I meant the function I posted above.
Basically, all I’m doing is the second posted code (combine), followed by the first posted code (generate UVs), and then generating the lightmap with default parameters.
It seems super finicky. If I bake the lightmap with the LOD1 mesh, it displays incorrectly, but if I bake it with the newly minted LOD0 mesh and swap it with LOD1, it looks correct. If I then bake again, it gets messed up.
I think my only solution now is to recombine everything with the UVs generated right away, regenerate the LODs, and bake on the first level, hoping for the best.
Yeah, basically, LODs share the lightmap. Your ideal situation is to create your own UV2s for each LOD and ensure they fall into the same coordinates. Given the tool you are using, can you not create the UV2s for LOD0, then export that as an OBJ and generate the LODs from it?