I’m having a similar issue, not sure if it’s related.
I have an entity with a RenderMesh which gets rebuilt every frame using the standard shader, but it disappears if I move the camera forward along the Z axis past a certain point. Actually, once I get away from the origin (0, 0, 0) it seems to disappear depending on the camera position and orientation. I always see the mesh in the Scene view, however. No clue what’s going on, but it sounds similar to the issue @Init33 is having.
I’ll also note that it DOES render properly if I never rebuild the mesh after startup. So it seems to be some combination of rebuilding the mesh + camera position/orientation.
RenderBounds is only generated for you once, so if you make changes to the mesh you need to update the RenderBounds yourself; otherwise the entity will be culled at the wrong time.
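Something like this, run after each rebuild, should be enough (just a rough sketch, assuming your rebuild code already has references to the entity, its mesh, and the EntityManager):

// After rebuilding the mesh, refresh its local bounds and push them to the culling system.
mesh.RecalculateBounds();
entityManager.SetComponentData(entity, new RenderBounds
{
    Value = mesh.bounds.ToAABB()
});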
It was null; silly me, I thought it would throw an error if it couldn’t find the resource. I had to put the material in an Assets/Resources folder and it worked fine.
Thanks! I was not calling mesh.RecalculateBounds(). This fixed my issue in 99% of cases, but I still have disappearing geometry in some rare instances.
Just a quick follow-up question: I still don’t see it changing the RenderBounds, just the WorldRenderBounds. Is that the expected behavior?
Nevermind, this is just the Inspector not updating. I printed the mesh.bounds in a debug statement and everything is working just fine and as expected. Thanks very much for the help!
From about 65° onwards on the camera’s Y axis, the entities disappear every time.
What I did first was create 1,600 entities and add them to a list, then from time to time take 400 entities at a time from that list and change their translation and non-uniform scale values.
After reading a few comments here, I instead tried creating 40,000 entities up front so I wouldn’t have to change the RenderBounds, but the entities still disappear in Play mode when the camera moves along the Z axis and its orientation around the Y axis goes beyond 65°.
In my case, I am using a mesh from an imported model, e.g. a sphere mesh.
Still, I added mesh.RecalculateBounds() before the entityManager.SetSharedComponentData(entity, new RenderMesh ...) call.
See code below
Yes, I did it the same way you mentioned.
My code, which I run using a signal emitter in the Timeline (called only once):
public void generateInitialCluster()
{
    for (int i = 0; i < 100; i++)
    {
        NativeArray<Entity> entityArray = new NativeArray<Entity>(400, Allocator.Temp);
        entityManager.CreateEntity(entityArchetype, entityArray);

        for (int j = 0; j < 400; j++)
        {
            Entity entity = entityArray[j];

            entityManager.SetComponentData(entity, new NonUniformScale
            {
                Value = new float3(UnityEngine.Random.Range(2, 7.0f), UnityEngine.Random.Range(4, 8.0f), UnityEngine.Random.Range(3, 7.0f)),
            });

            entityManager.SetComponentData(entity, new Translation
            {
                Value = vPosition[j] // a List<Vector3>
            });

            mesh.RecalculateBounds();
            mesh.MarkDynamic();

            entityManager.SetSharedComponentData(entity, new RenderMesh
            {
                mesh = mesh,
                material = material,
            });

            entityManager.SetComponentData(entity, new RenderBounds
            {
                Value = mesh.bounds.ToAABB(),
            });

            //entityList.Add(entity); // Adding all the entities to a List<Entity>
        }

        entityArray.Dispose();
    }
}
Note: the above code is called only once, and the problem still occurs.
Not sure about this one. I don’t think you need the mesh.RecalculateBounds() inside the for loops, though; calling it once before the loops should be enough, something like the sketch below. If the mesh is moving/animating/deforming every frame, then I’m not sure how that plays into everything either. Sorry, hopefully someone with more experience will help out!
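Just to sketch what I mean (same mesh, material, and loop structure as your code; untested):

// Call these once, before building the entities, instead of inside the inner loop.
mesh.RecalculateBounds();
mesh.MarkDynamic();

for (int i = 0; i < 100; i++)
{
    // ... create the 400 entities and set their components as before ...
}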
I just had a thought… when you set your bounds, is there any chance that they have zero volume due to scaling?
I have a vague memory that I was trying to set the bounds on a quad and it didn’t work when it was aligned with the xy-plane because the bounds had zero depth. I fixed it by forcing the bounds to have at least some volume in each direction.
minPos = minPos - new Vector3(1.0f, 1.0f, 1.0f);
maxPos = maxPos + new Vector3(1.0f, 1.0f, 1.0f);
bounds.SetMinMax(minPos, maxPos);
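And if you’re writing RenderBounds directly like in your snippet, the same padding idea would look roughly like this (just a sketch; the 0.01f minimum extent is an arbitrary value I picked, and it assumes the Unity.Mathematics AABB type that RenderBounds uses):

// Pad the AABB so no axis ends up with zero extent before writing RenderBounds.
AABB aabb = mesh.bounds.ToAABB();
aabb.Extents = math.max(aabb.Extents, new float3(0.01f)); // enforce a minimum size per axis
entityManager.SetComponentData(entity, new RenderBounds { Value = aabb });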