I’ve tracked down and fixed the problem. In the Transforms package, ParentSystem.cs sets the capacity of the UniqueParents hash map to the count of existing parents. However, the job that uses this hash map can add two entries per changed entity: the new parent and the previous parent. The error always triggers for me when I have one entity with a parent and swap its parents around, but only sometimes triggers when more parents are being changed. Details below.
var count = m_ExistingParentsGroup.CalculateEntityCount();
if (count == 0)
    return;

// 1. Get (Parent,Child) to remove
// 2. Get (Parent,Child) to add
// 3. Get unique Parent change list
// 4. Set PreviousParent to new Parent

var parentChildrenToAdd = new NativeMultiHashMap<Entity, Entity>(count, Allocator.TempJob);
var parentChildrenToRemove = new NativeMultiHashMap<Entity, Entity>(count, Allocator.TempJob);
var uniqueParents = new NativeHashMap<Entity, int>(count, Allocator.TempJob);

var gatherChangedParentsJob = new GatherChangedParents
{
    ParentChildrenToAdd = parentChildrenToAdd.AsParallelWriter(),
    ParentChildrenToRemove = parentChildrenToRemove.AsParallelWriter(),
    UniqueParents = uniqueParents.AsParallelWriter(),
    PreviousParentType = GetArchetypeChunkComponentType<PreviousParent>(false),
    ParentType = GetArchetypeChunkComponentType<Parent>(true),
    EntityType = GetArchetypeChunkEntityType(),
    LastSystemVersion = LastSystemVersion
};

var gatherChangedParentsJobHandle = gatherChangedParentsJob.Schedule(m_ExistingParentsGroup);
gatherChangedParentsJobHandle.Complete();
The job itself contains this loop:
public void Execute(ArchetypeChunk chunk, int chunkIndex, int firstEntityIndex)
{
    if (chunk.DidChange(ParentType, LastSystemVersion))
    {
        var chunkPreviousParents = chunk.GetNativeArray(PreviousParentType);
        var chunkParents = chunk.GetNativeArray(ParentType);
        var chunkEntities = chunk.GetNativeArray(EntityType);

        for (int j = 0; j < chunk.Count; j++)
        {
            if (chunkParents[j].Value != chunkPreviousParents[j].Value)
            {
                var childEntity = chunkEntities[j];
                var parentEntity = chunkParents[j].Value;
                var previousParentEntity = chunkPreviousParents[j].Value;

                ParentChildrenToAdd.Add(parentEntity, childEntity);
                UniqueParents.TryAdd(parentEntity, 0);

                if (previousParentEntity != Entity.Null)
                {
                    ParentChildrenToRemove.Add(previousParentEntity, childEntity);
                    UniqueParents.TryAdd(previousParentEntity, 0);
                }

                chunkPreviousParents[j] = new PreviousParent
                {
                    Value = parentEntity
                };
            }
        }
    }
}
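Note the two TryAdd calls: when a child moves from one non-null parent to another, both the new parent and the previous parent are inserted into UniqueParents. A minimal standalone sketch of that failure mode (the job name DoubleTryAddJob and its fields are my own, not from the package):

using Unity.Collections;
using Unity.Entities;
using Unity.Jobs;

// Hypothetical job, not from ParentSystem.cs: it does to a capacity-1 map
// exactly what the loop above does to UniqueParents during a parent swap.
struct DoubleTryAddJob : IJob
{
    public NativeHashMap<Entity, int>.ParallelWriter Map;
    public Entity NewParent;      // stands in for parentEntity
    public Entity PreviousParent; // stands in for previousParentEntity

    public void Execute()
    {
        Map.TryAdd(NewParent, 0);      // fills the map's single slot
        Map.TryAdd(PreviousParent, 0); // distinct key, no capacity left -> error
    }
}

With Map built from new NativeHashMap<Entity, int>(1, Allocator.TempJob).AsParallelWriter(), the second TryAdd has nowhere to put its entry, because a parallel writer cannot grow the map.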
I can reproduce the bug every time if there is only one entity whose parent is being changed. With more entities it becomes more chance-based.
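For reference, a hypothetical repro sketch (the entities parentA, parentB, and child plus the component list are my assumptions, not code from my project), meant to run from a system with EntityManager access:

// Hypothetical repro: one child swapping between two live parents.
var parentA = EntityManager.CreateEntity(typeof(LocalToWorld));
var parentB = EntityManager.CreateEntity(typeof(LocalToWorld));
var child = EntityManager.CreateEntity(
    typeof(Parent), typeof(LocalToParent), typeof(LocalToWorld));
EntityManager.SetComponentData(child, new Parent { Value = parentA });

// ...let ParentSystem run once so the child gets PreviousParent == parentA...

EntityManager.SetComponentData(child, new Parent { Value = parentB });
// On the next ParentSystem update the job adds both parentB (new)
// and parentA (previous) to UniqueParents, exceeding its capacity of 1.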
var count = m_ExistingParentsGroup.CalculateEntityCount();
If this returns a count of 1, then the UniqueParents hash map is allocated with a capacity of 1. However, the job can add two distinct parents to UniqueParents here:
UniqueParents.TryAdd(parentEntity, 0);

if (previousParentEntity != Entity.Null)
{
    ParentChildrenToRemove.Add(previousParentEntity, childEntity);
    UniqueParents.TryAdd(previousParentEntity, 0);
}
If I double the capacity of the UniqueParents hash map, I don’t get the error any more.
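Concretely, the local change I’m running is just the allocation from the first snippet with the capacity doubled:

// Each changed entity can contribute up to two unique parents
// (its new parent and its previous parent), so size the map for that.
var uniqueParents = new NativeHashMap<Entity, int>(count * 2, Allocator.TempJob);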