Hello, I’m looking for a multidimensional container to store a bunch of float3s for pathfinding purposes.
Constraints:
- Has to work in a bursted job.
- Has to support different numbers of values per key.
- The number of keys can be constant; my entities never get created or destroyed after initializing.
- Has to be deterministic.
- Has to be fast.
Here is what I’ve tried so far:
- One big static NativeMultiHashMap with entity indexes as keys. Though this is the fastest option, it fails spectacularly when adding/removing waypoints: the order of the waypoints keeps changing at random, since the container doesn’t guarantee iteration order. This was confirmed by mike_acton here: NativeMultiHashMap order. Sadly, I only found that thread after having written out the system and tested it myself.
- Dynamic buffers. I tried this a few weeks ago when I knew almost nothing about ECS, so I didn’t document it and may not have done it properly back then. It satisfied every constraint except number 5, because from what I remember GetBufferFromEntity was quite slow across tens of thousands of entities. I might have to try it again with everything I’ve learned about ECS since then, though I have to say that redesigning my pathfinding system for a fourth time is putting me off, which is why I’m posting this! I also haven’t found a way to use a dynamic buffer without calling Reinterpret on it, which is an experimental feature that might of course get removed. (A sketch of what I mean is at the end of this post.)
- One big static List<List<float3>> with the first dimension being all entity indexes in order. Since I never create or destroy entities after spawning my 40k units up front, indexes 0 to 40k in the list always correspond to the right entities. The only problem with this method is writing a proper job to copy the data from that big list into a NativeArray with the proper offsets. Here’s the job:
using System.Collections.Generic;
using Unity.Collections;
using Unity.Jobs;
using Unity.Mathematics;

struct CopyDataJob : IJob {
    [WriteOnly] public NativeArray<float3> Waypoints;
    public NativeArray<int> waypointsLengthPerEntity;
    public NativeArray<int> slicesStart;

    public void Execute() {
        // Flatten the managed List<List<float3>> into one NativeArray,
        // keeping a running offset so each entity's slice start is just
        // a prefix sum of the previous entities' waypoint counts.
        int entityCount = PathfindingSystem.Waypoints.Count;
        int offset = 0;
        for (int i = 0; i < entityCount; ++i) {
            List<float3> ways = PathfindingSystem.Waypoints[i];
            int count = ways.Count;
            slicesStart[i] = offset;
            waypointsLengthPerEntity[i] = count;
            for (int j = 0; j < count; ++j)
                Waypoints[offset + j] = ways[j];
            offset += count;
        }
    }
}
Then, in the job over the entities, I fetch the waypoints corresponding to each entity as follows:
NativeArray<float3> waypoints = new NativeArray<float3>(Waypoints.GetSubArray(slicesStart[entity.Index], waypointsLengthPerEntity[entity.Index]), Allocator.Temp);
The bottleneck is the CopyDataJob itself. It scales poorly past a few thousand entities, even though it runs in a job. The job can’t be bursted since it references the managed List<List<float3>>, and I also couldn’t come up with a way to parallelize it. (I’m not even sure that’s possible, since the number of waypoints differs per entity; the closest I’ve come is the untested two-pass idea sketched below.)
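The two-pass idea: do the prefix sum over the counts serially first (that pass never touches the float3 data, so it should be cheap), then let an IJobParallelFor copy each entity’s slice, since the slices are disjoint. This is only a sketch: I haven’t tested it, it still can’t be bursted because of the managed list, and I don’t know whether reading the static managed list from multiple worker threads is even allowed. CopyWaypointsParallelJob is just a name I made up:

using System.Collections.Generic;
using Unity.Collections;
using Unity.Jobs;
using Unity.Mathematics;

struct CopyWaypointsParallelJob : IJobParallelFor {
    // Each index writes only its own disjoint slice, which the safety
    // system can't prove on its own, hence the attribute.
    [NativeDisableParallelForRestriction]
    public NativeArray<float3> Waypoints;
    [ReadOnly] public NativeArray<int> slicesStart;

    public void Execute(int i) {
        List<float3> ways = PathfindingSystem.Waypoints[i];
        int start = slicesStart[i];
        for (int j = 0; j < ways.Count; ++j)
            Waypoints[start + j] = ways[j];
    }
}

// Pass 1 stays serial on the main thread, prefix-summing the counts
// into slicesStart; then schedule pass 2 with:
// new CopyWaypointsParallelJob { ... }.Schedule(entityCount, 64).Complete();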
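And since I mentioned dynamic buffers above, here’s roughly what I mean by that approach, in case I simply misused the API the first time around. A minimal sketch; the Waypoint element, the capacity of 8, and ReadWaypointsJob are all just illustrative choices:

using Unity.Collections;
using Unity.Entities;
using Unity.Jobs;
using Unity.Mathematics;

// One resizable float3 list per entity, stored by the ECS itself.
[InternalBufferCapacity(8)] // waypoints beyond 8 move off-chunk
public struct Waypoint : IBufferElementData {
    public float3 Value;
}

struct ReadWaypointsJob : IJob {
    // Filled in from the system via GetBufferFromEntity<Waypoint>(true).
    [ReadOnly] public BufferFromEntity<Waypoint> WaypointsFromEntity;
    public Entity Target;

    public void Execute() {
        // Reinterpret views the buffer as plain float3s without copying.
        DynamicBuffer<float3> waypoints =
            WaypointsFromEntity[Target].Reinterpret<float3>();
        for (int i = 0; i < waypoints.Length; ++i) {
            float3 p = waypoints[i];
            // ... per-waypoint pathfinding work ...
        }
    }
}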
Does anyone have a solution that can scale up properly?