I have an algorithm that goes through an existing list of items/nodes on a 2D grid, makes conditional checks, assigns a calculated value between 0 and 5 to each item that qualifies, and removes the ones that don't. Each item knows its 4 neighbouring items, so the algorithm keeps performing these conditional checks until no neighbouring nodes qualify.
In pseudo-code it looks something like:
var newNodes = new List<Node>();
var discardedNodes = new List<Node>();

newNodes.AddRange(myNodes);
myNodes.Clear();

while (newNodes.Count > 0)
{
    var node = newNodes[0];

    // Skip nodes that have already been accepted or discarded.
    if (!myNodes.Contains(node) && !discardedNodes.Contains(node))
    {
        if (nodePassesConditionalChecks)
        {
            node.value = calculatedValue;
            myNodes.Add(node);

            // Queue unvisited neighbours for the same checks.
            foreach (var neighbour in node.neighbours)
            {
                if (!myNodes.Contains(neighbour) && !discardedNodes.Contains(neighbour))
                {
                    newNodes.Add(neighbour);
                }
            }
        }
        else
        {
            discardedNodes.Add(node);
        }
    }

    newNodes.Remove(node);
}

foreach (var node in myNodes)
    node.Method1();

foreach (var node in discardedNodes)
    node.Method2();
The code works (single-threaded), but because the conditional checks are somewhat expensive, the number of operations is fairly high, and the whole thing runs fairly often, it has a noticeable effect on framerate.
So I’d like to use the job system, but I’m not sure how to pull it off, or if it’s even possible.
The two issues I see:
A. Each loop reads/writes to the same lists (myNodes, newNodes & discardedNodes)
B. There is no way to predict how many iterations will be required, so I can't use IJobParallelFor directly.
The only solutions to these two issues I can think of:
A. Skip the !myNodes.Contains(node) && !discardedNodes.Contains(node) check inside the job and filter out duplicates on the main thread after the job completes.
B. Split the loop into multiple steps so IJobParallelFor can be used (rough sketch below):
1. Check the current list in one job and build a list of new neighbours.
2. (Next frame) check those neighbours and, for the valid ones, queue their neighbours.
3. Repeat the previous step for however many frames it takes.
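To make that concrete, here is roughly the shape I had in mind for combining A and B — just a minimal sketch, where NodeData, EvaluateNodesJob, PassesChecks and CalculateValue are hypothetical stand-ins for my real data and conditional checks. The job only evaluates one wave of candidates in parallel; the Contains/duplicate filtering and the queuing of the next wave still happen on the main thread:

using Unity.Collections;
using Unity.Jobs;
using UnityEngine;

// Hypothetical blittable stand-in for whatever data the checks need.
public struct NodeData
{
    public int index;        // index into the flat grid
    public Vector2Int cell;  // grid position
}

// Evaluates one wave of candidate nodes in parallel.
// No Contains() calls in here; duplicate filtering happens on the main thread afterwards.
public struct EvaluateNodesJob : IJobParallelFor
{
    [ReadOnly] public NativeArray<NodeData> candidates;

    // One slot per candidate: 0..5 if the node passes, -1 if it should be discarded.
    public NativeArray<int> results;

    public void Execute(int i)
    {
        NodeData node = candidates[i];
        results[i] = PassesChecks(node) ? CalculateValue(node) : -1;
    }

    static bool PassesChecks(NodeData node)
    {
        // Placeholder for the real (expensive) conditional checks.
        return (node.cell.x + node.cell.y) % 2 == 0;
    }

    static int CalculateValue(NodeData node)
    {
        // Placeholder for the real value calculation (0..5).
        return Mathf.Abs(node.cell.x * node.cell.y) % 6;
    }
}

// Main thread, once per wave:
// var candidates = new NativeArray<NodeData>(waveArray, Allocator.TempJob);
// var results = new NativeArray<int>(candidates.Length, Allocator.TempJob);
// new EvaluateNodesJob { candidates = candidates, results = results }
//     .Schedule(candidates.Length, 32).Complete();
// // Back on the main thread: skip candidates already in myNodes/discardedNodes,
// // add passing nodes to myNodes, failing ones to discardedNodes,
// // and queue the neighbours of the passing nodes as the next wave.
// candidates.Dispose();
// results.Dispose();

This only moves the expensive checks off the main thread, though; issue B is still baked in, because each wave depends on the results of the previous one.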
Both workarounds feel very flawed, so I'm looking for a smarter solution. Any ideas?