I have two systems (let’s call them A and B) where System A processes some data and stores it in a NativeArray that is accessible from other systems (System B in this example). System B updates after System A and reads from System A’s NativeArray.
I’m getting an exception complaining that I need to call Complete() on the jobs that read from the NativeArray:
InvalidOperationException: The previously scheduled job SystemB:ReadJob reads from the NativeArray ReadJob.Data. You must call JobHandle.Complete() on the job SystemB:ReadJob, before you can deallocate the NativeArray safely.
If I understand the error correctly, System A is deallocating the NativeArray, but that only happens on the next frame, so it feels like there shouldn’t be a problem.
Is it possible to achieve this without calling Complete() on the jobs that read from System A’s NativeArray?
Here are my systems:
System A
public class SystemA : JobComponentSystem {

    public NativeArray<int> ProcessedValues { get; private set; }

    public JobHandle InputDeps { get; private set; }

    Random m_random;

    protected override void OnCreate() {
        base.OnCreate();
        m_random = new Random((uint)System.DateTime.Now.Millisecond);
    }

    protected override void OnStopRunning() {
        base.OnStopRunning();
        if (ProcessedValues.IsCreated)
            ProcessedValues.Dispose();
    }

    [BurstCompile]
    struct ProcessJob : IJobParallelFor {
        [WriteOnly]
        public NativeArray<int> Data;
        public int Seed;

        public void Execute(int index) {
            var rnd = new Random((uint)(Seed + index));
            Data[index] = rnd.NextInt();
        }
    }

    protected override JobHandle OnUpdate(JobHandle inputDeps) {
        if (ProcessedValues.IsCreated)
            ProcessedValues.Dispose();
        ProcessedValues = new NativeArray<int>(1000, Allocator.TempJob);

        inputDeps = new ProcessJob {
            Data = ProcessedValues,
            Seed = m_random.NextInt()
        }.Schedule(ProcessedValues.Length, 64, inputDeps);

        InputDeps = inputDeps;
        return inputDeps;
    }
}
System B
[UpdateAfter(typeof(SystemA))]
public class SystemB : JobComponentSystem {

    SystemA m_systemA;

    protected override void OnCreate() {
        base.OnCreate();
        m_systemA = World.GetOrCreateSystem<SystemA>();
    }

    [BurstCompile]
    struct ReadJob : IJobParallelFor {
        [ReadOnly]
        public NativeArray<int> Data;

        public void Execute(int index) {
            // Do something
        }
    }

    protected override JobHandle OnUpdate(JobHandle inputDeps) {
        inputDeps = JobHandle.CombineDependencies(inputDeps, m_systemA.InputDeps);

        inputDeps = new ReadJob {
            Data = m_systemA.ProcessedValues
        }.Schedule(m_systemA.ProcessedValues.Length, 64, inputDeps);

        // Calling Complete here fixes the problem
        //inputDeps.Complete();
        return inputDeps;
    }
}
It might not be relevant, but there seems to be some confusion between similar variable names: you keep overwriting the inputDeps parameter of the OnUpdate method.
Try this in SystemA:
protected override JobHandle OnUpdate(JobHandle inputDeps) {
    //inputDeps.Complete(); // Why do you need this?
    if (ProcessedValues.IsCreated)
        ProcessedValues.Dispose();
    ProcessedValues = new NativeArray<int>(1000, Allocator.TempJob);

    InputDeps = new ProcessJob {
        Data = ProcessedValues,
        Seed = m_random.NextInt()
    }.Schedule(ProcessedValues.Length, 64, inputDeps);

    return InputDeps;
}
And in SystemB:
protected override JobHandle OnUpdate(JobHandle inputDeps) {
    var combinedInputDeps = JobHandle.CombineDependencies(inputDeps, m_systemA.InputDeps);

    var jobHandle = new ReadJob {
        Data = m_systemA.ProcessedValues
    }.Schedule(m_systemA.ProcessedValues.Length, 64, combinedInputDeps);

    // Calling Complete here fixes the problem
    //jobHandle.Complete();
    return jobHandle;
}
That is not a real solution. The safety system is telling you that there is a race condition.
Disabling the safety system so it stops telling you about it doesn’t solve the problem.
If you have a manager that owns a native array used by other systems, then you have to register the job handle of every job that reads from or writes to it. When destroying the array, or accessing it on the main thread, you must depend on all of those jobs; and when scheduling a job that writes to it, you must also depend on all of those jobs.
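Concretely, the owning system can accumulate every consumer’s handle and fold it into its own dependency chain before deallocating or rewriting the array. A minimal sketch of that pattern applied to the SystemA from the question (the `AddReaderHandle` method is a made-up name, not a Unity API, and `NativeArray.Dispose(JobHandle)` requires a Collections version that supports deferred disposal):

```csharp
public class SystemA : JobComponentSystem {
    public NativeArray<int> ProcessedValues { get; private set; }

    JobHandle m_readers; // combined handle of every scheduled job reading ProcessedValues

    // Consumers (e.g. SystemB) call this right after scheduling a job that reads the array.
    public void AddReaderHandle(JobHandle handle) {
        m_readers = JobHandle.CombineDependencies(m_readers, handle);
    }

    protected override JobHandle OnUpdate(JobHandle inputDeps) {
        // Fold all registered readers into this frame's dependency chain.
        inputDeps = JobHandle.CombineDependencies(inputDeps, m_readers);

        if (ProcessedValues.IsCreated)
            inputDeps = ProcessedValues.Dispose(inputDeps); // deallocate only after readers finish
        ProcessedValues = new NativeArray<int>(1000, Allocator.TempJob);
        m_readers = default;

        // ... schedule ProcessJob against inputDeps as before ...
        return inputDeps;
    }
}
```

With this in place, SystemB no longer needs to call Complete(); the writer depends on the readers instead of racing them.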
I made up a simple abstraction for managing shared dependencies. The native containers are declared and created/destroyed in the concrete implementation, with a dedicated ComponentSystem that creates/destroys those implementations and makes them available via public fields.
So, as per @Joachim_Ante_1’s suggestion in another post, you call Combine before you schedule a job that uses a shared dependency, and call Set after it. Complete is for main-thread access.
public abstract class SharedDependency
{
    private JobHandle JobHandle;

    public void Complete()
    {
        // Always call Complete(): even when IsCompleted is already true, the
        // safety system only grants main-thread access after Complete() runs.
        JobHandle.Complete();
    }

    public JobHandle Combine(JobHandle other)
    {
        return JobHandle.CombineDependencies(JobHandle, other);
    }

    public void Set(JobHandle other)
    {
        JobHandle = other;
    }

    public abstract void OnCreate();
    public abstract void OnDestroy();
}
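A consuming system’s OnUpdate then follows the Combine → schedule → Set pattern described above. A hypothetical sketch, assuming a `BoardDependency` subclass exposing a `Values` NativeArray (both names are placeholders, not part of the abstraction shown):

```csharp
protected override JobHandle OnUpdate(JobHandle inputDeps) {
    // Fold the shared container's pending jobs into this system's dependencies.
    var deps = m_board.Combine(inputDeps);

    var handle = new ReadJob {
        Data = m_board.Values
    }.Schedule(m_board.Values.Length, 64, deps);

    // Register the new job so later consumers (and disposal) wait on it.
    m_board.Set(handle);
    return handle;
}
```

Main-thread code that needs the array calls `m_board.Complete()` first, and the concrete OnDestroy does the same before disposing its containers.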
Another solution is just to schedule all your dependent jobs in one place; that way you can easily pass in whatever they need as you schedule them, without passing your data all over the place. For my game I have a single “board” (NativeArray) and I was originally trying to pass it between a bunch of systems along with the job handle. It never really felt right.
Eventually I realized (in my case at least) it was pointless to have all these separate systems operating on the same data. I broke out all the jobs from each system into their own files to keep things clean, then I just schedule them all in order in a single BoardSystem:
protected override JobHandle OnUpdate(JobHandle inputDependencies)
{
    // Clear the board
    for (int i = 0; i < board_.Length; ++i)
        board_[i] = Entity.Null;
    for (int i = 0; i < heightMap_.Length; ++i)
        heightMap_[i] = 0;

    var boardJob = inputDependencies;

    boardJob = new InitializeBoardJob
    {
        board = board_,
        childLookup = GetBufferFromEntity<Child>(true),
        tilesLookup = GetBufferFromEntity<PieceTiles>(true),
    }.Schedule(this, boardJob);

    boardJob = new BuildHeightmapJob
    {
        heightMap = heightMap_,
        tilesLookup = GetBufferFromEntity<PieceTiles>(true),
    }.Schedule(this, boardJob);
And so on...