DynamicBuffer in IJob with Parallel

I am struggling with DynamicBuffer in IJob.

Capture

And I get this error:
Capture2

I don't understand why it shows this error.
I know that every entity has its own individual DynamicBuffer;
I only get this DynamicBuffer and pass it to the job.

==================================
I changed to using BufferLookup.

Capture3

and I get this error:
Capture5

====================================
I changed to BufferLookup with [ReadOnly].

Capture6

Now it works, but I can't change the DynamicBuffer inside the job. :cry:

I can't add to the buffer:
Capture8
I can't even modify a BufferItem:
Capture9

=====================================
The final solution I can come up with is to convert the DynamicBuffer to an UnsafeList and then pass it to the job.

Can anyone suggest another solution?

image

Here, you’re scheduling the job without any dependencies (you’re ignoring the system’s Dependency property) and storing the JobHandle. You should schedule your job like this instead:

state.Dependency = new MyJob().Schedule(state.Dependency);

It isn’t usually a great practice to create a separate job for each entity individually. I think the job scheduling system is designed around the idea of doing work in batches. For example, you can use IJobEntity to schedule a single job that iterates over all entities, without copying the entities into temporary arrays. If you want to process entities from an array manually like you’re doing right now, you can schedule a single IJobParallelFor job instead.
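For illustration, here is a minimal sketch of what that could look like with IJobEntity. This assumes Entities 1.x, and MyBufferElement is a placeholder element type standing in for whatever your real IBufferElementData is:

using Unity.Burst;
using Unity.Entities;

// Placeholder buffer element type (an assumption; substitute your own IBufferElementData).
public struct MyBufferElement : IBufferElementData
{
    public float Value;
}

// One job iterates over every entity that has the buffer - no per-entity job
// scheduling and no copying of entities into temporary arrays.
[BurstCompile]
public partial struct ProcessBuffersJob : IJobEntity
{
    // Executes once per matching entity; the buffer can be read and written here.
    void Execute(DynamicBuffer<MyBufferElement> buffer)
    {
        for (int i = 0; i < buffer.Length; i++)
        {
            var element = buffer[i];
            element.Value += 1f; // example modification
            buffer[i] = element;
        }
    }
}

Scheduled from the system, it chains into the Dependency the same way as the line above:

// Inside OnUpdate(ref SystemState state):
state.Dependency = new ProcessBuffersJob().ScheduleParallel(state.Dependency);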

Also note that you don’t need an array of EntityCommandBuffers - you can use a single EntityCommandBuffer and write all commands to it. In fact, maybe you don’t even need to create any command buffers if you just use one of the pre-defined ones (e.g. EndSimulationEntityCommandBufferSystem runs at the end of the simulation group). That should be more efficient, because you can get rid of the JobHandle.CompleteAll stall and let your jobs run without waiting.
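For example, in an ISystem you can fetch that command buffer roughly like this (a sketch, assuming Entities 1.x; ProcessBuffersJob is the placeholder job sketched above):

// Inside OnUpdate(ref SystemState state):
// Use the pre-defined end-of-simulation command buffer instead of creating your own.
var ecbSingleton = SystemAPI.GetSingleton<EndSimulationEntityCommandBufferSystem.Singleton>();
var ecb = ecbSingleton.CreateCommandBuffer(state.WorldUnmanaged);

// Pass ecb (or ecb.AsParallelWriter() for parallel jobs) into your job.
// The command buffer system plays the buffer back and disposes it at the end of
// the simulation group, so there is no JobHandle.CompleteAll and no manual Playback.
state.Dependency = new ProcessBuffersJob { /* Ecb = ecb.AsParallelWriter() */ }
    .ScheduleParallel(state.Dependency);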


Thank you for your helpful reply. Can you explain a bit about the system Dependency and IJobEntity?

I know about the system Dependency, but I don't understand how it works.
I think that in one frame, all systems will run and complete within that frame. Even when I call MyJob.Schedule(), MyJob will run and complete in that frame.

When I use IJobEntity, I see that it only runs on one thread.

Capture10

If I have 3 CubeTest components, it runs like this:
1,2,3,4… 1,2,3,4… 1,2,3,4
It seems each job has to wait for the previous one to complete before the next one runs.

Unity puts your entities in chunks. Each chunk can hold up to 16 KB of data, or up to 128 entities, whichever is lower.

IJobEntity internally uses IJobChunk, and IJobChunk’s parallelism is over chunks, not entities. This means that if all of your entities fit in a single chunk, all entities will be processed by a single thread. This isn’t a huge problem in practice - 99% of jobs are very fast, and thanks to efficient cache usage processing a whole chunk of entities can be almost as fast as processing a single entity individually!

When your game gets larger and there are more (or bigger) entities, the workload should be spread across all cores automatically. If you really need to, you can force your entities to live in different chunks, e.g. using shared components.
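If you ever do need that, a shared component sketch could look like the following; ChunkGroup is a made-up name for illustration, and entities with different shared values are stored in separate chunks:

using Unity.Entities;

// Hypothetical shared component used only to split entities across chunks.
public struct ChunkGroup : ISharedComponentData
{
    public int Value;
}

// Example usage: entities with different Value end up in different chunks,
// so chunk-based jobs can process them on different worker threads.
// entityManager.AddSharedComponent(entity, new ChunkGroup { Value = index % 4 });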

Also, I recommend using the profiler instead of logs for observing job behavior; it's much more convenient, and you can do it at any time without modifying your code. It looks like this:

This is true, but for this to work correctly you need to assign your jobs to the Dependency, and include the Dependency when scheduling your jobs, so that all of the jobs scheduled during the frame form a long chain. The game calls Complete() on the dependency at the end of the frame to ensure that all of the jobs in the chain have finished running.
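As a sketch of what that chain looks like inside a system (FirstJob and SecondJob are just placeholder names):

using Unity.Burst;
using Unity.Entities;
using Unity.Jobs;

[BurstCompile]
struct FirstJob : IJob { public void Execute() { /* work */ } }

[BurstCompile]
struct SecondJob : IJob { public void Execute() { /* work */ } }

public partial struct ChainedJobsSystem : ISystem
{
    public void OnUpdate(ref SystemState state)
    {
        // Each Schedule takes the current Dependency as input and the returned
        // JobHandle is assigned back, so all jobs scheduled this frame form one chain.
        state.Dependency = new FirstJob().Schedule(state.Dependency);
        state.Dependency = new SecondJob().Schedule(state.Dependency); // runs after FirstJob
    }
}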


Thank you for your reply. I changed to using IJobParallelFor and it works perfectly. I also changed to using
EndSimulationEntityCommandBufferSystem and removed JobHandle.CompleteAll().


Capture12

But can you explain a bit about innerloopBatchCount in IJobParallelFor.Schedule?

The documentation says:

Batch size should generally be chosen depending on the amount of work performed in the job. A simple job, for example adding a couple of Vector3 to each other should probably have a batch size of 32 to 128. However if the work performed is very expensive then it is best to use a small batch size, for expensive work a batch size of 1 is totally fine.

Can I always set innerloopBatchCount = 1 so that it always runs on all threads?
Or something like innerloopBatchCount = entities.Length / TotalThread (but I don't know how to get that value)?

As far as I know, the innerloopBatchCount parameter is something you should tweak once you have a good idea about how heavy the workload is, and how many indices (entities) it runs on.

A small innerloopBatchCount means each worker “steals” a small number of iterations to execute. This is OK for jobs that do a lot of work (e.g. lots of math), but stops being efficient when your job is lightweight.

This is because scheduling and executing jobs has an overhead. It doesn’t matter much for most jobs, but when a job does barely any computation, as in the example of “adding a couple of Vector3”, the job system might be wasting a lot of time compared to the amount of actual work it’s doing. In that case, a bigger innerloopBatchCount will let the worker threads execute more job iterations before they need to talk to the job system to “steal” the next scheduled workload.

I usually just use a value of 8 for heavy jobs, 64 for lightweight jobs, put in a // todo and worry about it later. I don’t think you can predict the optimal setting at the beginning of the production, before your game has a representative number of entities, and without measuring/profiling the changes. So don’t worry about it too much! It doesn’t make such a huge performance difference, and you can change it later without breaking anything.

You can use this:

Unity - Scripting API: Unity.Jobs.LowLevel.Unsafe.JobsUtility.JobWorkerCount
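So the entities.Length / TotalThread idea could look roughly like this (a sketch; entities and MyParallelJob are placeholders for the array and job in your screenshot):

using Unity.Jobs.LowLevel.Unsafe; // JobsUtility
using Unity.Mathematics;          // math.max

// Split the work into roughly one batch per worker thread, clamped to at least 1.
int workers = math.max(1, JobsUtility.JobWorkerCount);
int batchCount = math.max(1, entities.Length / workers);
state.Dependency = new MyParallelJob { /* ... */ }.Schedule(entities.Length, batchCount, state.Dependency);

That said, as mentioned above, a fixed value that you revisit after profiling is usually good enough.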
