Some questions about C# jobs, ECS, Burst, compute shaders

Hey everyone!

Documentation is rather sparse on these topics, greatly lacking colorful real-world examples (especially for the topics on their own and for C# jobs combined with ECS), and some forum entries are rather vague and show considerable uncertainty.

So I’ve got some questions about C# jobs, ECS, Burst and compute shaders. Maybe some of you know answers to some of them, or know some talks that actually show useful ways of integrating things like that. I am using Unity 2018.3 LTS. I am new to intentional multi-threaded programming and still trying to understand its usefulness and implications.

If I understand correctly, there is a manual way of implementing multithreading for CPUs. Are there any advantages of the basic, manual way over C# jobs, or any restrictions when using C# jobs?

When using C# jobs for calculations on different cores, do those cores behave just like the core that uses the main thread?
For example, can I use a number of jobs running per frame for different usages on a number of cores?

Can I use a singular C# job to completely move certain ongoing calculations onto a different core (any core that currently isn’t busy)? For example, combat system calculations?

Can a C# job be a coroutine?

Does the ECS encourage not using GameObjects? I am greatly dependent on the GameObject workflow and I can’t imagine a useful use case for not using a GameObject, or for using just raw data without any scene or component references at all.
Or are entities just a more efficient way to interpret and use GameObject data via script?

If I understand correctly, Burst is only useful when making use of the ECS, right?

How do compute shaders compare to C# jobs in terms of use cases and being able to implement those features easily? I understand compute shaders use a GPU, which likely uses quite a number of cores making it useful for graphics calculations.

When writing a custom AI system using a number of raycasts, would C# jobs make sense? Would compute shaders even work?

I have read about C# jobs taking quite some time to implement. Are C# jobs worth the development time, or should I use native C++ (.cpp) for math-heavy calculations?

I’ve also got two examples:

E1 (not tested whether it compiles, but it’s very similar to what I am working on):

```csharp
using UnityEngine;

public class Statics {

    // Returns the transform in 'references' that is closest to 'origin' (squared distances avoid the sqrt).
    public static Transform Transform_GetClosest (Transform origin, Transform[] references) {
        int referencesLength = references.Length;
        Transform closest = references[0];
        float closestDistance = (references[0].position - origin.position).sqrMagnitude;
        for (int i = 1; i < referencesLength; i++) {
            float currentDistance = (references[i].position - origin.position).sqrMagnitude;
            if (currentDistance < closestDistance) {
                closest = references[i];
                closestDistance = currentDistance;
            }
        }
        return closest;
    }
}
```
The for loop of that could run on different cores simultaneously, right?
Is there a way, and would it make sense, to use a compute shader for that, using float3s for the world positions of the transforms and getting back the index of the closest transform?
E2:
I am currently using color arrays, populated via .GetPixels from textures, and use those to calculate resulting textures, e.g. adding, subtracting or multiplying those colors. Right now, I am doing that on the CPU, which is very slow. I am not intending to change that unless it’s very easy to implement a more performant solution, but it would be interesting to know whether, in such a case, the usage of a compute shader would make sense, for example.

EDIT:
How about platform support? Are those features supported on consoles? (Nintendo Switch? PS4?)

Best wishes,
Shu

I suggest watching this video. It will answer a number of your questions, some of which are not trivial.

Unity at GDC - Job System & Entity Component System

Then
Unite Austin 2017 - Writing High Performance C# Scripts

The talks may be lengthy, but they stick in your mind.


@Antypodish
Thanks! I will absolutely watch those!

I think the second video may be too technical, plus it’s nearly 2 hours long. It can also be quite dated (2017); a lot has changed since then.

This may be better instead. It’s still technical, but without going deep into programming; it rather discusses the concepts used in the Spellsouls Universe demo.
It also mentions raycasts.
Unite Austin 2017 - Massive Battle in the Spellsouls Universe

Also, come visit the
Data Oriented Technology Stack forum section.
It’s a better place to ask Jobs / DOTS (ECS) related questions than the scripting forum.
Check out some pinned threads.


I don’t have all the answers, but I can answer some things, hopefully.

Probably the main advantage is the ability to use Burst to compile the jobs. It’s possible you could use Burst on other code, I’m not sure, but it’s very simple to do with a job. (Just add [BurstCompile] to the job struct.) Burst can really speed things up, so I personally wouldn’t want to miss out on Burst functionality on my background processes, if possible.
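Just to illustrate, a Burst-compiled job looks roughly like this (a rough, untested sketch; the job and field names are made up):

```csharp
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;

// Hypothetical example: a tiny job that squares every value in an array.
[BurstCompile]
public struct SquareJob : IJob
{
    public NativeArray<float> Values;

    public void Execute()
    {
        for (int i = 0; i < Values.Length; i++)
            Values[i] *= Values[i];
    }
}
```

You schedule it with `new SquareJob { Values = myArray }.Schedule()` and call `Complete()` on the returned JobHandle before reading the results. With the manual threading APIs you’d have to manage all of that yourself, and you wouldn’t get Burst.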

You can have several jobs running simultaneously. Unity will figure out how to distribute the workload across all available cores.

Yup. I use jobs to perform expensive calculations to determine whether certain objects can/should interact with other objects. It’s math-heavy, and spreading it out in a job over all cores really helps performance. Note, however, that there is some overhead to moving data into/out of the job. So, performance gains will depend somewhat on how complex it is to set up and then evaluate the results of the job.
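As a rough illustration (untested; the names are made up) of how something like the loop in your E1 could be spread across cores: you copy the positions into a NativeArray on the main thread (Transforms themselves can’t be used inside a job), let an IJobParallelFor compute the squared distances, and then pick the minimum after the job completes:

```csharp
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;
using Unity.Mathematics;

// Hypothetical example: each index computes one squared distance; batches run on worker threads.
[BurstCompile]
public struct SqrDistanceJob : IJobParallelFor
{
    [ReadOnly] public NativeArray<float3> Positions;
    public float3 Origin;
    [WriteOnly] public NativeArray<float> SqrDistances;

    public void Execute(int index)
    {
        SqrDistances[index] = math.lengthsq(Positions[index] - Origin);
    }
}
```

Scheduling looks like `job.Schedule(positions.Length, 64)`, where 64 is the batch size per worker thread. The copy into the NativeArray and the final minimum search are exactly the kind of setup/evaluation overhead I mentioned.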

I don’t think so. And in most cases, job code should be optimized to not have access to any other data than the data you explicitly feed to it. Much of the performance gains of jobs (especially if you enable Burst) is based on the structure of the data being highly predictable. That’s where you start dealing with all the “native” collections, rather than accessing objects by reference.

You should just watch the video on this. It’s quite a different approach, and one that you’ll probably appreciate more after watching a simple demo of how to do things using ECS.

You should take a look at RaycastCommand (https://docs.unity3d.com/ScriptReference/RaycastCommand.html). This allows you to batch up a bunch of raycasts to be executed basically as a job. Note, however, that setting up the RaycastCommand can potentially be expensive. In my game, I actually use one job to set up the RaycastCommand, and another job to evaluate the RaycastCommand results.
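The basic pattern looks roughly like this (untested sketch; the directions, distances and counts are just placeholders):

```csharp
using Unity.Collections;
using Unity.Jobs;
using UnityEngine;

public class RaycastBatchExample : MonoBehaviour
{
    void Update()
    {
        const int count = 64; // however many rays you need this frame
        var commands = new NativeArray<RaycastCommand>(count, Allocator.TempJob);
        var results = new NativeArray<RaycastHit>(count, Allocator.TempJob);

        // Fill the command buffer (this part can itself be done inside a job).
        for (int i = 0; i < count; i++)
            commands[i] = new RaycastCommand(transform.position, Vector3.forward, 100f);

        // Run the raycasts on worker threads; 32 = minimum commands per job batch.
        JobHandle handle = RaycastCommand.ScheduleBatch(commands, results, 32);
        handle.Complete();

        for (int i = 0; i < count; i++)
        {
            if (results[i].collider != null)
            {
                // hit something
            }
        }

        commands.Dispose();
        results.Dispose();
    }
}
```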


@Antypodish

Again, thanks for the suggestions! I will take a look!

@dgoyette

Thanks for those answers! That’s quite helpful!

I think I have a better understanding of those features now.
I have also watched a few talks and may watch a few more, but from what I have seen, most things seem more complicated at first, and especially more time-consuming to deal with regularly, than the traditional workflow, which could make other optimizations more efficient by comparison. One of the most crucial issues is that entities are missing UI. I’m also not a big fan of having to deal with external packages, and Burst seems to be one, even seemingly requiring additional software installations.
For me, the most interesting of those features is the C# job system and I am probably going to move some calculations which are done regularly onto another core or other cores.
There were suggestions to use ECS for everything you can, but I don’t think that’s a good idea.
You may be able to save 0.1ms somewhere, but that doesn’t help when the game never gets finished due to an increasingly involved workflow. So I’m just going to use it for things that actually seem to make sense.
Of course, most demos use an absurd number of entities to showcase the admittedly impressive performance boost, but for most games, I think those scenarios are very isolated use cases that don’t have a lot to do with a game overall, but maybe with a specific subset of what you need. And for that, those features seem great.

If you make a simple 2D platformer, ECS / DOTS may be overkill. Classical OOP may be more than enough. If you need some extra performance, you can consider the Job System before even thinking about DOTS.

However, if you start making a new game, you could work with a data-oriented pattern rather than a purely object-oriented one. That would potentially allow a shift to DOTS later.

Regarding the extra writing that DOTS requires: what you gain is a massive performance boost and a modular design. What’s more, you gain multithreading capabilities almost for free.

In terms of packages, Unity is moving its APIs into packages, which are installable via the Package Manager. So you will end up installing some packages anyway, regardless of whether you use 3D, 2D, Jobs or DOTS. The idea is that you install only what’s necessary.

Once you get the basic concepts of DOTS, you can easily use the current UI APIs by using a hybrid approach.


@Antypodish

Thanks for the additional details!
I have seen an example of the hybrid approach as well, which seemed more generally useful to me, especially considering the current UI. So yes, that could make sense for some parts of a game as well.
I am working on a 3D RPG and I don’t require much parallelized data. The most useful cases for using C# jobs and ECS would be

  • VFX, via a built-in particle system that natively uses those features. But I guess, so far, that’s not the case, because then I could make use of a more performance-friendly particle system without the headache and time investment of programming one myself. Right now, there only seems to be support for adjusting a ParticleSystem, but it doesn’t seem to run on ECS and C# jobs itself.
  • moving raycasts onto another core
  • moving ongoing calculations (combat, statuses, etc.) onto another core
  • AI

Concerning packages, I don’t quite like that, because you needlessly have to keep things in mind, but if this is Unity’s general way of managing the engine now, so be it. But even then, if I understand correctly, Burst can be implemented at the last minute, so I will not bother with Burst right now.

Yes, particles can be one thing. Managing a server and all its instances with NPCs and players can be another.
Or some fancy character abilities.

While technically yes, you need to make sure you design with Burst in mind. Certain things are not burstable. If you design without that in mind, you may create extra work for yourself when it comes to making it burstable.
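A rough illustration of the difference (untested, names made up): Burst only handles code that sticks to blittable value types and native containers, so anything that creates or touches managed objects will not compile under Burst.

```csharp
using System.Collections.Generic;
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;

// Fine for Burst: only blittable value types and a native container.
[BurstCompile]
public struct BurstFriendlyJob : IJob
{
    public NativeArray<float> Data;
    public float Scale;

    public void Execute()
    {
        for (int i = 0; i < Data.Length; i++)
            Data[i] *= Scale;
    }
}

// Not burstable: creating a managed object inside Execute.
// Adding [BurstCompile] here would fail to compile.
public struct ManagedAllocationJob : IJob
{
    public void Execute()
    {
        var temp = new List<int>(); // managed allocation: not supported by Burst
        temp.Add(1);
    }
}
```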


@Antypodish

Thanks! I can absolutely see why moving the combat system (including characters’ abilities, etc.) onto another core makes sense, but I guess even then, ECS wouldn’t make much of a difference, especially since I probably won’t be able to keep all cores busy all the time. So I probably won’t even need to optimize code that doesn’t run on the main thread.

  • Unless, of course, rendering-related things could be moved onto other cores, which I have read is technically possible, but I’m not really planning on going there unless specifically needed.
    If I understood correctly, it’s important to keep potentially useful C# jobs in mind, right? Any job could use the burst attribute? And if there is ECS, it will be even more efficient because of the data layout?
    Maybe I should try one or two approaches using C# jobs and ECS so that I get a better understanding of how it will be useful and feasible to keep data in a way that has C# jobs and ECS potential.

DOTS does nice things for rendering massive amounts of entities.

If I understood correctly, it’s important to keep potentially useful C# jobs in mind, right? Sure thing, assuming you are familiar enough with C# and OOP already. Just having it in mind will allow potential optimizations which could otherwise easily be missed. Also, if you only decide to move to Jobs later, it may be too much, or at least a huge amount of work, if you never considered what it involves.

Certain things are not burstable, and Burst may not even be permitted in some cases. So knowing what can and cannot be Burst-compiled lets you keep your design within those constraints.

In short, yes. Plus whatever magic ECS is doing behind the scenes.

Having said all that…

You indeed may not need such optimization at this stage. It is quite a journey, and it adds a lot of development time while learning. Do it the classical OOP way, and when you approach potential limitations after having already optimized your code, Jobs followed by ECS could be considered.

I suggest giving Jobs a go, to get the concept. At least you will be aware of what they are capable of.


Thanks for the reply!

Interesting. I wasn’t aware there were limitations on the Burst attribute.

Personally, I think the game I am developing will make most use of culling (frustum, occlusion) and manual enabling and disabling of GameObjects, maybe some LODs (if time permits) as well as instantiating and destroying from Resources.

I don’t really need to render a huge number of entities (unless particles could be entities out of the box, in which case I might have a higher particle count budget to work with), but I could increase the polycount if there were a way to move some of the more directly rendering-related work. But I guess most of that is already done on the GPU anyway. I also don’t know whether the development time would even make sense, because GPUs are designed to do that, so they are more efficient at it than CPU cores. Maybe doing some post-processing on another CPU core rather than the GPU, but I’m not sure if that makes sense.
I have seen some impressive uses of GPU particles, but if that already maxes out the GPU, the rest can’t be drawn.
So I guess a lot of decisions can only be made by continuously profiling during development and seeing whether the game tends to be CPU-, GPU- or RAM-bound.

That said, C# jobs and ECS sure are interesting, and I will take another look and see where I can implement a scenario for performance testing.