Unity assembly definition files slower, not faster - how to compile faster?

I get impatient having to wait 5-7 seconds after each code change before I can start play mode, or see an update if play mode is already running. So I took a look at assembly definition files, but I’m not sure if I’m doing something wrong.

At first I tried the example project, and the compile time changed from 7 to 3 seconds after adding the asmdefs (I only tested the interdependentScripts scripts and the assembly definition feature).

Before I tried this on my own project, its compile time was 5.5 seconds. When I added assembly definition files, let it compile (20 s the first time), and then changed one of the files, I got about 7 seconds of compile time. So it’s slower with the assembly definition files.

With a super simple project with only one script + the CompileTime script, the compile time is 3.5 seconds. Is that the best I can get on my computer, or is there anything else that can speed it up?

I have a Macbook Pro 15" Late 2016 (2,7 GHz Intel Core i7). What are the shortest possible compile times on a better computer? Is there any point in upgrading to a better computer, or is this what to expect from Unity?

Here’s my folder structure. I have a few scripts, and some 3rd party Unity Sample UI which can be deleted later. Is there anything I could change to make compilation faster?

Note that I first tested putting an asmdef file (I forgot the asm- prefix I used elsewhere) in the 3rd-Party folder, but I wasn’t sure if it was picking up subfolders or not…


I made a “Standard Assets” folder, moved Editor, GUI and Utils into it, and managed to get the time down to 4.6 seconds. So at least I save a second of my life each time… :slight_smile:

A few things that are worth mentioning:

  • If you change a script in an asmdef which is referenced by all (or most) other assemblies, compilation will also recompile all the referencing assemblies, i.e. you’ll get a (near) full rebuild.
  • If you change a script in an asmdef not referenced elsewhere, compilation should be faster than rebuilding the entire project with no asmdefs at all.
  • Always test by changing a single script file, not by rebuilding everything.
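For reference, the dependency direction that drives these rebuilds is declared in the asmdef file itself. A minimal sketch of the JSON format (the assembly names here are made up for illustration):

```json
{
    "name": "Game.GUI",
    "references": [
        "Game.Core"
    ],
    "includePlatforms": [],
    "excludePlatforms": []
}
```

Any change to a script inside Game.Core also triggers a recompile of Game.GUI and of every other assembly that lists Game.Core in its references, which is why editing a widely referenced “core” assembly approaches a full rebuild.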

I guess an initial build (full rebuild) would be slower with many asmdefs, but as you work on individual features, compile times should decrease.

I did a quick test in our most complex project with 422 .cs files under Scripts (more under Plugins). It takes 11 seconds to compile a script in our “Scripts” assembly. We just yesterday added asmdefs and made it work, so “Scripts” is not further subdivided at the moment.

I added a new folder with a single MonoBehaviour script and an asmdef. When I change this script, compilation takes 8 seconds or a little less.

I count the time from when the rotating wheel in the lower-right corner appears until it disappears. For this particular project, the wheel stops animating well before it actually goes away. Something about our project, perhaps some post-compile steps, makes this last part with the wheel “stuck” last about as long as the animation itself.

If the animation of that spinning wheel is an indicator of the script compilation phase then I get the following results:

  • change of “Scripts” assembly: 7 seconds (wheel animation)
  • change in “asmdeftest” assembly: 3.5 seconds (wheel animation)

So using asmdefs helps compile times. But personally I’m more interested in how asmdefs help split up code, i.e. they effectively prevent too many undesirable direct references in code. The references also become visible by looking at the asmdef… you may notice one day that the Input module has a reference to the Network module, and then you question the necessity of that reference and investigate… whereas previously you would have had to find that reference somewhere in one of the input scripts - which happened randomly at best.

PS: I’m using Windows 10.


Thanks for the feedback; I didn’t see that animated wheel though. I’m using a Mac, will look for it next time… :slight_smile:

I had a similar experience, where it’s really doubtful whether it’s any faster. I have a 2 s compile time in a new project, and in my big project it takes around 8 s.

I created the assembly definition files and then created a new folder with a single script to see just how long it would take to compile; it took around 8 s. Maybe I did something wrong, or assembly definitions are really not that great.

The problem is how this feature was promoted. When you change a script, a bunch of checks are done, and the actual code compilation is only a small part of those. So asmdefs do speed up the compilation itself, but compilation just isn’t what takes most of the time.

FYI, it works better on small projects; the larger a project gets, the smaller the percentage of time spent in actual compilation. At least that is what I have observed.

I guess the question is this then:

How do you get faster times after saving scripts when developing a game with lots of scripts?
I have about 1,000 scripts right now, but by the end of the game I’m working on I might have 5,000.

What do other game devs do to work fast on big projects?


On a decent-sized 3D project I have an average 20-second wait for every script change. I don’t think the number of scripts or amount of code matters at all, because I can see how long it takes MCS to compile.

What appears to matter most is the number of assets you have with script references. Unity appears to spend most of its time validating those and possibly updating some internal data.

So the only thing you really can do is move a lot of common logic outside Unity, which I do anyway for other reasons like unit testing. And just adjust your coding flow to save less frequently.

Alright. It just seems a waste, then, that when I create a new script in a new assembly that isn’t linked to any prefabs/assets/scripts, it still takes a long time to process.

Speed is not the only benefit of Assembly Definitions. The possibly even bigger benefit is enforced modularity.

So for instance, in the past we had this problem a lot, mainly with “Unity Developers” as opposed to “Experienced Programmers” but even the latter are prone to making terrible shortcuts.

Imagine you have these subsystems in your project:

  • Game / Business Logic with Data
  • GUI
  • Network
  • Animations
  • VR Controllers
  • many, many more … some of which you can’t even tell whether they are GUI, Logic, Animation, Network … probably a little of everything. And they’re called VariableManager, SystemManager, ProviderManager, ControllerManager, AdministrationManager, BehaviourManager, …

Soooo… you need the GUI to react to a controller input? Easy, just make the GUI subsystem a singleton and send it an update directly from the OVRPlayerController (never mind hacking around in 3rd-party code either). Then the GUI decides that some values need to be updated over the network, so it just calls Network.SendMessage. Right. It forgets about updating the Logic module, though. The game goes out of sync.

These things are terribly, terribly easy to set up with Unity and I’m very happy that finally, everyone is forced to make an informed decision - is it really a good idea if the GUI needs the Network module? Or does it smell like a design flaw?

While we cannot enforce linkage, we can easily check what links to what. I’ve written an editor tool that makes a diagram (using graphviz) out of the asmdef references - from time to time I check if there’s something fishy about the architecture of the project. Are there any links that should not be?
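The editor tool mentioned above isn’t shown, but the idea is simple enough to sketch outside Unity. Here is a minimal, hypothetical Python script that scans a project for *.asmdef files and prints a Graphviz DOT graph of their references (the `name` and `references` keys are the real asmdef JSON fields; everything else, including the `Assets` root, is illustrative):

```python
import glob
import json
import os

def asmdef_graph(project_root):
    """Collect (assembly, referenced_assembly) edges from all *.asmdef files."""
    edges = []
    pattern = os.path.join(project_root, "**", "*.asmdef")
    for path in glob.glob(pattern, recursive=True):
        with open(path) as f:
            data = json.load(f)
        name = data.get("name", os.path.basename(path))
        for ref in data.get("references", []):
            edges.append((name, ref))
    return edges

def to_dot(edges):
    """Render the edge list as a Graphviz digraph; pipe the output to `dot -Tpng`."""
    lines = ["digraph asmdefs {"]
    for src, dst in sorted(edges):
        lines.append('    "%s" -> "%s";' % (src, dst))
    lines.append("}")
    return "\n".join(lines)

if __name__ == "__main__":
    print(to_dot(asmdef_graph("Assets")))
```

Rendering the output with `dot -Tpng` makes a fishy edge, say GUI pointing at Network, immediately visible.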

Likewise, one can no longer have circular references. These are especially difficult to refactor once they’ve grown over a long time. For instance, in an older project we have Logic and GUI tightly coupled, but on top of that also the Input module, Postprocessing, Serialization, Animations, VR/Leap stuff and so on. After refactoring the whole project to Assembly Definitions, this “big bunch of files” remains, even though Logic and GUI really ought to be decoupled. This is a constant source of bugs: either the GUI not being updated properly on Logic/Data changes, or vice versa the GUI resetting values in the Logic module, or worst of all a stack overflow because Logic and GUI bounce messages off each other, which is hard to track because it goes through several layers of setters and modifiers and limiters and validity checkers.

This would not have happened (at least not to this degree where refactoring this would mean redoing most of the app) if we had Assembly Definitions to begin with.

Therefore, I decree:
Any decently complex project requires Assembly Definitions to pack code into modules, not for speed but to enforce the app’s architecture!
(also helps to keep developers sane and friendly)


Thanks for that insight. However, is there any way to make Unity do the checks etc. faster, so I can start play mode sooner and not wait 8 seconds each time?

I thought of getting a better computer, but I have a MacBook Pro Late 2016, which should be OK. It would be kinda silly to buy an iMac Pro for $15,000 and still get 7 seconds of “compile time”. :slight_smile:

I have the same problem.

Compilation became even slower. :(

So far, a careful separation of the scripts in our project into different assemblies showed an increase in compilation time by a factor of 2! I don’t see the speed gain; if anything, it makes compilation much slower in our case.

It would be great to know what exactly gets recompiled, to track down the reason for the slowdown :frowning:

We are also very frustrated with the ‘overall’ compilation time during the coding/debugging phase, where a parent lib must be changed frequently. Though we really love the ability to compile into separate DLLs within Unity.

Here’re some observations:

  1. It appears that (at least on my machine) each asmdef compilation takes a minimum of 2 seconds.
  2. The total duration depends on the number of asmdef compilations per change.

For example, if we change a ‘leaf’ asmdef’s source files, it compiles in 2 seconds. But if we change a parent asmdef (and the hierarchy requires 8 asmdef recompilations), it takes an average of 2×8 = 16 seconds.

Whereas if we compile everything (without asmdefs), it takes an average of 4 seconds.

Asmdef compilation just has too much overhead.
It really makes you appreciate how Visual Studio manages to compile so quickly.
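The 2-seconds-per-asmdef observation above amounts to a simple fixed-overhead cost model. A sketch with the numbers from this post plugged in (the constants are just this poster’s measurements, not anything Unity documents):

```python
def rebuild_time(affected_assemblies, per_assembly_overhead=2.0):
    """Estimated recompile time (seconds) when every affected assembly pays a fixed overhead."""
    return affected_assemblies * per_assembly_overhead

# Changing a leaf asmdef affects 1 assembly; changing the parent here affects 8.
print(rebuild_time(1))  # 2.0
print(rebuild_time(8))  # 16.0 - worse than the ~4 s monolithic rebuild
```

This is why the split pays off only when most edits land in leaf assemblies: the per-assembly overhead multiplies with every assembly the change touches.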


I’d like to add that asmdefs sometimes totally freeze VS when it has to reload the projects because of a code change. When you have many asmdefs, it’s faster to kill VS and restart it…

Hi,
I had a similar experience, in fact. I changed a lot of scripts, adapting and sorting them to be suitable for Assembly Definitions, with the aim of speeding up compilation. This resulted in a bunch of *.asmdef files (approx. 30) for roughly 300 of my scripts. On top of that, there are 3rd-party scripts, which I will revise at some point to see whether they can be put into assembly definitions, if they are not already. I will possibly go for further modularization as I come back to some scripts that need future revision.

However, I indeed haven’t noticed any compilation time improvement for the size of my project. Every little code change to some referenced scripts may take up to 22 seconds. I was initially upset about the time spent on the *.asmdef files. And that is even on an SSD drive, with a 3.4 GHz CPU. Also, during compilation VS freezes for a few seconds, but it doesn’t get stuck for me (Win7).

Yet, despite no (at least noticeable) compilation time gain, I am happy about reorganizing the scripts into a more modular state and removing many cross-references. I was already very conscious about avoiding cross-references and keeping the code modular, but I still discovered multiple cases where I had to rewrite bits of code to make them suitable for Assembly Definitions.

At the end of the day, every new asset I create from a prototype, I put into an Assembly Definition before importing it into the main project. This keeps the code much tidier. Also, as mentioned, it becomes apparent which script is referenced just by looking at the *.asmdef file.

One thing I am curious about, but not attempting to try now, is whether limiting the number of assembly build platforms to a minimum would help at all. I’m not doing it now because every change to an *.asmdef takes some time to recompile.

Other than that, all good, since I wanted the project to be as modular as possible.
But there was no significant time gain on compilation, which was the main goal.

Yes, this is about what I discovered. I have only about 40 script files, and it takes 9-10 seconds to compile each time. It’s not a huge amount of time, but anything above that is annoying.

I read something about a Burst compiler - does that have anything to do with this? It would be awesome if compilation were faster, and also if Unity didn’t hang/jerk each time code was changed…

There’s one way to compile faster: put your assets in a folder called Standard Assets (do this only for scripts from the Asset Store, or scripts you know won’t change).
Secondly, if you have Project Settings > Editor > Sprite Packer set to “Enabled in Editor”, disable that and you’ll start the game much faster.

This helped me, maybe it’ll help you.

I noticed the same effect. Compile times got worse after using assembly definitions.

  • 78 script files with 16,200 LOC (`find Assets/Scripts -name '*.cs' -exec cat {} \; | wc -l`)

  • Processor: i7-7700K, 4×4.20 GHz

  • All plugins have been moved to Plugins/

  • To measure the compile time I used the CompileTime.cs script

no assemblies

 4.5s changing unrelated Editor script (CompileTime.cs)
10.5s changing script define symbols in build settings
 5.5s changing loose dependent script (MainLogic.cs)
 5.5s changing heavy dependent script (Utils.cs)

assemblies

 4.5s changing unrelated Editor script (CompileTime.cs)
14.8s changing script define symbols in build settings
 8.3s changing loose dependent script (MainLogic.cs)
11.2s changing heavy dependent script (Utils.cs)

All the work restructuring and adapting plugins was good for nothing!


I reduced my recompile time from ~11 seconds to ~6 seconds by introducing assembly definition files, for changes to my core game scripts.

  • Main.dll (my core game stuff, around 40 scripts and all my game specific assets)
  • ThirdParty.dll (all external code and any code I very rarely change myself, tons of scripts and assets but I have no number on it)

Note that there is no Assembly-CSharp.dll anymore; as recommended by the docs, you should make sure all files are covered by assembly definition files to get the full benefit.

Note also that the actual compile time is now very fast, something like 0.5-1 seconds. According to the logs, the rest of the time is the editor reloading things, and that time grows large when you have a lot of assets - not sure if it’s only the scripts part of it, or other assets like materials, meshes, scenes etc.
