Reducing script compile time or a better workflow to reduce excessive recompiling

SOLUTIONS: Each of these solutions reduces compile time. Use one, or combine them all for the biggest speed increase.

#1 - Convert project to C# (extremely effective)
I was using UnityScript. Convert your project to C# – it compiles 4X FASTER than UnityScript! Compile times went from 15s to 3.5s. See this post for some numbers.

#2 - Convert some/all scripts to DLLs (very effective but complex)
NOTE: This solution involves a complicated setup using externally compiled DLLs. While it works, it’s probably far too complicated for most users. Compile times reduced from 15s to 2-3 seconds or less. See my answer here.

#3 - Convert ALL remaining .js files to C# (somewhat effective)
I was able to reduce compile times an additional 3.25 seconds by converting ALL lingering .js files in my project to C#, including Standard Assets. See this post for details.

Overall, using all 3 methods above, my script compile times are now down from 15 seconds every time I update a script to just 2 seconds. That’s a 7.5X performance increase! See my happy face over there? :slight_smile:


Original Post:

A common problem is that as a project grows in size, so does the time it takes to compile the scripts. When one change is made to any script, all must be recompiled. The result is a maddening amount of waiting during development and testing. My current project has 3.78MB in 164 script files (not including the default stuff in Standard Assets and Pro Standard Assets). It takes 15 seconds to compile scripts every time I make a tiny change, which can translate into hours of wasted time per day waiting. I’m currently working on a Core i7 3.6 + 12GB mem. BTW, I’m coding in UnityScript.

Questions:

  1. Are there any ways to reduce compile times apart from getting a new CPU? I doubt getting an SSD would help here. I already know about this: Overview: Script compilation (Advanced)

  2. I’m curious about other people’s compile times compared to number of scripts / megs of scripts, specifically with regards to C#. I’m using UnityScript, and I’m wondering if it would be significantly faster compiling in C#. I don’t think there’s any way I could possibly convert my project to C# at this point, but maybe it would help in future projects.

  3. I feel like I’m missing something with regards to a smart workflow. I did some work on C# in Visual Studio recently (non-Unity stuff) and found it to be very quick and efficient because of real-time syntax error underlining, real-time error warnings, break points and debugging, and excellent IntelliSense. Working in Unity is a completely opposite experience. I feel stuck in the mud. As far as I’ve found, there’s no syntax error underlining or other error warnings without doing a compile (read: waiting…), MD debugging integration is frustrating so I haven’t used it (it always tries to open a second Unity instance), and I can’t get MD to ever give me consistent results with code completion (it works maybe 5% of the time for me). Testing or bug fixing new code is always an endless back and forth between MD and the editor play button with a bunch of Debug.Log() calls thrown in as needed to find the problem, and of course there’s all the waiting to compile just to add or remove a log call each time. (I know about the inspector debug mode to see vars, but on scripts with tons of arrays or complex objects it always crashes my machine so I can never use it.) Adding and debugging one small feature can take a day or more sometimes because of this slow workflow. Maybe I’m doing things all wrong, I don’t know. I’m curious how some of you go about your routine when working.

  4. I’m currently getting about 600 warnings every time I compile. It’s all trivial stuff, but I’m wondering if this might be making it take a bunch of extra time to compile and whether it would be worth taking the time to clean up the warnings. Or maybe there’s a way to suppress warnings so it doesn’t have to warn me every time? (Edit: Actually 90% of these warnings are things like “System.Collections is never used” in a MonoBehaviour script. I can’t very well strip out the include from MonoBehaviour to make this warning go away.)

  5. In MD, there’s an option to Compile Assembly. I assume that means it’s just going to compile what’s in that particular assembly and not the whole project. Is there any way to split up your scripts into smaller assemblies so I could just re-compile the ones I’m working on at the moment? (I doubt it and I hope this question doesn’t sound too dumb. It’s just a shot in the dark. :stuck_out_tongue: )

Thanks!

  1. Disable warnings: see the thread “Disable warning messages” in Questions & Answers on Unity Discussions.
  1. I would prefer fixing the warnings over deactivating them. If you fix all warnings right when they first appear it’s no big deal and won’t waste a lot of time. The moment will come when you assign a variable to itself and a warning saves your ass. Just manually assign null to all values you only set in the editor to fix the “will always have its default value null” warnings.

  2. You can use DLLs in Unity. You could compile some parts of your code into a DLL, possibly some of your utility classes that are pretty much complete and tested as they are.
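For the “assign null” fix in point 1, a minimal C# sketch (the class and field names are made up for illustration):

```csharp
using UnityEngine;

public class Example : MonoBehaviour {
    // The explicit "= null" counts as an assignment, so the compiler stops
    // warning that the field "will always have its default value null",
    // while the value is still free to be set in the editor.
    public GameObject target = null;
}
```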


I can’t speak for UnityScript, nor whether switching to C# would improve your build times.

At least with C#, I find it far more productive to build DLLs in Visual Studio and deploy them to the Unity project. I have some scripts to help with this which I might put on the asset store sometime - the scripts provide a UI inside Unity that lets you control some integration with the Visual Studio project, so it can make it automatically deploy the assemblies, add references to UnityEngine/UnityEditor in a manner that’s independent of the Unity install directory, etc. I also have a smart launcher which knows how to launch or focus the right Visual Studio process for the script being edited - by default Unity tries to use its own auto-generated solution file, which is rubbish. The launcher also allows a fall back to MonoDevelop if the file is not C# code (e.g. UnityScript, or shader code).

Visual Studio with Resharper is just insanely more productive than MonoDevelop. When I lost my last job, where we had this setup, I really wanted to give MonoDevelop a chance, but after about eight months of casual development I just couldn’t stand it any more. The worst bit is when its auto-completion goes wrong and just decides to delete random characters from your last few lines of code, making it uncompilable…

@ 3) Yes, when you want to debug your code, you must close your Unity Editor (not MonoDevelop) and then start debugging. For MonoDevelop to debug in play mode it must attach itself to the Unity Editor process, which is why it tries to start a new instance (and fails if your Editor is already running).

I’ve never had any trouble attaching MonoDevelop to a running Unity process, but I also don’t do it very often and certainly not as a matter of course, as it seriously slows down code execution.

Thanks for the replies everyone! :slight_smile:

Unfortunately, this only applies to C#. I did try it myself and it doesn’t compile. Also, I looked through MSDN’s list of warning and error codes and can’t find anything for the biggest offenders, the “namespace is never used” warnings. These warnings can potentially appear in every single .js file you make because the MonoBehaviour class has built-in imports of various things like System.Collections, UnityEngine, and UnityEditor. If you don’t use something from each namespace, you get a warning. UnityEditor is the silliest, because the vast majority of your scripts are probably not editor scripts, so you’d never use anything from the UnityEditor namespace and therefore get the warning in most of your files. I found a very ugly workaround, which I posted [here](http://forum.unity3d.com/threads/103287-How-can-i-do-Warning-message-Namespace-System.Collections-is-never-used-(BCW00?p=1014079&viewfull=1#post1014079), which involves putting a dummy function in every script to use each namespace. Yuck.

Hmm, I agree disabling them en masse is a bit heavy-handed. Unfortunately, most of the warnings come from MonoBehaviour and are not fixable without the ugly hack I posted above. I suppose it’s better than nothing. I’ll “fix” them and see if it helps compile times.

Now this is intriguing… I’m trying to imagine how this could be implemented to speed up compile time.
The majority of my scripts are MonoBehaviour scripts. Utility classes only account for a handful of files and are not the bottleneck, so making DLLs out of those wouldn’t help much. I imagine I couldn’t pre-compile my MonoBehaviour scripts as I’d have no way to assign them to objects at that point, correct? (Edit: I just found out this is not correct – you can drag/drop MonoBehaviours from a pre-compiled DLL as seen in this video.) Possibly some of the functions of each MonoBehaviour script could be broken out into separate non-MonoBehaviour classes, the data shared via objects, and those pre-compiled, although that would take a heck of a lot of reworking over all these scripts. Also, it doesn’t seem to me like there would be a way to compile UnityScript into a DLL… maybe MD can do it. This is worth more research.

Edit: Since you can drag and drop MonoBehaviours onto objects from within a pre-compiled DLL, I can potentially see replacing all my finished scripts with DLLs, which would make compiling super fast. Now I just have to figure out if it’s possible to compile a UnityScript file into an external DLL…

Wow, sounds like you’ve refined your workflow very well. Your C# tools wouldn’t help me with this project, but after all this is done I’m certain to be done with MonoDevelop. I’d really like to have a more efficient way of working like that.

I’m curious, what types of things do you deploy via DLL, and which ones can’t be, and must be compiled by Unity? About what percentage of your code is deployable via DLL?

I get more frustrated with MonoDevelop every day, but there’s no alternative for me. I’m surprised you can even use auto-completion. I’ve never had it delete random chars, but just about everything about it is broken for me. It’s REALLY slow (like 3-5 second delay when you start typing), it almost never suggests the things I want like local variables or class properties, it’s horribly inconsistent about what it does suggest to the point of being totally unusable. Again, we have that C# UScript gap here and most of the very experienced people on this board are using C#, so I just chalk it up to that.

I should give this another chance and see if it’s usable. I guess I just have to get into the habit of starting Unity via MD → Debug, so long as it doesn’t try to open another Unity instance each time I try to debug. My project is big, so starting Unity can take several minutes.


Almost everything works fine from precompiled DLLs these days. There are three rough edges I haven’t got a solution for - one, you need two debugging files next to the DLL if you want MonoDevelop to be able to debug it, which is slightly cluttering. Two, the scripts within the DLL appear as a flat list, as if they were files inside the DLL “directory” - the flat list is annoying, it’s a shame it’s not broken down by namespace. And three - when you double-click a script in the DLL, Unity doesn’t try to launch the right source file and line number as usual - it tries to open the DLL itself.

None of those are very important though. Double-clicking on error messages in the console window works fine, and that’s the most important bit.

Yes, it’s hard to guess how much difference that makes.

Agreed. Those downsides are not bad at all. If smart usage of this could save me 5-10 seconds off my compile time I’d be ecstatic.

There is one big question in my mind about implementation (assuming I am able to compile DLLs from MD in UScript to begin with, which I will try next). I really doubt there’s a way I could substitute the pre-compiled DLL version of a class for the already-assigned reference made by dragging and dropping the script in Unity. It seems to me you’d have to start your development using the pre-compiled technique. Pretty much, if it’s a MonoBehaviour-based script, I’m kind of out of luck, as those almost always have some serialized data.

Re: Compiling a DLL from MonoDevelop in UnityScript

Okay, it does work! You cannot choose to compile to library from the MonoDevelop GUI, but you can set it in the *.unityproj file:

Change:
<OutputType>Exe</OutputType>

To:
<OutputType>Library</OutputType>

Lots of other stuff can be set in the unityproj file. Just open Assembly-UnityScript-firstpass.unityproj created by Unity in the root of your Unity project for more settings you can use.
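For reference, OutputType is a standard MSBuild property; a minimal fragment of the project file looks something like this (hypothetical snippet, not copied from a real .unityproj):

```xml
<!-- Hypothetical .unityproj fragment (MSBuild format); only the
     OutputType value needs to change from Exe to Library. -->
<PropertyGroup>
  <OutputType>Library</OutputType>
</PropertyGroup>
```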

Excellent! (in Mr. Burns voice) Can’t wait to implement…

I read on here once that someone changed all the unnecessary public vars to private and it improved things…

Yes, it will lose track of the references. If you save assets in text format, you might be able to pull some tricks to programmatically fix up the broken references afterwards, but it won’t be fun - or, it will be the quirky kind of fun, not the mainstream kind.

I just want to update the thread with the rest of my findings on the topic of precompiled DLLs.

If you use UnityScript, it’s not possible to compile a DLL Unity can actually use. Even though it’s possible to compile a DLL, you cannot set a compile target and therefore will always get a .NET 4.0 DLL, which may cause errors in Unity. See my post here for details. (Note: The particular error I was having has a workaround, but I still don’t know whether using .NET 4.0 DLLs in Unity is a good idea for other reasons.)

Regardless, since my project is so far along and makes use of so many MonoBehaviours with serialized data, it’s not really possible to offload much into external DLLs without destroying all that work setting up prefabs and serialized data. This is the kind of thing that’s only really useful if designed for at the beginning of a project.

It’s not even feasible to offload things like managers into an external DLL at this point, because they make use of so many classes that are defined in the Scripts folder (classes with serialized data like MonoBehaviours). A compiled DLL cannot communicate both ways with classes defined after it, in the same way that scripts in the Standard Assets folder cannot communicate with scripts in the normal Scripts folder. To provide a link between these, it would be possible to make base classes for everything in the DLL that the scripts could inherit from, but again, this is a design-time issue: changing any of the classes with serialized data now to inherit from a different base class would destroy the serialized data.

Which brings me all the way back to square one. It appears there’s no real solution to making Unity compile significantly faster if you’re stuck using UnityScript.

(Edit: Well, I came up with a workaround that uses the same internal compiler Unity does and outputs .NET 2.0 library files without the annoying errors. See this post for details.)

You can even offload MonoBehaviours to a DLL (the DLL of course needs to reference UnityEngine.dll, found in C:\Program Files (x86)\Unity\Editor\Data\Managed), but you probably have to wrap them in a class inside your script folders, i.e.

// in your DLL
public class SomeClass : MonoBehaviour {
    // your code
}

// in your script folder:

public class SomeClass : YourDLLNameSpace.SomeClass {
    // can be empty, everything is already in the DLL's class
}

This would let you make the move without losing the references; you could maybe even create a script that generates these wrapper script files. But it’s harder to debug this way.


I knew extending the class would work as I mentioned above but I had no idea it would keep the serialized data references! I just tested it out and indeed it does. Thanks! So maybe there is some possibility for this if I can possibly find a way to force .Net 2.0 compilation. (Currently exploring all the stuff in C:\Program Files (x86)\Unity\Editor\Data\Mono\lib\mono\2.0 to see if I can find the way Unity does its internal compile pass).

Oh yeah, and namespaces don’t work in UnityScript as far as I can tell, so I have to rename the base class to SomeClass_Base or something like that, which isn’t that bad.

If the .mdb file works, at least you should get line numbers. But trying to compile a MonoBehaviour to DLL is giving me the good old IEnumerable error with pdb2mdb (a known error I’ve seen mentioned many times before).

Edit: My new workaround using Unity’s internal command line compiler instead of MonoDevelop actually outputs an MDB file for problematic classes like MonoBehaviour.

I’d like to report that two weeks later, my problem is solved! I’ve reduced my compile time from 15 seconds to 2-3 seconds. Thanks for your help everyone in getting this figured out.

In a nutshell, I moved every script I possibly could to external DLLs, though this was no easy task, especially because I’m using UnityScript. Moving to DLLs is pretty easy for basic utility classes, but most of the speedup comes from moving MonoBehaviours to DLLs which is much more difficult considering this was not planned for at the beginning of the project.

Because of how I structured things, I can keep the scripts I’m currently working on as scripts in Unity, and everything I’m mostly done with as DLLs. Once I’m done with them, I can shift them over into the DLL side. (It’s just a little bit easier to work on things that need frequent changes as scripts rather than DLLs, but it’s also workable to work on everything as DLLs, though I doubt Unity’s debugger would work from precompiled DLLs.)

I’ll outline my steps for anyone interested. Note that this requires Unity Pro unless you can figure out how to edit the binary scene and prefab files. Also, I assume you are comfortable writing your own small tool programs outside of Unity to help with certain small tasks like parsing text files.


STEP 1: *** BACK UP YOUR PROJECT! ***

///////////////////////////////////
STEP 2: PREPARING YOUR DLL SOLUTION
///////////////////////////////////

Create a new solution in MonoDevelop. Inside this solution, create one project for each category of your scripts. I organized mine into 30 projects by type, arranged just as they had been as scripts. (The more code you have in one category, the longer it will take to compile.) Copy all the scripts you want to convert to DLLs into their respective project folders.

Common libraries:
Add references to one or both UnityEngine.dll and UnityEditor.dll in all your projects that need them. (In Windows, find them in Unity\Editor\Data\Managed.) Also add System from the Packages tab.

Utilities:
I had to do a little re-organizing to make sure I had a sensible dependency chain set up. My utility classes, enums, and some hardcoded data tables would need to be referenced by most/all other projects, so I organized accordingly. Add references to these projects in all your other projects that need them.

MonoBehaviours:
I wanted to be able to keep my existing code entirely as is. Because much of the code in my MonoBehaviour classes relies on other MonoBehaviour-extended types (used in other scripts), there are a lot of cross dependencies. I didn’t want to wait forever for one gigantic project to compile every time I made a change (how Unity does it), so I chose to split up all MonoBehaviours into two classes each – one base class and one class extended from the base class. I then added references to the BaseClasses project in every other project that needed access to those classes (all the MonoBehaviour-containing projects). The base classes can be passed around and accessed by any MonoBehaviour that needs them.

About Base Classes:
Because of the size of my project, splitting up all the MonoBehaviours into base and extended classes required that I write a little tool to help me with the task. Essentially I went through all the script files and extracted all class properties/fields and all public methods, and generated a large file containing all the base classes. For properties, I copied the property declaration and assignment (if any) as is, so the fields and their default values live in the base class. For the methods I just created stubs (blank methods) out of them. (When one of these stub methods is called through the base class, the extended class’s override method will be called instead.) All the game code will be in the extended classes. The base classes just provide a base framework.

I used the original class name for my base classes because I didn’t want to change any references in code. After I got my base classes file made, I modified each script file:
  1. Comment out the properties, because these need to be set only in the base.
  2. Rename the class (and its .js/.cs file) with a suffix (“_Ext” in my case) to differentiate it from the base class. (You will never have to call these extended classes in code, just the base with the name you’re used to.)
  3. Make the class extend the new base class.
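In C# terms (my project is UnityScript, but the shape is the same), the split looks roughly like this; the class and member names here are invented for illustration:

```csharp
using UnityEngine;

// In BaseClasses.dll: serialized fields with their default assignments,
// plus blank virtual stubs for every public method.
public class Enemy : MonoBehaviour {
    public float speed = 2.0f;                     // field stays in the base
    public virtual void TakeDamage(int amount) { } // stub only
}

// In one of the per-category projects (later merged into the same DLL):
// the "_Ext" class carries all the real game code.
public class Enemy_Ext : Enemy {
    public override void TakeDamage(int amount) {
        // real implementation; calling TakeDamage through an Enemy
        // reference dispatches here via the virtual method
    }
}
```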

MonoDevelop configuration for C# users:
Make sure your projects are set to output to .NET 2.0 or 3.5 or you MIGHT get some errors when using the dlls in Unity. Also make sure you enable optimization. (VStudio: project properties → Build → optimize code, MonoDev: project properties → Compiler → Enable Optimizations). If you do not enable this, your code will run slower.

MonoDevelop configuration for UnityScript users:
Getting DLLs to compile from UnityScript is a bit of a challenge and carries a couple of issues I’ll get to later. To get MonoDevelop to output a Library instead of an Exe, you can’t use the GUI like in C#. Instead, open all your .unityproj files in your new solution and change the <OutputType> line from Exe to Library.

Compiling and Merging:
Unity has a big problem which makes it impossible to just import your compiled DLLs if you’re using the base class method outlined above. Unity will not recognize any subclasses in a DLL extended from a MonoBehaviour-based base class in a separate DLL. See this post for details.

In order to get around this problem, you must merge your BaseClasses.dll with all your other BaseClass-extended DLLs. This way all the bases and their subclasses exist within one DLL by the time Unity gets it and there is no problem. (Well, except that your dropdown list will contain ALL your classes, not in alphabetical order, which is kind of ugly if you need to assign them.)

To facilitate this, MonoDevelop projects should be set up in a particular way. Set all projects to output to solution/bin/merging so every DLL will be in one place when compiled. Also, right click on every reference in every project and uncheck “Local Copy” – this will prevent MD from copying all the dependencies into this output folder with every compile. Also, copy the common dependencies to this folder or a subfolder: UnityEngine.dll, UnityEditor.dll, and for UnityScript users, Boo.Lang.dll and UnityScript.Lang.dll (find them in Unity\Editor\Data\Mono\lib\mono\2.0 in Windows).

Microsoft ILMerge is the tool to use for merging the DLLs. It runs from the command line. It will also merge PDB files, so you will have line numbers reported from Unity for bug testing, etc. Create a batch file to merge the base class and any other MonoBehaviour-derived class DLLs together into one DLL. I call mine Core.dll. You don’t have to merge your utility classes, as they can just be copied as is. (I recommend forcing ILMerge to output a .NET 2.0 assembly with the switch /targetplatform:v2.)

It’s unfortunate you have to go through this extra step, but merging only takes 2-3 seconds even with the 160 or so MonoBehaviours I have in my project, and I only have to do it if I change something in a MonoBehaviour-derived class.

PDBs, MDBs, and Merging:
Compiling with MonoDevelop outputs .PDB files, but Unity wants .MDB files. ILMerge also requires PDB files and outputs a merged PDB file. This PDB file must be converted to an MDB file for Unity. Unity comes with pdb2mdb.exe (in Unity\Editor\Data\Mono\lib\mono\2.0\ in Windows) which can convert PDBs to MDBs (well, most anyway). Run it from the command line as pdb2mdb Assembly.dll, but your current directory MUST be the directory the .dll and .pdb reside in or you’ll get an error. Convert your merged DLL and any non-merged utility DLLs you have as well. (Add all this to your batch file created in the previous step.)

NOTE: pdb2mdb is buggy and doesn’t work on all assemblies. I had some problems trying to convert one of my assemblies (IEnumerable error), so I ended up having to make my own pdb2mdb converter using Mono.Cecil which you can get here. It was pretty easy with help from here on where to start.

NOTE 2: .PDB files are output in the format AssemblyName.PDB, however .MDB files should be formatted AssemblyName.dll.mdb in order for Unity to recognize them. Plan for this accordingly.

Copy to your Unity project:
In addition to the merging and PDB-converting steps, have the batch file copy the merged .dll + .mdb and any other loose .dlls + .mdbs to your UnityProject/Assets/Assemblies folder. (Anywhere under Assets is fine.)
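Put together, the batch file might look something like this sketch; every path and assembly name below is a placeholder for your own layout, not my actual setup:

```bat
:: Hypothetical merge + convert + copy batch; adjust all paths.
:: pdb2mdb must run from the folder containing the DLL and its PDB.
cd /d C:\Dev\GameSolution\bin\merging

:: Merge BaseClasses with the MonoBehaviour-derived DLLs, forcing .NET 2.0.
ILMerge.exe /targetplatform:v2 /out:Core.dll BaseClasses.dll Enemies.dll UI.dll

:: Reads Core.pdb and writes Core.dll.mdb (the name Unity expects).
pdb2mdb Core.dll

:: Deploy to the Unity project (anywhere under Assets is fine).
copy /y Core.dll      C:\Dev\UnityProject\Assets\Assemblies\
copy /y Core.dll.mdb  C:\Dev\UnityProject\Assets\Assemblies\
```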

Workflow:
I made a few helper programs and a pretty nice batch file to make this process smoother for me. After compiling a change to any of my DLLs, I just run a single batch file from a hotkey or any convenient launcher to detect changed DLLs, re-merge if necessary, and copy all the changed DLLs and MDBs to the Unity folder, and let me know of any problems such as locked files, etc. It’s quick and only adds one additional button press beyond telling MD to compile. (I COULD add it to an “After Build” command but I don’t want it running every time necessarily.)

Result:
Once your structure is set up, you should be able to compile each individual project relatively quickly instead of having to wait for everything to compile every time. You can also work on new scripts in the Unity Assets folder just like before, but you won’t have to wait for hundreds of other scripts to compile because they’re already dlls. Those scripts can later be moved to DLL format too once you’re finished with them.

Note about .NET version for UnityScript users:
I have encountered an error before with some .dlls compiled from MonoDevelop, because MonoDevelop ALWAYS compiles to .NET 4.0 even if you try to force it to use .NET 2.0. Most of the time this isn’t a problem, but if you see errors like in this post, you may have to change some code to work around them OR resort to using Unity’s internal compiler instead of MonoDevelop to force .NET 2.0 output, as I outlined here. But I recommend against doing this, as it requires a TON of extra steps and a series of custom programs to make the workflow less awkward. In the end I abandoned that approach for many reasons. A side benefit of using ILMerge to merge your DLLs is that you can force it to output a .NET 2.0 assembly even though the source DLLs are .NET 4.0.

/////////////////////////////////////////////////////////////////////////////
STEP 3: Converting your project to use the DLLs instead of the loose scripts
/////////////////////////////////////////////////////////////////////////////

First, change a couple of settings in the editor: (Requires Unity Pro)
Edit → Project Settings → Editor
Version Control: Meta files
Asset Serialization: Force text

Version control: Meta files will make Unity output .meta files for every asset which contain the important GUID for the asset. Asset serialization: Force text will make it so Unity outputs text files for all the assets which will make it far easier for us to find/replace all the references to the old loose scripts across the project. (Note: It will take quite a while for Unity to generate these files on a large project.)

Replacing Script References:
All .prefab and .unity (scene) files must have their references to the old loose scripts replaced with references to the new classes in the DLLs.

Unity uses a combination of a fileID and a guid to determine what class is referenced on a prefab or in a scene. These references are stored in the .prefab and .scene files and show up as a line of text because of the asset serialization setting above. This way, references can be easily changed in all objects across the project.

See my post here for information about the reference format.

The trick here is that you will have to build a reference table. You need to know both the guids/fileIds for all your loose scripts and the guids/fileIds for all your new extended classes. Once you get these, it’s a simple matter of search and replace across the project referencing the table.

In order to build the table, I suggest starting a new Unity project and copying all your old loose scripts WITH their .meta files into it. Create one gameobject and apply all your scripts to it. Create a prefab out of it. Write a small editor script to give you a list of all components on the object, then reverse the order of the list, because the prefab stores them in the reverse of the order Unity’s inspector shows them (or the output of a for (Component c in GetComponents(Component)) loop). Filter out any components that may have been added automatically as a result of assigning a MonoBehaviour (CharacterController, etc.). Then parse the prefab file looking for m_Script: {fileID: #, guid: #} lines and copy out all the fileIds and guids. Finally, compile it all into a table.
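The editor script can be tiny; a hypothetical C# sketch (the menu path and class name are made up):

```csharp
using UnityEngine;
using UnityEditor;

// Hypothetical helper: logs the selected GameObject's components in
// reverse, to match the order the text-serialized prefab stores them in.
public static class ComponentLister {
    [MenuItem("Tools/List Components (Reversed)")]
    static void ListReversed() {
        var components = Selection.activeGameObject.GetComponents<Component>();
        for (int i = components.Length - 1; i >= 0; i--)
            Debug.Log(components[i].GetType().Name);
    }
}
```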

Delete the gameobject, prefab, and scripts, and do the same thing again, this time with the DLLs (remember to bring the .meta files as well, so the guids will match when you copy them out). Assign all the extended MonoBehaviour classes (“_Ext” classes in my case) to a gameobject and repeat. You don’t need to do it for the base classes, because you are never going to assign those directly to anything, just the extended ones.
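The prefab-parsing step can be sketched in Python like so; the sample text is an invented fragment (real files also include a type: field and may differ slightly by Unity version):

```python
import re

# Minimal sketch: extract every (fileID, guid) pair referenced by an
# m_Script line in a text-serialized .prefab/.unity file.
SCRIPT_REF = re.compile(r"m_Script: \{fileID: (-?\d+), guid: ([0-9a-f]{32})")

def script_refs(text):
    """Return (fileID, guid) tuples in the order they appear in the file."""
    return SCRIPT_REF.findall(text)

# Invented sample fragment for illustration:
sample = """\
  m_Script: {fileID: 11500000, guid: 0123456789abcdef0123456789abcdef, type: 3}
  m_Script: {fileID: 11500000, guid: fedcba9876543210fedcba9876543210, type: 3}
"""
print(script_refs(sample))
```

Running the same extraction over both prefabs (old scripts, then DLL classes) gives you the two halves of the table.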

I suggest backing up your .prefabs and .scenes again now in case something goes wrong and you need to re-do the reference replacement. It’s a lot less hassle than restoring the entire backup if you have a lot of other assets.

Now with your table in hand, write a program to search the UnityProject/Assets folder for all .unity and .prefab files, search for fileId + guid pairs, and replace them with the fileId and guid of the matching extended class from the dll. (Using the suffix “_Ext” allowed me to easily find the replacement in the table.)
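A sketch of that replacement program in Python, assuming a table keyed on the old (fileID, guid) pair; the table format and folder layout are my assumptions, not a fixed Unity convention:

```python
from pathlib import Path

def replace_refs(text, table):
    """table maps (old_fileID, old_guid) -> (new_fileID, new_guid)."""
    for (old_id, old_guid), (new_id, new_guid) in table.items():
        text = text.replace(
            "fileID: %s, guid: %s" % (old_id, old_guid),
            "fileID: %s, guid: %s" % (new_id, new_guid))
    return text

def rewrite_project(assets_dir, table):
    """Rewrite every .unity and .prefab file under the Assets folder."""
    for path in Path(assets_dir).rglob("*"):
        if path.suffix in (".unity", ".prefab"):
            path.write_text(replace_refs(path.read_text(), table))
```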

Now open the project in Unity and verify the MonoBehaviours converted properly by looking at some prefabs. Make sure they were replaced with the expected class and that all the serialized data fields are there. Also check and make sure any serialized object references that point to a script type now point to the new extended type.

If everything looks good, delete the loose scripts.


DONE!

Your compile times should now be vastly improved.

Also, as a bonus, the reference replacement process could also be used when converting an entire project to another language. I’m seriously thinking about switching my whole project to C# so I can finally have a decent IDE to work on.

EDIT: Since I made this post, I have indeed converted my project to C#, and boy was I surprised at the results. C# compiles between 4X and 12.5X faster than UnityScript! Had I known this before, I wouldn’t have bothered with the complex DLL setup, because C# is fast enough to just keep using Unity’s loose script scheme. Using the loose script setup, C# is 4X faster, and using the many-DLL setup, C# is 12.5X faster at compiling. Either way you go, you can’t lose if you just bite the bullet and convert everything to C#.


Excellent. It’ll be interesting to hear in the future how maintainable your new setup is.

Did you ever time how long it took to compile all your original scripts into one DLL? There’s no real reason to suspect it would be any quicker than when Unity triggers the compile, but it might be easy to test and time.

Hehe… Yeah, I hope it works out. I suppose it’s not difficult to go in reverse and put it back as separate scripts if it turns out to be awkward.

Yes. Actually, when compiling the entire solution at once, it’s quite a bit slower than Unity’s internal compiler. Not sure why. I’ve had 3 setups so far:

  1. Unity compiles all non-editor scripts as one project (default Unity setup) - 15 seconds (this is not an accurate comparison since it wasn’t actually recompiling ALL scripts, just the stuff in the Scripts folder)
  2. All 30 projects in MonoDevelop compiling with the Unity command line compiler via batch file: 27 seconds
  3. All 30 projects in MonoDevelop compiling with MonoDevelop: 44 seconds

Additionally, it takes about 4 seconds to merge the files into the DLL.

So it’s a tradeoff. If you’re going to be recompiling everything all the time, it is slower, probably partly because it’s now a bunch of projects instead of the 4-5 in Unity’s solution. But because I will usually only compile one project at a time, it will only take about 5-6 seconds total if it’s something that needs merging. If it’s not a merging class (a utility class), it’s near instant. Or, if I’m working on loose scripts instead of DLLs, 2-3 seconds, because there’s so little left in the Scripts folder that needs compiling every time.

Probably the most awkward part of the whole setup is having to have all the fields and their default assignments in the base class.

The biggest concern, if it falls down, is that you’ll lose future work due to scene corruption or something. Hopefully not! But be careful with source control and/or backups.

I meant putting all the scripts into one (or two) DLLs, rather than building lots of tiny DLLs.

A little over a year ago I was working on a codebase with a few other developers, building DLLs in Visual Studio, and we evolved to a point where we had one DLL per logical functionality unit, totalling perhaps 40 or 50 DLLs. Solution build times in Visual Studio were then pretty bad - even when few files had changed, it took a while to convince itself that most of the libraries didn’t need rebuilding. Even within Visual Studio, this was bad for unit testing, as the test cycle time was dominated by all these no-op builds.

One of the developers restructured the solution, breaking it into the smallest number of DLLs that was possible. It was less elegant, but build times were much better with only about 5 DLLs to build (one was agnostic, not using Unity; one used UnityEngine; one used UnityEngine and UnityEditor; another was C# 4.0 with WPF; and there were one or two executables that obviously required separate projects for each).

I’m pretty paranoid about backing things up, so that should help. The fact that I have the prefabs and scenes in text format is a huge relief, because I know that even if things get twisted up, it’s at least possible to get them untwisted. Still, the only scenario I can think of that might cause my serialized data to get corrupted is if Unity suddenly decides to change fileIds or guids on one of my classes. I don’t know how that would happen, but I will watch out for it since numberkruncher did mention having some prefab corruption when compiling DLLs from Visual Studio. (I’m currently working on getting my project over to C#, so that may be an issue for me too.)

Oh. No, I didn’t test that but I can. Anyway, that’s what Unity did by default that made my compile times so annoying in the first place, so I don’t think I’d have any use for that setup.

I see. That jibes with what I was seeing: the more projects, the slower the compile time. But it works for me because I will usually only want to compile one project at a time. Anyway, I’m curious to know just how much overhead it adds. I’ll post my numbers once I do the test.