Library folder takes up more than 1 GB on a clean 2021.3.0f1 URP project. Isn't that bad?

I think I should try to call the Unity devs' attention to this. 1 GB is just too much, especially if they want to make URP the new standard.

On 2018 (non-URP) versions the Library folder would take up about 30 megabytes on a clean project.
On 2021.3.0f1 (non-URP) the Library folder takes up about 300 megabytes on a clean project.
On 2021.3.0f1 (URP) the Library folder takes up 1.2 gigabytes on a clean project. How did that happen? Why?

I think they should try to reduce the size a little, because if someone has 10 Unity projects (say, some small jam games), that's 12 GB taken off that person's hard drive, and as more projects are created that computer's performance will degrade (because in my experience even a few GB on or off makes a big difference, EVEN on a 1-terabyte HDD).

“Delete the Library folder from old projects”? “Get an SSD”? Well, those might actually be good ideas. But aren’t there ways for the Unity devs to reduce that folder’s size? I mean, just HOW did it become that large and why is it necessary for it to be that large? Can someone at least explain that to me?

A gigabyte is nothing. Once you start developing a project it’s going to swell way past that. My last work project using the new pipelines totaled 130 GB and out of that only 30 GB were assets.

It's a bother even on non-URP. The Library/Artifacts folder gets bloated, and I think the overall size of the Library folder may also be affected by how many assets you've downloaded from the store. I can't clearly recall, but I think I had a clean project weighing in at ~4 GB before I backed up my (many) assets and fully removed/reinstalled Unity.

Speaking of which, I recommend deleting the folder every once in a while, as Unity is not perfect at removing old and unnecessary files from it. I have noticed on occasion that the folder is smaller after being completely regenerated.
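If you keep a whole folder of projects, something like this little script can clear all their Library folders in one go (just a sketch, not an official Unity tool; the projects path is a made-up example). Close the editor first, and expect a longer import the next time each project is opened:

```python
# Housekeeping sketch (paths are hypothetical examples): delete every
# project's Library folder under a root so Unity regenerates it fresh.
import shutil
from pathlib import Path

projects_root = Path(r"D:\UnityProjects")  # hypothetical folder of projects

for lib in projects_root.glob("*/Library"):
    # Sanity check: only touch folders that look like Unity projects.
    if lib.is_dir() and (lib.parent / "Assets").is_dir():
        print(f"Removing {lib}")
        shutil.rmtree(lib)
```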

Only if they have been added to the project.

URP uses Burst now, doesn't it? If so, that's 800 MB in the Library straight off for that one package.

Honestly, the size of Unity installs really bugs me. When you take into account the UPM cache, the extracted UPM packages, and the copies of those packages per project (why, when you can't edit them?), Unity is easily becoming as large as Unreal installs.

Most of my client projects used to take up a few GB, and that normally includes the Library, but nowadays it feels really difficult to keep things lean, not to mention lots of packages seem to needlessly end up forcing unused resources to be included in builds.

Another pet peeve is simply the number of files a project creates: tens to hundreds of thousands if you include the Library, mostly tiny files that I'm sure adversely affect loading times, and most definitely affect upload/download times for cloud backups. It's gotten to the point where I don't even bother backing up the Library folder any more, except for a couple of client projects, and I will often just delete the Library folder altogether once a project is finished.

Honestly, I'm surprised that Unity hasn't invested in packing files together. I'm not sure if that's really applicable to meta files, but I'm pretty sure it could be done elsewhere. Thinking about it, I wonder if it would make sense to provide the option of using a DLL version of packages instead of loose source files.

Only for HDDs. SSDs are designed for this; in fact, when working with small files they won't achieve their maximum performance unless you are accessing many of them at once. HDDs, meanwhile, are both slow at accessing small files and can only access one file at a time.

Sort of. SSDs have trouble processing small file reads, even sequentially, more because of OS filesystem limitations than because of any hardware ones. This is still a major problem on Windows.

I have made a request to rewrite SRP 14 and 15 to reduce the size of the API and packages as well, especially Burst.

I thought we were over the days of being precious about our storage space? My computer has nearly 10TB of storage and I still have space for more drives.

Mind, not everyone can buy 2-4 TB of space. Not everyone earns £, €, or $. For some indies it may still be expensive, especially if they also need backup drives and store other files besides Unity projects. That's assuming files are kept outside of clouds.

I mean, storage space is the cheapest component of building a PC. So cheap it's a joke how cheap it is. A 1 TB drive here is about 50 dollarydoos (Australian dollars); 2 TB is 80.

SSDs less so, but if you're on a budget, what can you say?

It's not only about saving space; it also possibly affects editor responsiveness and script re/compilation.

I think the only code getting recompiled is the code you edit and any assemblies referencing the edited assembly; ergo, packages aren’t getting recompiled when you edit your scripts.

However in the context of domain reload speeds, that’s a fair cop. Though I imagine there are more factors than pure package size that affect their overall performance.

“Possibly” doing a lot of heavy lifting here.

No, but most people can buy a 512GB SSD. Here is one with a DRAM cache.

https://www.amazon.com/ADATA-SU800-512GB-3D-NAND-ASU800SS-512GT-C/dp/B01K8A29CS/

Here is one without a DRAM cache.

https://www.amazon.com/TEAMGROUP-AX2-Internal-Compatible-T253A3512G0C101/dp/B08CK7T9FG/

You generally want a DRAM cache for performance reasons but if you’re on a budget it’s still better than an HDD.

I have about 6 terabytes of storage total, but fast storage sits on two 500GB SSDs and I keep running out of space on both of them. So, yeah, it is better when software doesn’t waste space.

Speaking of which, I found a solution: a directory junction (akin to a symlink) pointing the project's Library/PackageCache folder at a shared location. Packages will be extracted to and stored there, but only projects that reference them in their manifest will load them when being opened.

While testing, I was able to have multiple projects open at the same time and could import packages into one without affecting the others. The only problem occurred when I tried to remove a package from one: the others immediately locked up. Once they were restarted, though, they added the package back (since it was still in their manifests).

I don't see any reason why Unity couldn't do this themselves, but until they do, nothing prevents you from doing it yourself.
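For anyone who wants to try it, here is roughly what the setup looks like, sketched as a small Python script (the paths are hypothetical, and on Windows you can just as easily run mklink /J by hand). Delete the project's existing Library/PackageCache first; the script won't remove anything itself:

```python
# Sketch of the junction setup on Windows (paths are hypothetical): point a
# project's PackageCache at a shared folder so packages are extracted once
# and reused by every project that links to it.
import subprocess
from pathlib import Path

project = Path(r"C:\Dev\MyJamGame")             # hypothetical project root
shared  = Path(r"D:\UnityShared\PackageCache")  # hypothetical shared cache

cache = project / "Library" / "PackageCache"
shared.mkdir(parents=True, exist_ok=True)
cache.parent.mkdir(parents=True, exist_ok=True)
assert not cache.exists(), "delete the existing Library/PackageCache first"

# mklink /J is a cmd.exe built-in; directory junctions don't need admin rights.
subprocess.run(["cmd", "/c", "mklink", "/J", str(cache), str(shared)], check=True)
```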

Interesting. I was considering this whilst writing my earlier reply; however, I was and still am concerned that directly symlinking to the extracted package cache will lead to weird issues. You discovered one: when removing a package, it presumably deletes the extracted package from the cache, meaning other projects have to extract it again. I have a sneaking feeling there might be more gotchas, though, since the cache folder is now being used by Unity for two different purposes at the same time.

That doesn't discount the concept, though; I think I'd just be more comfortable making a secondary cache to symlink into.

I can understand why Unity chose to do it this way: having the packages embedded in the project, albeit within the Library cache, means you can be sure not to accidentally update or alter that code. But ultimately I'd like a preference setting to choose between embedding and using the global package cache, and let Unity take care of the potential conflicts - i.e. instead of deleting the package, its reference would just be removed from the manifest.

The global package cache has another issue, though: if, like me, you have dozens or hundreds of legacy projects from clients and personal work, then the npm and package cache folders quickly become huge as you create new projects or regularly update a subset of packages in legacy projects. I regularly delete the npm & cache folders to try to stay on top of this, but checking now I see they're back to taking up over 8 GB! Almost 40% of that is just three Burst versions ;(

I had an idea of creating a 'Package Manager Manager' that would track or parse Unity project manifests and keep tabs on the packages and versions used. This would highlight projects where you could maybe update low-impact packages (e.g. Visual Studio support) and remove all unused package versions from the cache. The trouble is, as soon as you create a new project you'll download and install the default package versions unless you remember to update the manifest first.

So ultimately I feel just deleting the global package cache folder periodically is probably the easiest method, but I'm still occasionally tempted to see if a manager app might be worthwhile; the scanning half is sketched below.
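A rough sketch of that scanning half (the projects folder is hypothetical): walk the projects, read each Packages/manifest.json, and flag packages pinned to several different versions, since those are the ones multiplying in the cache:

```python
# Rough sketch of the 'Package Manager Manager' idea (path is hypothetical):
# tally which package versions each project's Packages/manifest.json pins.
import json
from collections import defaultdict
from pathlib import Path

projects_root = Path(r"D:\UnityProjects")  # hypothetical folder of projects
usage = defaultdict(set)                   # package name -> versions seen

for manifest in projects_root.glob("*/Packages/manifest.json"):
    deps = json.loads(manifest.read_text(encoding="utf-8")).get("dependencies", {})
    for name, version in deps.items():
        usage[name].add(version)

# Packages pinned to several versions bloat the cache; anything cached
# but absent from every manifest is safe to delete.
for name, versions in sorted(usage.items()):
    if len(versions) > 1:
        print(f"{name}: {sorted(versions)}")
```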

That’s actually a great idea. Will try that one later, thanks.

I usually create a lot of small projects and prototypes while learning or following tutorials, and it just feels wrong to have so many large duplicated files in the package cache. So much so that I've stuck with only using 2019 LTS and have no plans to try out the new LTS versions unless I start some 'real' project.

I really hope Unity would care more about duplicating so many files all the time.
I might be wrong, but it is my understanding that a package version x.x.x is always the same and should not have any differences between projects, so I'm not sure why the package cache isn't global instead of duplicating the same files over and over again.

Edit: I guess one could say it's important to be able to make custom changes to a package's code in specific situations to adapt it to your project, and in that case it would be necessary for the files to be local. But I at least have never needed to do that, and even if it was required for some reason, Unity could mark that package as changed and make a local copy of only that one.

I have an ongoing issue not with the total size of project data but with the use of lots of tiny files, which are so good at slowing down even the best storage systems.

The optimal file size is about 64 KB and up; anything below the filesystem's minimum allocation size (typically 2-4 KB) just wastes space.

How hard would it be for Unity to tackle both by creating a file format that compresses and combines these small files, for faster loading and better performance, with a slight cost overhead when updating files?

Compression levels could even be optional for various hardware platforms.
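To illustrate the idea (not Unity's actual format, just a sketch with hypothetical paths): pack a folder of tiny files into one compressed archive, so the filesystem deals with a single large file while individual entries stay randomly accessible:

```python
# Sketch of packing many small files into one compressed archive
# (paths hypothetical; Unity's real cache formats would differ).
import zipfile
from pathlib import Path

src = Path("Library/Artifacts")        # hypothetical folder full of tiny files
pack = Path("Library/Artifacts.pack")  # one large packed file instead

with zipfile.ZipFile(pack, "w", zipfile.ZIP_DEFLATED, compresslevel=6) as z:
    for f in src.rglob("*"):
        if f.is_file():
            z.write(f, f.relative_to(src).as_posix())

# Random access stays cheap: the archive's central directory lets you
# read any single member without touching the rest.
with zipfile.ZipFile(pack) as z:
    data = z.read(z.namelist()[0])
```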