It looks like if an asset is referenced by an Addressable asset (and also referenced by a non-Addressable asset), that doesn’t stop a duplicate copy of the asset from being loaded into memory when it’s loaded from the AssetBundle. :[
Is this… normal? Am I going to have to pack all assets into Addressables?
So it looks like I’m getting a bunch of warnings when I run the “Check Duplicate Group Dependencies” analyze rule, and I’m sure this is the issue. I didn’t realize this could happen: I assumed AssetBundles wouldn’t pack an asset that was already included in the build, but I should have realized that AssetBundles don’t know what’s in the build.
So now I’m running the automatic fix, and I’ll see if this solves it. It looks like it’s going to take a very long time, which is frustrating, since I was hoping to make this fix run automatically.
I think I’m not going to use Addressables unless I absolutely have to, since it turns a 3-minute build into a 40-minute build (not including running the fix, of course). This is surprising, since the entire build ends up being only about a gig, uncompressed. Well, I’ll just code everything to use normal references, and then when I want to (for a final build, for example) I can run something that wipes the references and uses addresses instead. This means the non-Addressables mode will have essentially the entire world loaded into memory, but that’s okay for a non-final build.
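The “direct references in development, Addressables in final builds” idea above could be switched with a scripting define symbol. This is just a sketch of one way to do it - all the names here (`WorldAssetLoader`, `USE_ADDRESSABLES`, the key string) are hypothetical, not anything from the Addressables package itself; it requires the Unity Editor/runtime to actually run:

```csharp
// Hypothetical sketch: load via a direct reference in dev builds, and via
// Addressables only when the USE_ADDRESSABLES define is set (e.g. final builds).
using System;
using UnityEngine;
#if USE_ADDRESSABLES
using UnityEngine.AddressableAssets;
using UnityEngine.ResourceManagement.AsyncOperations;
#endif

public class WorldAssetLoader : MonoBehaviour
{
    // Direct reference: keeps the asset loaded along with the scene (dev builds).
    [SerializeField] private GameObject directPrefab;

    // Addressables key, only used when USE_ADDRESSABLES is defined.
    [SerializeField] private string addressableKey = "WorldChunk_01";

    public void Load(Action<GameObject> onLoaded)
    {
#if USE_ADDRESSABLES
        Addressables.LoadAssetAsync<GameObject>(addressableKey).Completed += handle =>
        {
            if (handle.Status == AsyncOperationStatus.Succeeded)
                onLoaded(handle.Result);
        };
#else
        onLoaded(directPrefab);
#endif
    }
}
```

Defining `USE_ADDRESSABLES` in the Player Settings for release builds (and leaving it off for day-to-day work) would keep the fast iteration loop while still exercising the Addressables path before shipping.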
It’s a bit surprising that the only way to have any control over resource management (other than Resources, which most people say not to use in a final product) is AssetBundles / Addressables, which seem quite… complex? Just from the standpoint of how long they take to build. I don’t really understand why a build without Addressables only takes a few minutes while packing Addressables takes so long, but I’m sure that’s because I don’t know what’s going on under the hood.
Duplicates do massively increase build time - in our case from around 10 minutes to 1.5 hours for a cached build…! Moreover, there are a few bugs causing build and Enter Play Mode performance issues that are being fixed; profiling what happens can give you some good pointers as to what is going on.
If you can avoid requiring an AssetBundle build for now, your workflow will be faster. At the same time, it should be faster than what you are experiencing (and what I was, until very recently when I did a deep dive and sorted some things out).
Welp, I give up for now. The fix in the analyzer seems very chaotic: it takes over an hour to run, it still ends up with duplicates each time, and strangely the results are very different each run. It’s incredibly, incredibly frustrating that the only way, other than Resources, to have any control whatsoever over loading an asset is Addressables / AssetBundles - they seem way, way overkill for what I’m trying to use them for.
What’s odd is that it’s not the duplicates causing the slow builds, as far as I can figure out. I did realize I was accidentally compressing the Addressables, which I don’t want during development, so fixing that helped… slightly.
How did you solve the duplicate problem, if I may ask?
It sounds like there may be some sort of dependency chain that is confusing our system. If you could file a bug against Unity with a repro project, that would be a huge help in tracking it down.
I will note that in 1.1.9 we added “some optimizations for calculating analyze rules” (from changelog). We were doing some inefficient things in 1.1.7 that should be better now. If you aren’t on the latest, perhaps try that.
I’m sorry this is frustrating you. Hopefully as we work the kinks out it can become more aligned with the sort of workflow you need.
I’ll try to replicate it, although I fear I won’t be able to :S
I’m currently using 1.1.9
Several things I noticed with that rule:
Each time the fix runs, a new group is created. This results in multiple duplicate groups existing, which produces more duplicate warnings, and so on.
By default, compression is enabled for the new group, which I feel should not be the case: the first time the fix runs, it’s clearly meant to be a test run, and compressing makes it take even longer.
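If compression on groups is what’s slowing down dev builds, it can be turned off in bulk rather than group-by-group in the inspector. The sketch below uses the real Addressables editor API (`AddressableAssetSettingsDefaultObject`, `BundledAssetGroupSchema`), but the menu path and class name are made up, and it needs to run inside the Unity Editor with the Addressables package installed:

```csharp
// Editor-only sketch: set every Addressables group's bundle compression
// to Uncompressed, for faster iteration during development.
using UnityEditor;
using UnityEditor.AddressableAssets;
using UnityEditor.AddressableAssets.Settings.GroupSchemas;

public static class DisableAddressableCompression
{
    [MenuItem("Tools/Addressables/Use Uncompressed Bundles")]
    public static void SetUncompressed()
    {
        var settings = AddressableAssetSettingsDefaultObject.Settings;
        foreach (var group in settings.groups)
        {
            // Not every group has a bundled-asset schema (e.g. Built In Data).
            var schema = group.GetSchema<BundledAssetGroupSchema>();
            if (schema != null)
                schema.Compression = BundledAssetGroupSchema.BundleCompressionMode.Uncompressed;
        }
    }
}
```

Running this after the analyze fix creates its new groups would at least keep those groups uncompressed on subsequent builds.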
And thank you - hopefully as the toolset develops I’ll be able to use it, as it seems like a very powerful system.