NOTE: I first reported this issue here (a compilation of issues/questions) but no one has been answering that thread for the past two months so I’m giving up.
Using “Update a Previous Build” regenerates ALL addressable bundles in a group (with new names) if you just add a new addressable to that group. So: we add a new asset to a group, leave everything else untouched, and update the previous build (using the .bin file generated inside the project). It regenerates all bundles with different hashes. If you look inside the bundles with a hex viewer to see what caused the hash to change, you see that two IDs (random sequences of characters) changed.
This goes against the docs saying “Asset bundles that do not contain updated content are written using the same file names as those in the build selected for the update”. How would a game like Rocksmith, with weekly content, deal with this? You add a new Addressable and all users have to redownload ALL addressables that they had downloaded before…
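For anyone trying to pin down what actually changed between two builds, here is a small standalone sketch (not part of Addressables; the folder layout and `.bundle` extension are assumptions) that compares two build output folders by content hash and separates bundles that were merely renamed from bundles whose bytes actually changed:

```python
# Hypothetical helper: compare two Addressables build output folders
# (e.g. ServerData/<platform> from the original build and from the update
# build) and report which bundles kept their bytes but got a new file name.
import hashlib
from pathlib import Path

def hash_files(folder):
    """Map SHA-256 content hash -> list of file names for every .bundle."""
    result = {}
    for path in Path(folder).glob("*.bundle"):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        result.setdefault(digest, []).append(path.name)
    return result

def diff_builds(old_folder, new_folder):
    old, new = hash_files(old_folder), hash_files(new_folder)
    renamed = []   # identical bytes, different file name
    changed = []   # content that exists only in the new build
    for digest, names in new.items():
        if digest in old:
            if set(names) != set(old[digest]):
                renamed.append((old[digest], names))
        else:
            changed.append(names)
    return renamed, changed
```

If the `renamed` list is empty but `changed` is full, the bundle contents themselves are non-deterministic (as the hex-viewer comparison above suggests), not just the naming.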
I reported this as a bug 2 months ago in the bug reporter, issue 1234854. It was marked as a duplicate of this one, but in the meantime that one was marked as “By Design”, because apparently it was related to local asset groups. My report was for remote assets (using “Can Change Post Release”), which have the exact same problem.
This is a very important bug! We literally can’t publish our game while this bug exists, otherwise the smallest update with a new file will cause ALL USERS to redownload ALL ADDRESSABLES they had previously downloaded.
I’d appreciate a reply. I’ve just replied to my issue asking to reopen it, and I’ve replied to my older thread, and now I’m creating this one since I’m having a lot of difficulty reaching the people that are working on this. Thanks!
My findings were that if you purge the Addressables cache between running the source build and the content update build, then the bundles with unchanged assets would still get renamed, with different GUIDs appended to them.
If you leave the addressable bundles generated from the original build then it will not rename the bundles with no asset changes.
Hmm, this seems different, but I’m not sure. We’re never purging caches and all bundles still get a different hash, and therefore a different name. And our Addressables group is remote while yours is (please correct me if I’m wrong) static and local. But the bundle updating/generation seems to have some bug that causes their contents to change even when nothing changed.
By the way, the team seems to have reopened the older issue, maybe because of my request~~, although no one replied yet~~. (They have now!)
You’re welcome! After speaking with the team, it looks like a fix for this is in 1.9.3. Please let us know after trying that version if you’re still experiencing issues!
I’m facing the same issue. It’s reproducible in Addressables 1.17.17. Just delete the Library folder (to simulate a colleague who just pulled the repository).
If you now want to update the build, every hash of every bundle produced is different, meaning that every client would re-download hundreds of MB in the worst case, since everything appears to have changed even though it actually didn’t.
This must be a bug? How can the bundles be different if nothing changed? Did anyone resolve this, or does the majority simply never update their bundles?
I am pretty sure this is… sigh… the intended effect.
From what I understood, their asset import pipeline is not actually deterministic. They cache the build results of every asset. Then when you re-build the assets, some ‘black box magic’ is performed to determine whether the asset on disk has been modified since it was last built. If not, the old result is recycled. (Deterministic build output! Sure, I guess.)
But if you delete your Library folder, all that information is lost. There is no pre-built data to recycle, so the assets are rebuilt, and more often than not the data differs by a few bytes/KBs even if ‘nothing changed’ in the source asset. Different machine configurations can result in different build outputs for the same file too.
Seems like Unity Cloud Build ‘works around’ the issue by restoring the library folder between cloud builds and ensuring the machine configuration is identical between builds.
You need to make sure the same machine does the Addressables builds for your game indefinitely, for the rest of its life cycle; otherwise your users will have to redownload everything again.
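One way to enforce the “same machine, same project path” constraint in CI is to record a fingerprint of the build environment next to the content state file and refuse to run a content update if it differs. This is purely a sketch; the file name and the fields recorded are my own invention, not part of the Addressables API:

```python
# Hypothetical pre-build guard: record a fingerprint of the environment
# alongside addressables_content_state.bin, and compare it before running
# "Update a Previous Build" from a CI job.
import json
import platform
from pathlib import Path

FINGERPRINT_FILE = "build_env_fingerprint.json"  # invented name

def current_fingerprint(project_path, unity_version):
    return {
        "machine": platform.node(),
        "os": platform.system(),
        "project_path": str(Path(project_path).resolve()),
        "unity_version": unity_version,
    }

def save_fingerprint(state_dir, fp):
    Path(state_dir, FINGERPRINT_FILE).write_text(json.dumps(fp, indent=2))

def check_fingerprint(state_dir, fp):
    """Return the list of fields that differ from the recorded environment."""
    recorded = json.loads(Path(state_dir, FINGERPRINT_FILE).read_text())
    return [k for k in recorded if recorded[k] != fp.get(k)]
```

A CI job would call `check_fingerprint` before the update build and fail fast (with a clear message) instead of silently shipping fully renamed bundles.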
I guess Asset Bundles, since that is the underlying system for Addressables.
‘Update Previous Build’ (Addressables feature) doesn’t actually do anything. It just performs some validation on the build environment to make sure it is the same as the last time the bundles were built.
If the environment is the same, the Update Previous Build is permitted and the Asset Bundle system takes over and builds the bundles. Then the bundles are either successfully ‘recycled’ (provided source assets haven’t been touched / the Library folder is intact / you didn’t look at the Project folder funny), resulting in no changes / a ‘deterministic’ result, or the bundles are entirely different, resulting in users needing to ‘update’ (download) said bundle.
You could, in theory:
Build bundles
Delete the Library folder
Don’t touch anything else…
Open project/reimport
Use ‘Update Previous Build’
Point your editor to the Addressables state bin file
Addressables says all OK! Starts building ‘update bundles’
Result? Update Previous Build operation successful! Except ALL the output bundles are different from the last build…
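The steps above can be automated from a CI script. The Unity command-line flags (`-batchmode`, `-quit`, `-projectPath`, `-executeMethod`) are real, but the editor path and the static build methods (`BuildScripts.BuildContent`, `BuildScripts.UpdatePreviousBuild`) are hypothetical; you would have to write those editor methods yourself:

```python
# Sketch of reproducing the issue from the command line. Method names and
# the Unity install path are placeholders, not an existing API.
import shutil
import subprocess
from pathlib import Path

UNITY = "/path/to/Unity"  # adjust for your install

def unity_cmd(project, method):
    return [UNITY, "-batchmode", "-quit",
            "-projectPath", str(project),
            "-executeMethod", method]

def reproduce(project):
    # 1. Build bundles once.
    subprocess.run(unity_cmd(project, "BuildScripts.BuildContent"), check=True)
    # 2. Delete the Library folder to simulate a fresh checkout.
    shutil.rmtree(Path(project, "Library"))
    # 3. Reopen/reimport and run "Update a Previous Build" -- the bundles
    #    come out renamed even though no source asset changed.
    subprocess.run(unity_cmd(project, "BuildScripts.UpdatePreviousBuild"),
                   check=True)
```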
Issue occurs on 2020.3.20f & 1.19.11. Building from different machines produces different bundles (with no changes in their source assets), which, with a real-world CI, makes the whole Addressables system pretty much unusable.
Hi. Just ran into this myself. I tried all sorts of stuff and new bundles were generated every time. I even tried literally copying and pasting a duplicate of the original project, same hard drive same machine, and the copy generated new bundles even though the copy should be exactly the same as the original, Library and all.
Like someone mentioned above, as things stand currently, the previous build update needs to happen in the exact same copy of the project every time, or else all new bundles will be generated no matter what. This is obviously bad since 1) it makes A/B testing or maintaining multiple versions of a project basically impossible, and 2) since we’ll eventually have to switch computers, hard drives will fail, etc., every time something like that happens players will need to download everything again. In the short term I think this is a problem that can be tolerated, but if it does not get addressed I believe I will need to look for alternative ways to serve my content on the fly. (I have plans to serve lots of it, so this is a critical issue for me.)
Btw, someone also mentioned above that Cloud Build is somehow able to deal with this correctly? If that’s true then perhaps we can get someone from the Cloud Build team to enlighten us as to how exactly they’re doing that so we can follow their example?
Thanks. I’m on 1.19.18 btw so using a fairly recent version as of this writing.
UPDATE:
Messed around further and found out that you CAN make a copy and not have new bundles generated if you:
Copy the project that does not generate new bundles onto ANOTHER disk so the folder name stays the same.
Switch drive letters so the disk the copy is on has the drive letter of the original. So for instance on Windows, my original was on drive “Z:”. I copy the project to “Y:”, rename “Z:” to “W:”, then rename “Y:” to “Z:”.
Open the copy on what used to be “Y:” but is now “Z:”. “Update a Previous Build” and no new bundles are generated.
This method works with fresh checkouts as well. You can check out a copy of the project onto a new drive with the same folder name as the original, switch the drive letters, then copy the non-checked-in parts of the project (like the “Library” folder) from the original into the new checkout. Open the newly checked-out folder with the copy of the old project’s non-checked-in parts, run “Update a Previous Build”, and you’ll find that no new bundles are generated.
I suppose this is basically what the Cloud Build team is doing. And supposedly the Cloud Build team has to go even further by making sure the machine specs match because of shader compile differences. Hey, but at least it’s possible to switch drives on the same PC before a drive dies which is good news in my book.
So should I be getting differently named bundles if all I’m doing is changing their contents?
I have our localisation data in an addressable group; if I update a single word of it, a second bundle is generated with a different hash appended. I would expect it to replace the existing bundle, otherwise the player is going to have two bundles to download.
This really cannot be the intended behaviour.
A post-build step to rename the bundle and update the catalogue might be possible, but I expect things would need to change inside the bundle too for that to be a viable option.
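For what it’s worth, the rename idea can be sketched like this. This is a deliberately naive illustration, not a working solution: the file names and the 32-hex-character suffix pattern are assumptions, the remote catalog’s companion `.hash` file is untouched, and (as noted above) anything inside the bundle that references the hashed name would still be stale:

```python
# Naive sketch of a post-build step: strip the appended hash from bundle
# file names and rewrite the catalog text to match. Illustration only.
import re
from pathlib import Path

def strip_bundle_hashes(build_dir, catalog_name="catalog.json"):
    catalog_path = Path(build_dir, catalog_name)
    catalog = catalog_path.read_text()
    for bundle in list(Path(build_dir).glob("*.bundle")):
        # e.g. localization_assets_0123...ef.bundle -> localization_assets.bundle
        stable = re.sub(r"_[0-9a-f]{32}(?=\.bundle$)", "", bundle.name)
        if stable != bundle.name:
            catalog = catalog.replace(bundle.name, stable)
            bundle.rename(bundle.with_name(stable))
    catalog_path.write_text(catalog)
```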