I’m not sure if anyone will be upset if I post this question here. Transfer speeds matter to me mostly because of how long it takes to back up my Unity project to a local drive.
I am getting ridiculously slow transfer speeds. It takes me 2-3 minutes to copy roughly 1 GB from one SSD to another, or even from one folder to another on the same SSD. I thought maybe it was my SATA cables, or maybe my ADATA SSD just sucks, so I bought a couple of NVMe drives thinking that might solve the problem. Problem not solved. I’m still getting roughly the same transfer speed from the SSD plugged into the “Hyper M.2” port on my motherboard. I noticed in the Windows Device Manager that drivers were missing for a “PCI encryption/decryption controller,” so I updated those thinking that would solve the problem. It didn’t, and everything now looks fine in Device Manager. Anyone have any suggestions? Is 500 MB a MINUTE a normal SSD transfer speed? Does the bus on my motherboard suck or something? Thanks in advance.
People would need more information. For large file transfers, SSD performance varies greatly. If the SSD has a cache at all, then once that cache fills up, write performance drops to the speed of the underlying NAND flash, which can actually be quite slow.
Are you seeing high transfer speeds at the start of a transfer, only for the transfer speed to basically fall off a cliff? That is expected behavior when a cache gets filled.
QLC-based SSDs have become very popular because they pack more bits into each cell, but they generally have the worst performance once the cache is filled.
Cache technologies also vary from drive to drive. More expensive models often have a dedicated DRAM cache, some have a dedicated SLC cache (fast NAND flash), and some use a portion of the drive’s unused NAND flash in SLC mode as a cache. The last one is the cheapest for manufacturers to implement, but it requires that you don’t fill up your drive, or you won’t have any cache left.
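If you want to rule out cache effects, a sustained sequential write test shows the steady-state speed rather than the burst speed. A minimal sketch using GNU `dd` (assumes a Unix-like shell such as WSL or Git Bash; the file size and path are just placeholders — make the file large enough to blow past any SLC cache, and run it on the drive you want to test):

```shell
# Write 4 GB of zeros; conv=fdatasync forces a flush at the end so the
# reported rate reflects actual disk throughput, not the OS page cache.
dd if=/dev/zero of=./dd-testfile bs=1M count=4096 conv=fdatasync status=progress

# Clean up the test file afterwards.
rm ./dd-testfile
```

If the reported rate is far below the drive’s advertised sequential write speed even on an empty drive, the bottleneck is likely elsewhere (link speed, drivers, antivirus), not the NAND itself.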
Are you copying the entire project out every time you do a backup??? That’s kinda … sub-optimal. Plus you’ll burn a lot of needless drive space. Not only that, you’re probably copying the entire Library folder out too, which definitely DOES NOT need to be copied. That folder can get massive, and Unity regenerates it from your assets anyway, so it’s pointless to back up.
If you use source control (such as git) you can push to remote destinations (including remote computers, thumb drives, and hosted services like Bitbucket, GitHub, and GitLab), and the only thing that gets transferred is what actually changed.
Personally I use git because it is free and there are tons of tutorials out there to help you set it up.
As far as configuring Unity to play nice with git, keep this in mind:
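The usual starting point is a `.gitignore` that excludes Unity’s regenerable folders. A minimal sketch (the community-maintained Unity `.gitignore` template on GitHub is more complete; folder names below follow Unity’s defaults):

```
# Unity regenerates these; never commit them.
[Ll]ibrary/
[Tt]emp/
[Oo]bj/
[Bb]uild/
[Bb]uilds/
[Ll]ogs/
[Uu]ser[Ss]ettings/

# IDE caches
.vs/
.idea/
```

You’ll also want to enable Visible Meta Files and force text serialization in Unity’s project settings so your assets diff and merge sanely.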
I’ve tried four different SSDs and they are all hampered to roughly the same transfer speed, which is why I’m assuming it has something to do with other hardware or drivers. I had an HDD on the same computer and it seemed to have roughly the same performance as the SSDs.
No, there is not a drastic change in the transfer speed at some point during the transfer process.
I realized a few days ago that copying the entire Library folder was probably pointless, so it’s good to hear an affirmation of my assumption. Thanks for the links. I’ll take a look at them. I’ve been backing up my project to Google Drive because that way I didn’t have to take the time to learn how to use GitHub.
I love git but I understand the above sentiment. It can be a bit daunting. Good news is that there are TONS of tutorials and help out there.
I recommend you spend one hour learning the basics: make a throwaway repo, make a little empty project inside it, set up the .gitignore, add the project, then just start adding random scripts to spin cubes, save scenes, exit Unity, make a new commit, then make changes, make a new commit, etc.
This way you get “cozy” with git before you commit your project to it.
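That practice run can be sketched as a handful of commands (folder names, file names, and commit messages are all just placeholders; assumes git is installed):

```shell
# Throwaway repo to practice in.
mkdir git-practice && cd git-practice
git init

# One-time identity setup for this repo (use your own details).
git config user.name  "Your Name"
git config user.email "you@example.com"

# Ignore Unity's regenerable folders from the start.
printf '[Ll]ibrary/\n[Tt]emp/\n[Oo]bj/\n[Ll]ogs/\n' > .gitignore

# Stand-in for a real Unity script.
echo '// throwaway script' > Spin.cs
git add .
git commit -m "Initial commit"

# ...make changes in Unity, then snapshot them:
echo '// edited' >> Spin.cs
git add -A
git commit -m "Make the cube spin faster"

# Each commit only records what changed.
git log --oneline
```

Once this feels comfortable, `git remote add` plus `git push` gets your backups off the machine with only the deltas going over the wire.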
Personally I use SourceTree for my git client, but it has lots of issues too. GitKraken is another, and GitHub offers their own desktop client, though (disclaimer) I haven’t used that one.
Are you copying the project folder itself? If so, you’ll see extremely slow performance because of the large number of very small files: there is fixed per-file overhead (creating each file, updating its metadata), so you get a pause in the transfer every time one file finishes and the next one starts. Since Unity projects often contain tens or hundreds of thousands of little files, you get very slow overall transfer rates copying the project folder.
When I back up a project folder, I always make a zip file (or another compressed archive), then copy that over. You’ll get the real transfer rate your hardware is capable of that way.
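The archive-first approach can be sketched like this (assumes a shell with `tar` available — recent Windows 10/11 ships one — and a project folder named `MyUnityProject`, which is just a placeholder; swap in your own name):

```shell
# One big sequential write instead of hundreds of thousands of tiny ones;
# skip the folders Unity can regenerate.
tar --exclude='MyUnityProject/Library' \
    --exclude='MyUnityProject/Temp' \
    --exclude='MyUnityProject/Obj' \
    -czf MyUnityProject-backup.tar.gz MyUnityProject

# List the archive contents to confirm what was captured.
tar -tzf MyUnityProject-backup.tar.gz
```

Copy the single `.tar.gz` to your backup drive; one large file avoids the per-file overhead entirely.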