We have a project that is 20 GB. I am curious what other people use for source control. Should I put Git on my own server so I can have more than 1 GB of space? All the free accounts only have 1 GB (GitHub, Bitbucket). I would rather put the code on a hosted server with backups than on my own.
Also, is it best practice to put even the large files into source control? If not, it would be helpful to know where others are putting them as a backup.
You have to pay for the repository or self-host it. No other way around it.
Git LFS can handle binary files. Another way to go about it is to separate source and binaries: put the source code under Git control and store the binaries elsewhere.
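A minimal Git LFS setup might look like this (the file patterns are just examples; track whatever binary types your project actually uses):

```sh
# One-time setup per machine (requires git-lfs to be installed)
git lfs install

# Tell LFS which binary types to manage; patterns are illustrative
git lfs track "*.psd"
git lfs track "*.fbx"
git lfs track "*.wav"

# The patterns are written to .gitattributes, which must be committed
git add .gitattributes
git commit -m "Track binary assets with Git LFS"
```

After that, `git add` and `git push` work as usual; the large files go to the LFS store and the repo itself only holds small pointer files.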
Beware of storing stuff (like models) on Google Drive, though. This thing has a nasty habit of nuking files you’re working on.
I would actually question how much value you get out of version controlling binaries. Sound, model and texture files don't generate sensible diffs, so putting them in version control doesn't really give you anything more than having versions 1, 2 and 3 in a folder somewhere.
If you only version control code and text assets, then you should stay well within repo limits.
Why is your project so big? I have a project that is around 6 GB and I am using GitLab for source control. GitLab supports up to 10 GB of disk space per project, and if you use a .gitignore to exclude the unnecessary files, I'm sure you'll be able to upload it. Give it a try; if not, you'll have to use your own self-hosted server.
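For a Unity project, a minimal .gitignore along these lines keeps the generated folders out of the repo (assuming a standard project layout, where typically only Assets and ProjectSettings need versioning):

```sh
# Folders Unity regenerates on its own; never commit these
/[Ll]ibrary/
/[Tt]emp/
/[Oo]bj/
/[Ll]ogs/
/[Bb]uild/
/[Bb]uilds/

# IDE and OS clutter
.vs/
*.csproj
*.sln
.DS_Store
```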
Because you'll be able to revert the whole project to a previous working state instantly. If the binaries are not under the same version control, you'll have to manually revert them to a working version, which may be trouble if you, say, completely changed your object/component setup in the meantime.
It is not about having sensible diffs, but more about having a sensible version history.
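For example, with everything in one repo, rolling the whole project back is a couple of commands (the commit hash is a placeholder for whatever your last working state was):

```sh
# Throw the working tree and branch back to a known-good commit
# (destructive: later commits are dropped from the branch)
git reset --hard abc1234

# Or, non-destructively, restore that commit's files
# as a new commit on top of the current history:
git checkout abc1234 -- .
git commit -m "Restore project to last working state"
```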
There is a source revision control Asset Store tool I wrote called Chronos Time Machine. It handles big projects and treats your multi-gigabyte art assets the same as it does code assets. It uses Unity's own project view for status and control, and it truly extends Unity into a first-class source control system. It has true collaborative file locking (it will lock even Unity out of editing assets and meta files), and it is good for both single-user and team environments. Just point it to a remote server and the repo is managed (and served up) by Unity itself. From within Unity you can roll a file or folder back or forward, either partially or fully. It is not limited to code assets but will do this in a sensible way for whatever type of asset it is. And of course you can diff-compare your assets against any commit.
You're best off buying an external USB drive and running automated incrementals daily, unless you have a fibre-optic internet connection and money to spend on cloud storage. Even then, a local USB drive with incrementals is much more convenient and faster.
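As a sketch, daily incrementals can be done with rsync's hard-link snapshots (the paths are placeholders; this assumes a Unix-like system with the drive mounted):

```sh
#!/bin/sh
# Daily incremental backup to an external USB drive using rsync.
# Unchanged files are hard-linked against the previous snapshot,
# so every snapshot looks complete but only changed files take space.
# (The very first run copies everything; rsync just warns that
# "latest" does not exist yet and carries on.)
SRC="$HOME/Projects/MyGame/"
DEST="/mnt/usb-backup"
TODAY=$(date +%F)

rsync -a --delete --link-dest="$DEST/latest" "$SRC" "$DEST/$TODAY"

# Point "latest" at the snapshot we just made
ln -sfn "$DEST/$TODAY" "$DEST/latest"
```

Drop that into cron (or your OS's scheduler of choice) to run once a day and you have your automated incrementals.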
I'm still looking for a not-in-my-home backup solution that can store a backup of my main data hard drive. Currently that's 2.4 TB, because for most art jobs I create 5 to 50 gigabytes of data in versions of Photoshop files etc. per project. And I need to archive all that stuff, because even ~10 years later a client might want me to do something based on the old source files. It has happened. And yes, I could delete more than half of that data because I don't need all 10 progressive versions of a file, but the time to sift through it all is much more expensive than buying bigger hard drives.
Does anyone have experience with storing hard disks in bank safes or something like that?
It's not just assets that need to be stored, but also the development data used to create them. It's not uncommon for a single Gimp/Photoshop project file to be 1 GB alone, just for all the layers used in the creation of a texture/normal map used by a material. I don't know about Photoshop, but Gimp projects can explode in file size very quickly.
Since this almost two-year-old thread got necroed anyway, I can now answer my own questions (hopefully for the benefit of someone stumbling over this in the future). A hard disk in a bank vault works and isn't that expensive, but don't kid yourself: you'll update that thing only twice a year or so.
And I tried Backblaze: it can realistically utilize about 15% of my 10 Mbit upstream, because I'm in Germany and all their servers seem to be on the US west coast. Support said there's nothing they can do for me, so I uninstalled it. The next thing I'll look into is a "Hetzner Storage Box", because they are hosted in Germany: https://www.hetzner.com/storage-box
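If the Storage Box works out, the same rsync approach from earlier in the thread should carry over via SSH (the username and hostname below are placeholders; check Hetzner's docs for your actual credentials and whether SSH/rsync access is enabled on your box):

```sh
# Push the local archive to the remote storage box over SSH
rsync -az -e ssh "$HOME/art-archive/" \
  u123456@u123456.your-storagebox.de:art-archive/
```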
This is why everyone just uses SVN. xp-dev is actually in the UK, and if you ask, they'll host you IN GERMANY! (On 1and1, I believe, the world's biggest data center.)