Backup for project

Hi,
I’m looking for backup software for my project.
I have tried Git a lot without luck (GitHub, Bitbucket, GitLab, and some control panels).

Does anyone know of backup software that can handle my whole 20 GB project?

I’d prefer local backups, or a cloud backup under $5 per month.

Thanks :slight_smile:

What does “without luck” mean?


Git doesn’t push…
Every time, after 2 hours, I get this error:
fatal: the remote end hung up… [I don’t remember more than this]

Did you try increasing the buffer size to the largest size in your repository?

https://confluence.atlassian.com/bitbucketserverkb/git-push-fails-fatal-the-remote-end-hung-up-unexpectedly-779171796.html
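The fix in that article comes down to one setting. A minimal sketch of it (524288000 bytes = 500 MB is an example value; pick something at least as large as your biggest push):

```shell
# Example value: 524288000 bytes = 500 MB; use at least your largest pack size.
git config --global http.postBuffer 524288000
# Confirm it took effect:
git config --global http.postBuffer
```

Note this only helps with pushes over HTTP(S); it won’t change anything if you push over SSH.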

We use Git with LFS on a 20 GB repo; it works fine.


For local backups, Acronis is likely your best bet. It works really well: it can back up your entire system or just individual folders, scheduled once a week or every hour if you like.

I also use Robocopy to mirror whole projects to an external drive, but be careful with it: it’s easy to wipe things by accident.
One of the nice things about Unity is that projects live in their own folder, so you can copy them anywhere.

Except this is a really bad system if you want easy access to prior versions.

I’m going to try this.
I’ll update when the push finishes.

It’s backup, not version control.


For local backups, get an inexpensive NAS device and two 2 TB HDDs. Set them up in RAID 1 (data is duplicated between the two drives for redundancy, giving the NAS 2 TB of total usable space). Total cost should be around $150.

Install WinRAR, 7-Zip, or your favorite compression tool. Zip up your entire project folder at the end of any development session and copy it to the NAS (optionally leaving out the Library folder to reduce file size).
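That end-of-session step is easy to script. A sketch using tar (the paths here are throwaway stand-ins; point them at your real project and NAS share):

```shell
#!/bin/sh
set -e
# Stand-in paths for the demo; use your project folder and NAS share instead.
PROJECT=$(mktemp -d)/MyGame
DEST=$(mktemp -d)
mkdir -p "$PROJECT/Assets" "$PROJECT/Library"
echo "scene data" > "$PROJECT/Assets/Main.unity"
echo "cache" > "$PROJECT/Library/cache.bin"
# Date-stamped archive, skipping the Library folder Unity can regenerate.
STAMP=$(date +%Y-%m-%d)
tar --exclude='Library' -czf "$DEST/MyGame-$STAMP.tar.gz" -C "$(dirname "$PROJECT")" MyGame
tar -tzf "$DEST/MyGame-$STAMP.tar.gz"
```

Hook it up to a scheduled task and the backup happens whether you remember or not.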

Alternatively, set up a computer locally as an SVN server.

Neither of these has ongoing costs other than electricity.


After 3 hours I got this error:

fatal: sha1 file '<stdout>' write error: Broken pipe
error: failed to push some refs to 'https://xxx@bitbucket.org/xxx/pvp.git'

The NAS part sounds pretty good.
Can you explain more?

It is a small device, usually controlled through a web UI (and often Linux-based), which simply exposes storage to your network, often in the form of Windows drive shares. You install whatever hard drives you want in it and have the NAS set them up however you like. Most allow multiple drives, and for the safety of your data you will usually want to configure them with some form of RAID.

RAID 1 is the simplest and requires only 2 drives; everything is mirrored between both, so the total space available equals that of a single drive. RAID 5 is another good choice, but requires at least 3 drives, and you end up with usable space equal to 2 of them. RAID 10 and other forms of RAID are generally for larger numbers of drives, but may be supported.

Don’t use RAID 0. While it offers a single logical drive of maximum size (the total of all physical drives added together), if any single drive in the array fails you can lose everything. Not good for a backup device.
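The capacity arithmetic above, spelled out for n drives of s TB each:

```shell
# Usable capacity for the RAID levels above: n drives of s TB each.
n=3
s=2
echo "RAID 0: $((n * s)) TB usable, but one failed drive loses everything"
echo "RAID 1 (2 drives): $s TB usable, full mirror"
echo "RAID 5: $(( (n - 1) * s )) TB usable, one drive's worth of parity"
```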

No need for relatively pricey SSDs. Just get standard HDDs with a good reputation for quality. 2 TB HDDs generally run around $50-$70 each, and NAS servers can get down to the $50 range. Just shop around for what you want out of it.


Or set up the NAS as you already described, then put your Git repo on that. This is really simple and gives you the benefits of both versioning and a local NAS.

Git doesn’t need a fancy server system if you’ve got file access to wherever it’s being stored. In other words, you can host a git repo as just flat files on any accessible file system. Mark it as “bare” when you create the repo and your git client will use that repo to store all the version data without maintaining a working copy of the files. (This means that you can push to it and pull from it, but you can’t access or modify the files directly.)

In TortoiseGit this is trivially easy to do, too.

  1. Create a folder where you want your origin repo to be.
  2. In Windows Explorer, open that folder, right-click, and “Git Create repository here…”
  3. When prompted, tick “Make it Bare” and press ok.
  4. Now set up your working repo by cloning as usual, giving the file path to your bare repo as the origin URL.
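For anyone not using TortoiseGit, the same four steps on the command line look roughly like this (throwaway temp folders stand in for the NAS share and working area):

```shell
set -e
SHARE=$(mktemp -d)   # stand-in for your mounted NAS share
WORK=$(mktemp -d)    # stand-in for your working area
# Steps 1-3: create the folder and make a bare repo in it.
git init --bare "$SHARE/pvp.git"
# Step 4: clone it to get a working repo, with the bare repo as origin.
git clone "$SHARE/pvp.git" "$WORK/pvp"   # warns that it's empty; that's fine
```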

If you do this, I would still recommend setting up a backup system. The above helps a lot compared to just keeping stuff on your local machine, but it’s still not as robust as keeping your repo online and thus away from your physical work environment. Having a copy of your data that’s physically elsewhere is a good idea.

Can you tell us more about your setup?

Taking 3 hours for a push seems crazy, unless you’ve got a really slow connection or something.

How fast is your internet connection? Are you behind a proxy or anything (e.g. on a university campus, using a shared internet connection, using your laptop from work, public free Wi-Fi…)? What version of Git are you using? What Git client are you using? Did you get the same errors with different Git hosts, or different ones? Are you the only one working with the repo?

How big are the commits you are pushing? Have you set up a .gitignore to exclude the Library folder and friends?
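If you’re not sure how much data a push will actually send, Git can report it. A sketch using a throwaway demo repo (in your real repo, only the last command is needed):

```shell
set -e
cd "$(mktemp -d)" && git init -q .       # throwaway demo repo
git config user.email demo@example.com   # demo identity for the commit
git config user.name demo
echo "hello" > file.txt && git add file.txt && git commit -qm "first commit"
git count-objects -vH   # 'size' / 'size-pack' show how much data a push sends
```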


My internet connection is 5 Mb max,
and I’m using my home network.

I’m using the latest version of Git, and yesterday I used Git Bash to upload. That’s when I got the error I mentioned.

I tried to upload to GitHub, GitLab, and I even tried to set up Git on my old computer. I pushed some big files to see whether pushing worked, and it pushed successfully. But when it comes to my Unity project, I get errors every time.

I want to push all of my project, so my commits are 13-15 GB.
I have a .gitignore, but I don’t think I ignored the Library folder (should I exclude that folder too?)

That’s a colossally huge commit, and likely the source of your problem.

A few things to consider:

  • Most online hosts have size restrictions, and as far as I know most free hosts have restrictions of 10 GB or less. (Azure DevOps has “unlimited” repo sizes.)

  • As someone mentioned earlier, Git itself also has limits on the sizes of individual commits/pushes (or something like that).

  • Aside from Git itself, servers usually have limits on both maximum upload time and maximum upload size. Chances are that after 2 to 3 hours you’re just hitting one or the other of those limits.

You definitely want to ignore the Library folder, yes. In fact, there are only 3 or 4 folders you should include for a Unity project. See here and make sure you’ve followed the other steps, too.

If you haven’t followed those steps previously then back up your project before making changes. I’ve never seen changing serialization mode or other Editor settings cause a problem, but why take that risk?
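For reference, a minimal Unity .gitignore along those lines might look like this (a sketch; it’s written to a demo folder here, but it belongs at your project root):

```shell
cd "$(mktemp -d)"    # demo location; run this at your project root instead
# Keep Assets, Packages and ProjectSettings under version control;
# ignore the folders Unity regenerates on its own.
cat > .gitignore <<'EOF'
[Ll]ibrary/
[Tt]emp/
[Oo]bj/
[Bb]uild/
[Bb]uilds/
[Ll]ogs/
[Uu]serSettings/
EOF
cat .gitignore
```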


I tried yesterday to upload without the Library folder and everything seemed to be working fine, until I woke up just now and saw the error “fatal: the remote end hung up unexpectedly”, or something like that. My router resets automatically late at night. Do you think that could also cause problems?

(I had uploaded 6 GB of my 10 GB Assets folder.)

Check the size limits of your host. They may simply not allow repos to get this big.

If it resets during an upload then yeah, that’s an issue.

I’m pretty sure that most of my commits are measured in kilobytes rather than megabytes. The exceptions are when I need to modify art or stuff that includes binary data.

If there’s a significant art update sometimes that’ll be a few dozen or maybe even a few hundred megs. We consider those to be massive updates, to the point that we give one another a heads-up when one is coming. Your push is many times that size.

If you commit just a small part of your project - a few megs at most - does it work then?

One thing you might be able to try is using Git LFS. Basically, it’s an add-on for Git which handles large binary files differently from the types of files Git was designed for. I’ve not used it before as I’ve not had to, but maybe it’s exactly what you need here?
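If you do try LFS, the setup is only a few commands. A sketch, assuming git-lfs is installed and that your large binaries match patterns like these (the patterns are examples; use whatever your big asset types are):

```shell
# One-time setup per machine:
git lfs install
# Route big binary types through LFS (run inside your repo):
git lfs track "*.psd" "*.png" "*.wav"
# The patterns are stored in .gitattributes, which must be committed:
git add .gitattributes
git commit -m "Track large binaries with Git LFS"
```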

Otherwise… I think the biggest single push I’ve ever done was when moving a fresh version of a project to Azure DevOps. We put an existing ~3gig (I think) project into a fresh repo after removing a whole bunch of stuff and deciding we’d rather not keep the history. I don’t remember this causing any problems. Still only a fraction of the size of what you’re talking about, though.
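One workaround in that spirit is to push an oversized history to the server in smaller chunks instead of one giant pack. A self-contained sketch, with a tiny throwaway repo and a local bare repo standing in for the host:

```shell
set -e
REMOTE=$(mktemp -d) && git init -q --bare "$REMOTE"   # stand-in for the host
cd "$(mktemp -d)" && git init -q .
git config user.email demo@example.com && git config user.name demo
for i in 1 2 3 4 5 6; do
  echo "$i" > "file$i"; git add "file$i"; git commit -qm "commit $i"
done
git remote add origin "$REMOTE"
# Push every 3rd commit first - each push sends a smaller pack - then the tip.
for c in $(git rev-list --reverse HEAD | awk 'NR % 3 == 0'); do
  git push -q origin "$c:refs/heads/main"
done
git push -q origin HEAD:refs/heads/main
git ls-remote origin refs/heads/main   # remote tip now matches local HEAD
```

Each intermediate push is a fast-forward, so the server only ever receives a fraction of the history at a time.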

Yes, if I commit a small part, it works.

I’m trying to push the project to GitHub now; I’ll update when it finishes.

GitHub only gives you 10 gigs if you don’t pay.