
Owner’s Corner: Automatically Zip and Publish Backups

It has been a goal of mine to always make the map backups available to everyone. Not only does that mean that if my house blows up, someone could take the server as it exists and get it back up and running, but it's also only fair: just because I have access to the maps doesn't mean everyone else shouldn't. If you want to download the map, create local renders, or make cool imagery, that helps everyone (for example, check out this image I made in Minutor from an overhead view of my castle at NewGullonia). Nevertheless, the process of making these zips available has always been a pain for a lot of reasons, so this is my documentation of the current solution, which FINALLY automates it.

First, let's talk about storage. A single copy of the backups takes up about 8GB: two 3.5GB files for the world and the old world, plus a few smaller files for the smaller worlds. My whole web hosting plan is 3GB, of which I typically use about 500MB, and it hosts this website and all my other websites, so there's not enough space there. We also used a Dropbox account with public links, and for a while, Google Drive. The result was a disjointed mess of a system that was a pain to maintain.

At my house, I use one of my servers as an HDA, or Home Digital Assistant. It's the manager for my whole network, and the system I use is called Amahi, which is built on top of Fedora to be a one-click manager for all home services. It's my file storage, my DNS server, my DHCP server, domain controller, DLNA server, local web server, bla bla bla. One of its features is a home-built Dropbox-like system that you host yourself but that can be mirrored by the actual Amahi organization, which is what I use. I have 50GB of space up there, and it can be publicly addressable. What's better, rather than having a public link to a file on MY server (which would use a big chunk of my home bandwidth), the mirroring means I upload a file once and then the bandwidth isn't on me anymore.

Nevertheless, there is still a big bandwidth bloat: I have to get the 8GB of backups from the game server to my home server, then upload them to the Amahi server. That's 16GB of transfer, which is a lot. I've decided that's fine for once a week, for now; once the new server is in my house, that 8GB transfer from the game server to MY server is just a LAN transfer, so who cares.

I'm jumping around a bit, but this whole process has two major components, each taking a ton of time and resources. Step one is zipping the maps with maximum compression (very CPU-heavy and very disk-I/O-heavy), and step two is uploading the maps to my home server (very network-heavy). To mitigate both, I have it scheduled to happen at about midnight on Sunday night/Monday morning. It's also set to announce the process on the server and in the console, so you have a warning when it's going on.
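That kind of schedule fits in a single cron entry on the game server. This is just an illustration; the script path and log file are placeholders I made up, not my actual setup:

```shell
# crontab entry: minute hour day-of-month month day-of-week command
# Run the zip-and-upload script at 00:05 every Monday morning
5 0 * * 1  /opt/minecraft/scripts/publish-backups.sh >> /var/log/publish-backups.log 2>&1
```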

The script does the following to each of the map's backups:

  • Check whether zipped backups already exist
  • If they do, delete them
  • Zip the backups with maximum compression
  • Upload the zips to surf's server

Here's what the code looks like:

