It has always been a goal of mine to make the map backups available to everyone. Not only does that mean that if my house blows up, someone could take the server as it exists and get it back up and running, but it's only fair: just because I have access to the maps doesn't mean everyone else shouldn't. If you want to download the map, create local renders, or make cool imagery, that helps everyone (for example, check out this image I made from an overhead view in Minutor of my castle at NewGullonia). Nevertheless, the process of making these zips available has always been a pain for a lot of reasons, so this is my documentation of the current solution, which FINALLY has it automated.
First, let's talk about storage. A single copy of the backups takes up about 8GB: two 3.5GB files for the world and the old world, plus a few smaller zips for the smaller worlds. My whole web hosting plan is 3GB, of which I typically use about 500MB, and that plan hosts this website and all my other sites, so there's not enough space there. We've also used a Dropbox account with public links, and for a minute, Google Drive. The result was a disjointed mess of a system that was a pain to maintain.
At my house, I use one of my servers as an HDA, or Home Digital Assistant. It's the manager for my whole network; the system I use is called Amahi, which is built on top of Fedora to be a one-click manager for all home services. It is my file storage, my DNS server, my DHCP server, domain controller, DLNA server, local web server, bla bla bla. One of its features is a Dropbox-like file sharing system that you host yourself but that can be mirrored by the actual Amahi organization, which is what I use. I have 50GB of space up there and it can be publicly addressable. What's better, rather than handing out a public link to a file on MY server (which would use a big chunk of my home bandwidth), the mirroring means I upload the file once and then the bandwidth isn't on me anymore.
Nevertheless, there is still a big bandwidth hit: I have to get the 8GB of backups from the game server to my home server, then upload them to the Amahi server. That's 16GB of transfer, which is a lot. I've decided that's fine for once a week for now, but once the new server is in my house, that 8GB of transfer from the game server to MY server will just be a LAN transfer, so who cares.
I'm jumping around a bit, but this whole process has two major components, each taking a ton of time and resources. Step one is zipping the maps with maximum compression (very CPU- and disk-I/O-heavy), and step two is uploading the maps to my home server (very network-heavy). To mitigate both, I have it scheduled to run at about midnight on Sunday night/Monday morning. It's also set to announce the process on the server and in the console, so you get a warning when it's going on.
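The post doesn't show the schedule itself, but it presumably lives in cron; a sketch of what that entry might look like is below (the script path here is just an illustrative guess, not the real location):

# Example crontab entry: run the zip-and-upload script at 00:00 every Monday morning
# (the /home/minecraft/scripts path is assumed for illustration)
0 0 * * 1 /home/minecraft/scripts/script.zipbackups.sh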
The script does the following for each map's backup:
- Check whether zipped backups from a previous run exist
- If they do, delete them
- Zip the most recent backups with maximum compression
- Upload the zips to surf's server
Here's what the code looks like:
#!/bin/bash
#
# Title: script.zipbackups.sh
# Author: Joseph Gullo (surfrock66) (surfrock66@surfrock66.com)
#
# This script makes zip files of the most recent backup, then uploads
# it to surf's home server where it is publically accessible on
# the web.
#
# The script is a bit of a mess as it contains many different
# iterations of the backup commands and combinations of active
# and inactive maps.
#
# Fill a variable with the current date and time
_now=$(date +"%D %R %Z")
# Announce in the console and in-game that backup zipping is happening
echo "Creating zipped backups for the website at $_now!"
screen -dr "Bukkit" -p 0 -X stuff "$(printf "say Creating zipped backups for the website at $_now!\r")"
# Write the current date and time to the file in the backup.
echo $_now > /Backup/webbackups/mc_backup_update.txt
# If a backup zip exists from a previous run, delete it.
if [ -f /Backup/webbackups/world.zip ]; then
    rm /Backup/webbackups/world.zip
fi
if [ -f /Backup/webbackups/world_nether.zip ]; then
    rm /Backup/webbackups/world_nether.zip
fi
if [ -f /Backup/webbackups/world_updates.zip ]; then
    rm /Backup/webbackups/world_updates.zip
fi
if [ -f /Backup/webbackups/world_the_end.zip ]; then
    rm /Backup/webbackups/world_the_end.zip
fi
if [ -f /Backup/webbackups/world_old_nether.zip ]; then
    rm /Backup/webbackups/world_old_nether.zip
fi
if [ -f /Backup/webbackups/world_old.zip ]; then
    rm /Backup/webbackups/world_old.zip
fi
# Delete existing zips for inactive worlds (commented out while those maps are inactive)
#if [ -f /Backup/webbackups/world_legacy.zip ]; then
#    rm /Backup/webbackups/world_legacy.zip
#fi
#if [ -f /Backup/webbackups/world_quest.zip ]; then
#    rm /Backup/webbackups/world_quest.zip
#fi
#if [ -f /Backup/webbackups/world_kermit.zip ]; then
#    rm /Backup/webbackups/world_kermit.zip
#fi
#if [ -f /Backup/webbackups/world_skyworld.zip ]; then
#    rm /Backup/webbackups/world_skyworld.zip
#fi
# Zip with maximum compression the most recent backups of each world
zip -9rq /Backup/webbackups/world.zip /Backup/backup.00/world
zip -9rq /Backup/webbackups/world_nether.zip /Backup/backup.00/world_nether
zip -9rq /Backup/webbackups/world_updates.zip /Backup/backup.00/world_updates
zip -9rq /Backup/webbackups/world_the_end.zip /Backup/backup.00/world_the_end
zip -9rq /Backup/webbackups/world_old_nether.zip /Backup/backup.00/world_old_nether
zip -9rq /Backup/webbackups/world_old.zip /Backup/backup.00/world_old
# Zip commands with maximum compression for inactive worlds
#zip -9rq /Backup/webbackups/world_legacy.zip /Backup/backup.00/world_legacy
#zip -9rq /Backup/webbackups/world_quest.zip /Backup/backup.00/world_quest
#zip -9rq /Backup/webbackups/world_kermit.zip /Backup/backup.00/world_kermit
#zip -9rq /Backup/webbackups/world_skyworld.zip /Backup/backup.00/world_skyworld
# Re-capture the current date and time, so the console shows how long zipping takes
_now=$(date +"%D %R %Z")
# Print to the console and in-game that zipping is complete and transferring is happening
echo "Zipping Complete at $_now! Pushing backups to public folder!"
screen -dr "Bukkit" -p 0 -X stuff "$(printf "say Zipping Complete at $_now! Pushing backups to public folder!\r")"
# Change directories to the backup directory for relative directory references
cd /Backup/webbackups
# Batch backup push for active worlds
scp -q mc_backup_update.txt world.zip world_nether.zip world_updates.zip world_the_end.zip world_old_nether.zip world_old.zip teh3l3m3nts@surfrock66.yourhda.com:~/
# Batch backup push for ALL worlds (active and inactive)
#scp -q mc_backup_update.txt world.zip world_nether.zip world_updates.zip world_the_end.zip world_old_nether.zip world_old.zip world_legacy.zip world_quest.zip world_kermit.zip world_skyworld.zip teh3l3m3nts@surfrock66.yourhda.com:~/
# Individual backup push for active worlds
#scp -q /Backup/webbackups/mc_backup_update.txt teh3l3m3nts@surfrock66.yourhda.com:~/mc_backup_update.txt
#scp -q /Backup/webbackups/world.zip teh3l3m3nts@surfrock66.yourhda.com:~/world.zip
#scp -q /Backup/webbackups/world_nether.zip teh3l3m3nts@surfrock66.yourhda.com:~/world_nether.zip
#scp -q /Backup/webbackups/world_updates.zip teh3l3m3nts@surfrock66.yourhda.com:~/world_updates.zip
#scp -q /Backup/webbackups/world_the_end.zip teh3l3m3nts@surfrock66.yourhda.com:~/world_the_end.zip
#scp -q /Backup/webbackups/world_old_nether.zip teh3l3m3nts@surfrock66.yourhda.com:~/world_old_nether.zip
#scp -q /Backup/webbackups/world_old.zip teh3l3m3nts@surfrock66.yourhda.com:~/world_old.zip
# Individual backup push for inactive worlds
#scp -q /Backup/webbackups/world_legacy.zip teh3l3m3nts@surfrock66.yourhda.com:~/world_legacy.zip
#scp -q /Backup/webbackups/world_quest.zip teh3l3m3nts@surfrock66.yourhda.com:~/world_quest.zip
#scp -q /Backup/webbackups/world_kermit.zip teh3l3m3nts@surfrock66.yourhda.com:~/world_kermit.zip
#scp -q /Backup/webbackups/world_skyworld.zip teh3l3m3nts@surfrock66.yourhda.com:~/world_skyworld.zip
# Re-capture the current date and time, so the console shows how long uploading
# backups takes
_now=$(date +"%D %R %Z")
# Print to the console and in-game that transferring is complete
echo "Completed pushing backups to public folder at $_now!"
screen -dr "Bukkit" -p 0 -X stuff "$(printf "say Completed pushing backups to public folder at $_now!\r")"
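One thing the script takes for granted: the scp push has to run unattended, so the account running it needs passwordless, key-based SSH access to the home server. The post doesn't show that setup, so here's a minimal sketch, assuming the same teh3l3m3nts@surfrock66.yourhda.com account from the script and run once as the user that owns the scheduled job:

# Generate a key pair if one doesn't exist yet (no passphrase, since cron can't type one)
ssh-keygen -t rsa -f ~/.ssh/id_rsa -N ""
# Copy the public key to the home server so scp can authenticate without a password
ssh-copy-id teh3l3m3nts@surfrock66.yourhda.com

After that one-time step, the scp lines in the script run without prompting.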
This entry was posted on Tuesday, October 16th, 2012 at 12:24 pm by
surfrock66 and is filed under Owner's Corner.