SOWN Backup Cloud

{{UpdateNeeded|The upgraded backup servers have been redeployed. This page needs to be updated to reflect this.}}
The SOWN Backup Cloud project is designed as a means of backing up important content and configuration from all SOWN servers, so that if there is a failure the server can be rebuilt without losing anything important.  It is a cloud because we currently have two different servers that hold backups:
# [[Backup2]] - In building 53 level 3 server room
# [[Backup3]] - In building 32 level 3 north server room
Currently, the backup service runs once a day: it rsyncs all the files onto a central location ([[Sown-auth2]]), creates a gzipped tarball of them, and then scps this tarball onto the two backup servers.  On the backup servers, all tarballs from the previous 7 days are retained, along with all weekly (Sunday) tarballs from the past 30 days and all monthly (1st of the month) tarballs.  Periodically, older monthly tarballs are manually deleted when the servers start to run out of space.
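
For illustration, a minimal sketch of this daily flow on [[Sown-auth2]] might look like the following; the host names, paths and the choice of /etc as the file set are assumptions, not the real SOWN scripts:

<pre>
#!/bin/bash
# Hypothetical sketch of the daily flow described above; host names, paths
# and the file set are assumptions, not the real SOWN backup scripts.
set -e

STAGING=/srv/backup/staging           # assumed collation area on Sown-auth2
STAMP=$(date +%Y-%m-%d)
SERVERS="sown-gw sown-web"            # hypothetical list of servers to back up

# 1. rsync the important files from each server onto the central host.
for host in $SERVERS; do
    mkdir -p "${STAGING}/${host}"
    rsync -az "root@${host}:/etc/" "${STAGING}/${host}/etc/"
done

# 2. Create one gzipped tarball of everything collected today.
tar -czf "/srv/backup/sown-backup-${STAMP}.tar.gz" -C "$STAGING" .

# 3. scp the tarball onto both backup servers.
for backup in backup2 backup3; do
    scp "/srv/backup/sown-backup-${STAMP}.tar.gz" "root@${backup}:/backups/"
done
</pre>

The retention rules could then be applied on each backup server along these lines (again only a sketch, assuming tarballs named sown-backup-YYYY-MM-DD.tar.gz and GNU date):

<pre>
#!/bin/bash
# Hypothetical pruning pass implementing the retention rules above: keep
# dailies for 7 days, Sunday tarballs for 30 days, and all 1st-of-the-month
# tarballs (which are only ever deleted manually).
cd /backups || exit 1
now=$(date +%s)
for f in sown-backup-*.tar.gz; do
    [ -e "$f" ] || continue
    day=${f#sown-backup-}; day=${day%.tar.gz}             # e.g. 2020-01-20
    age=$(( (now - $(date -d "$day" +%s)) / 86400 ))
    (( age <= 7 )) && continue                                        # daily window
    (( age <= 30 )) && [ "$(date -d "$day" +%u)" -eq 7 ] && continue  # Sunday weeklies
    [ "$(date -d "$day" +%d)" = "01" ] && continue                    # monthlies kept
    rm -f "$f"                                                        # everything else pruned
done
</pre>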
  
As rsync is only incremental, old files will not be deleted from the backups even if they are deleted from their original location.  Therefore there is a tidy-backups script that runs on [[Sown-auth2]] to deal with directories where this is a particular issue (e.g. the remote syslogs captured off nodes, stored on [[Sown-auth2]] under /srv/www/sisyphus/docs/).
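
For example, a tidy pass over that directory might look like the sketch below; the 90-day cut-off is an assumption and the real tidy-backups script may well differ:

<pre>
#!/bin/bash
# Hypothetical tidy pass: remove captured node syslogs older than 90 days,
# since a plain incremental rsync never removes them from the backup copy.
# The 90-day cut-off is an assumption, not the real script's value.
find /srv/www/sisyphus/docs/ -type f -mtime +90 -delete
</pre>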
== Planned New Backup Setup ==
Rather than collating all the files on [[Sown-auth2]] and then generating a tarball that can be copied to the two backup servers, a better solution would be to rsync to the backup servers themselves and then use ZFS snapshots.  Scripts to perform this process have been written and put on [https://github.com/sown/backup GitHub].
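
A minimal sketch of that approach, assuming a ZFS dataset named tank/backups and hypothetical source hosts (the actual scripts are in the GitHub repository linked above):

<pre>
#!/bin/bash
# Hypothetical sketch of the planned approach; the dataset name and source
# hosts are assumptions (the real scripts live at https://github.com/sown/backup).
set -e

DATASET=tank/backups                   # assumed ZFS dataset on this backup server

for host in sown-auth2 sown-gw; do     # hypothetical source hosts
    mkdir -p "/${DATASET}/${host}"
    # --delete keeps the mirror exact, so each snapshot reflects the true
    # state of the source at that point in time.
    rsync -az --delete "root@${host}:/etc/" "/${DATASET}/${host}/etc/"
done

# One read-only snapshot per day replaces the dated tarballs; old snapshots
# provide the history, and ZFS only stores the blocks that changed.
zfs snapshot "${DATASET}@$(date +%Y-%m-%d)"
</pre>

With snapshots, retention becomes a matter of destroying old snapshots (e.g. zfs destroy tank/backups@2020-01-20) rather than juggling tarballs.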
  
 
[[has contributor::User:DavidNewman| ]]
[[has contributor::User:DavidTarrant| ]]
[[has contributor::User:TimStallard| ]]
[[has contributor::User:DanTrickey| ]]
 
[[Category:SOWN Project]]
 
