Tag Archives: backup strategy

Windows Home Server 2011 OS Restore

As I mentioned in my last Trail Log, the latch on the drive bay holding my OS mirror broke, and a drive popped out. The mirror worked as expected and things kept running.

I was able to replace the cage and rebuild the mirror. But this brought up the question: what if the controller had failed, or the OS had become corrupt? I’ve yet to test a server restore. It’s a long weekend and now’s as good a time as any to test. If I need to reinstall and manually recreate users and shares, I have the time. Recreating shares and users would never be fun, but at least it would be at a time of my choosing.

No sense tempting fate, so I did check to make sure that the recent backups worked without error, a luxury an unexpected failure won’t allow. To simulate the failure I’d delete the mirror and recreate it from scratch, including an initialization. While it’s the same hardware, it would be as bad as starting with a new controller.

This is how I have the server backup configured:

The backup goes to an external USB drive and runs at noon and 11PM. The backup has been running reliably and not reporting any errors. So I was ready to check the reliability of those reports.

The restore wasn’t straightforward, but it wasn’t too complicated and it did work. After the restore I had a working server with all my shares and users.

Some tips from my experience:

  • I needed to recreate my install configuration. When I installed, I only had the system drive connected. When I tried doing the restore with all drives connected (excluding all but the system mirror as a restore location), the restore process stopped, saying there were no suitable drives for the restore. Removing power from all but the OS drives solved the problem. I suspect this is because with all the drives connected the OS mirror wasn’t seen as the primary drive (just like during an install with all the drives connected).
  • The repair process didn’t always find the system backup on the external drive on the first scan. Sometimes I had to force a rescan then it was found. If the scan was quick I knew it hadn’t looked hard enough and told it to look again. Annoying but it just took persistence.
  • I still needed to load the drivers for the OS RAID controller before the drive could be found. So any drivers needed for the original install will be needed for the restore, although once the restore is done whatever drivers were installed on the server will be used.
  • The restore itself was quick, taking less than 15 minutes once the extra drives were disconnected.
  • The first reboot after the restore failed with a bootmgr not found error. But it’s been fine since then.
  • The times displayed for the image backups (indicating the time the backup would be made) were GMT –8 hours, which is not the same as my server (GMT –5) so the times appeared a bit off until I read the offset and realized why. (Redmond centric I guess)
  • The restore is back to when the backup was made, so any data that changed since then (such as for add-ins) will be lost and have to be recreated.

So, for that last bullet point: Cloudberry saves its information to the C: drive so after the restore I did a repository sync to make sure it was all up to date. But the restore itself worked fine. All my backup plans and repositories were still configured.


I don’t have any other add-ins, but if there were any that maintain data on drive C:, that data would need to be refreshed.

The bottom line – I’m happy to know the server backup actually works. Not that I doubted Microsoft (no, really) but nice to know it works with my hardware with my configuration.

Annual Backup Strategy Review

It’s been over a year since I last reviewed my backup strategy. I’ve also rebuilt my PC twice in the last month, so combined with some other changes it’s a good time to review my backup strategy and the tools I use.

My Backup Philosophy

I’ll start off by repeating my backup rules:

  1. A file doesn’t exist unless it exists in at least three places.  (Was two places last year)
  2. RAID (or RAID like technologies such as Drobo or Windows Home Server) does not mean the file exists in two places.
  3. To be truly protected, two of the three places must be geographically separated.
  4. The backup has to be automatic and unobtrusive. I’m lazy, if I have to manually initiate the backup it won’t happen.

Software Used

The following software provides backup services for me. I keep my files in a central location on my Windows Home Server, so PC backups are not so critical for me. In fact, with my two recent PC rebuilds, both planned, I rebuilt the OS from scratch because I wanted to make changes. While I did restore some application configuration files I don’t restore any data because there is none.

Windows Home Server

In addition to being where my files live, Windows Home Server provides the backup service for my Windows PCs. I’ve occasionally restored individual files or directories and it’s worked well.

I’m currently running both Windows Home Server Version 1 along with the Version 2 (Vail) beta. But because I’m now backing up to beta software I no longer consider these backups reliable, just nice to have.

I keep file duplication on for all shares. As I mentioned, this isn’t a backup, but it does provide redundancy to keep the files available should a drive fail so that I don’t need to go to the backup just to get the server running.

KeepVault for Windows Home Server

I’ve changed the way I’ve used KeepVault since my first review of KeepVault back in May, but I still use it and I suspect I will be buying even more storage. KeepVault backs up my files to the cloud, giving me offsite backup. It meets my requirement of allowing me to pick my own encryption key that they don’t have (assuming I believe them when they say they don’t have it, which I do). But some features of KeepVault keep it from being an all-around offsite solution for me.

  • There’s no archival history for files. If a file is changed it overwrites the previous backed up version.
  • Files are never deleted (unless done manually). Once a file is backed up it remains unless I go in and delete it. For directories where I add/delete/move a lot of files this would become a space problem.

So I’ve moved to using KeepVault for backing up files that don’t change a lot but take a lot of space, such as music, software, and some videos.

Jungle Disk

This is my primary offsite backup software. I’ve been using Jungle Disk since its inception and because of that I have been grandfathered into a slightly lower monthly subscription fee than what’s currently available.

Jungle Disk backs up to either Amazon S3 (and the Amazon S3 charges apply) or Rackspace Cloudfiles (and Rackspace charges apply). I’ve moved over to Rackspace as I’ve found it to be reliable. I did stay with Amazon S3 long after Rackspace became available so they could work out the bugs.

I run Jungle Disk on my Windows PC but back up the shares on my Windows Home Server.


ChronoSync

ChronoSync is a Mac app that can be used to move files around and, as the name suggests, synchronize folders. I have a Drobo attached to my Mac Mini and I use ChronoSync to make a copy of files on my Windows Home Server to the Drobo. This gives me my second local copy of these files.


DropBox

I pay for a 50GB DropBox account along with the “Packrat” option that saves deleted files and previous file versions. I mainly use this as a way to share files between my PCs, phone, and iPad, and to make files available via the web. But the offsite storage allows me to use it for some backups.


WinSCP

I use WinSCP to back up my web server using a script run as a scheduled task.
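The post doesn’t show the script itself, but a minimal sketch of such a WinSCP setup might look like this (the hostname, paths, and host key here are hypothetical):

```
# backup-site.txt -- WinSCP script (hypothetical host and paths)
option batch abort
option confirm off
open sftp://backupuser@example.com/ -hostkey="ssh-rsa 2048 xx:xx:..."
# mirror the remote web root down to a local folder
synchronize local C:\Backups\website /public_html
exit
```

A scheduled task would then run something like `winscp.com /script=backup-site.txt /log=backup.log`, so the sync happens unattended and leaves a log to check.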

OS X Time Machine

I have even less true data on my Mac than I do on my PC. Almost everything is saved on the Windows Home Server or on the attached Drobo. If it’s on the Drobo it gets copied up to the Windows Home Server at night.

Still, I use Time Machine to get setting changes and to allow a quick rebuild if necessary.

How The Backups Happen

Purchased Music & Purchased Video

I have a share on my Windows Home Server called Archive. I have a ChronoSync job that copies purchased music and video from my Music library to the Archive share. From there KeepVault backs up any new files. Since I rarely delete these files the lack of KeepVault deletions is actually a benefit as an unnoticed accidental deletion from my library can be restored when it’s finally noticed.

I only do this for music and videos I’ve purchased online and downloaded.

Ripped Music and Video

This is by far my most problematic set of files since there are around 3 TB worth of files. For these I use Robocopy and some batch files to synchronize them with a set of hard drives I have. I keep these hard drives in the office so they are out of the house. They aren’t critical data so a delay in getting them may be annoying, but not a real problem.

My third copy of these files is the original physical media packed away in boxes. It would be annoying and time consuming to re-rip them. I’ve accumulated enough older hard drives I may be able to set up a second set of hard drives with a copy to keep in the house.

I have had to restore these files due to a past server failure and the robocopy strategy does work.

Web Server

I run a scheduled task that fires off a WinSCP script to download my website files to a folder in my DropBox. This gives me a local copy across PCs along with one in the cloud. The DropBox Packrat feature also keeps old file versions available should I need them.

Windows Home Server & Dropbox Data

I keep copies of software I purchase or use in a share on the WHS. If I received the software on CD/DVD I make an ISO file of the disc and save it to the server. These files get backed up to KeepVault. While I may eventually want to delete these files (something I need to do manually with KeepVault) I want to keep most software long after I stop using it, just in case.

Everything else gets backed up with Jungle Disk. I have a user share with my critical (mostly financial and records) data and that gets backed up to both Rackspace and Amazon S3. Both of these services use an encryption key I specify and only I know.

Since space is money I’ve changed the time to keep replaced/deleted files from the default 60 days down to 10 days. My main concern is accidental deletions and hopefully I’d notice within 10 days.

As for the second local copy of these files – the DropBox files duplicate themselves. I use ChronoSync to copy the remaining WHS files to my Drobo as a local backup.

Microsoft Kills the Sidekick

This story has been kicking around a couple of days – T-Mobile Sidekick users appear to have lost all their personal data. Sidekick is a Microsoft product (through Danger, which they bought) and the data is on their servers, not T-Mobile’s.

Now the rumor of choice is that it was a failed SAN upgrade and Microsoft didn’t have a backup strategy so there weren’t any backups. It’s hard to imagine this happening at a company like Microsoft. We don’t expect them to be run like a fly-by-night operation. Yet they didn’t have backups of their customer data, the data their business was built around. Then, without backups, they brought in a 3rd party vendor to upgrade the SAN that all that data was on. For some reason it went bad. But instead of being an annoying outage while a restore was done it’s turning out to be a disaster.

They’re still trying to recover some of the data but it doesn’t seem promising. T-Mobile is telling customers not to turn off their Sidekicks or let their batteries run out, in the hopes that the data that’s on them can be saved. T-Mobile has also stopped selling the Sidekick, at least temporarily.

Microsoft paid a reported $500 million for Danger in early 2008. Now it appears they’ve come close to destroying it.

Reviewing My Backup Strategy

The days of using a few floppy disks to back up important files are long gone. In going through my website I realized that the information about my backup strategy was a bit dated and didn’t reflect how I do things. Going through everything in order to write this article would also help me be sure I had a bulletproof backup strategy. The goal is to have a solid backup solution so I never have to use a file recovery tool like Data Rescue II. So here goes…

My Backup Philosophy

My backup philosophy still hasn’t changed:

  1. A file doesn’t exist unless it exists in at least two places.
  2. RAID (or RAID like technologies such as Drobo or Windows Home Server) does not mean the file exists in two places.
  3. To be truly protected, the two places must be geographically separated.
  4. The backup has to be automatic and unobtrusive. I’m lazy, if I have to manually initiate the backup it won’t happen.

The third and fourth points can be a problem for large groups of files such as my music and video libraries, which total over 5TB of data. But I’ll deal with that later.

I also like to have two copies of the file locally (including the “live” copy) to allow quick restores, although this isn’t possible for my large video library.

Backup Software (and Services) Used

I make use of the following software and services in my backup strategy:

Jungle Disk (with files saved to Amazon S3) – This is my primary offsite backup software. I like it because it’s cross platform (Windows, OS X, Linux). Since pricing is based on usage, it costs me a bit more than the $5/month that’s typical of many other backup plans. But it’s been reliable so I consider it money well spent.

Windows Home Server – I have file duplication enabled for all files on the server. The WHS software includes backing up PCs and this is my primary in-house backup for my Windows PCs.

Jungle disk and Windows Home Server are my primary backups tools but the following software also helps out.

Drop Box – Another free service (for 2GB of space). Drop Box allows files to be synced between PCs. I primarily use it to sync settings between computers along with some files, but in addition to putting the files on multiple PCs, copies are also saved on the web, so it counts as offsite backup. As an added bonus it saves archives of deleted and changed files for 30 days.

I use ChronoSync to sync files between my Mac and my Windows Home Server as a way of having local backups. On the Windows side I use Microsoft’s SyncToy to sync between my Windows PCs and my WHS.

MozyHome Free – I use the free version of Mozy to backup some Windows files. I used to use the paid version on my Mac but after some serious problems with Mac Mozy I dropped the subscription. I use the free service for three reasons: it’s been reliable on Windows, it’s free, and it keeps historical copies of changed/deleted files for 30 days. It’s that last item that’s important. There are a few files that I want to keep historical copies of for a while. With Jungle Disk I’d be charged for the space these historical copies use and I’d have to set up a separate backup in order to treat the files differently. The reality is I’ll probably never restore archive copies from my offsite backup, after all I have Windows Home Server. But Mozy’s free and unobtrusive so I’ve kept it.

I also use Apple Time Machine and SuperDuper! (I hate that exclamation mark and will drop it from this point on) on my Macs.

Jungle Disk and Mozy share a security feature I want in my offsite backup. I supply my own encryption key and the files are encrypted on my local computer before being sent to the “cloud”. Neither Jungle Disk nor Mozy has a copy of my encryption key, so if anyone gets access to my files they still can’t decrypt them.

The offsite backups exist for those cases I really hope never happen – from natural disasters and fires to someone breaking in and taking my computer and nearby drives. Except for those cases I’ll be counting on the local backups.

iMac Backup

My iMac (running Snow Leopard) has my iTunes music library and an iPhoto library, although both are on an external Drobo drive connected to the iMac. While providing RAID-like protection, as point 2 above states, that’s not a backup. Other data files are either synced via Drop Box or saved on my Windows Home Server. I really don’t save miscellaneous data to my iMac anymore; files get saved directly to the server. But any miscellaneous files would be saved to my Documents folder, which Time Machine would catch.

I use Time Machine to back up my iMac’s system drive. There’s not much that changes, mainly settings, so I created a 500 GB partition on my Drobo. This will limit Time Machine to 500 GB as a maximum, but because of the way Drobo works with multiple partitions it only uses the space actually needed by the files. At this point that’s well under 100 GB.

Every night SuperDuper clones my system drive to a second partition on the Drobo. This is really a holdover from the days when it was cloned to an external drive that I could boot off of in an emergency. It did come in handy during my recent iMac rebuilds, making it easy to find and restore an app’s settings one at a time. If disk space were an issue, or I didn’t already have it set up, I wouldn’t be doing this backup anymore.

As for my offsite backups, which are done using Jungle Disk, I just back up:

  • Documents folder (which has very few files)
  • A software archive folder which has copies of any Mac software I paid for or is critical even if it’s a free download.
  • My Preferences folder
  • My Application Support folder since some applications may put data in these folders

All of this totals about 2.5 GB, most of which is the software.

As for my music and photo libraries which are on my Drobo drive – ChronoSync runs every night and syncs the libraries up to a drive on my Windows Home Server so they’re duplicated. ChronoSync takes care of any adds/deletes/changes automatically.

None of this requires me to do anything beyond the occasional check to make sure things are still working, which is perfect.

Windows 7 Machine Backup

Back when my iMac issues started my Windows 7 PC became my primary PC so that I’d have time to rebuild and test my iMac. I’ve come to really like Windows 7 so my home-build Windows 7 machine has become a daily worker.

The PC is backed up daily using Windows Home Server backup. I have it set to keep the last 10 daily backups, the last 52 weekly backups and the last 12 monthly backups. Yea, there’s a lot of overlap there but since Windows Home Server is efficient about disk space usage it’s not a huge disk hog. I do exclude one directory that I use for temporary storage of large files that I don’t need backed up (such as DVD images). Windows Home Server backup keeps just one copy of the file and the other backup sets just point to the original. So while my backup may be 84 GB it’s not 84 GB for each of my 74 saved backups (10 + 52 + 12).
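That single-instance behavior is why 74 overlapping backup sets don’t multiply disk usage. Here is a toy illustration of the idea (my own sketch, not the actual WHS implementation, which works at the disk-cluster level rather than whole files):

```python
# Toy single-instance store: backup sets keep only hash references to
# file contents, so identical files across many sets are stored once.
import hashlib

store = {}        # content hash -> file bytes (stored exactly once)
backup_sets = []  # each backup set is a list of (filename, hash) refs

def take_backup(files):
    refs = []
    for name, data in files.items():
        digest = hashlib.sha256(data).hexdigest()
        store[digest] = data          # re-storing identical data is a no-op
        refs.append((name, digest))
    backup_sets.append(refs)

files = {"big.iso": b"x" * 1000, "notes.txt": b"hello"}
take_backup(files)
take_backup(files)  # a second backup set adds no new stored content
print(len(backup_sets), len(store))  # 2 backup sets, but only 2 stored blobs
```

Run daily, the large unchanged file would appear in every backup set yet occupy space only once, which matches the 84 GB backup not costing 84 GB per saved set.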

Restores are straightforward through the Explorer-like interface. I’ve never done a full restore using a WHS backup since I’ve always wanted a fresh OS install, but I’ve never had a problem restoring from the WHS backup.

I use Jungle Disk to backup important files on this PC along with network files that are important. These get saved on Amazon’s S3 service. I don’t have Jungle Disk set to save archive copies of changed/deleted files, that’s what WHS is for. One backup job is set to backup local files from the PC, another job is set to backup files from my Windows Home Server (via shares). Since I save most data (and anything important) to my WHS the network backup is by far the largest of my Jungle Disk backups and is currently about 20GB. This offsite backup includes my photos but doesn’t include music or videos.

There are backup solutions for Windows Home Server but I prefer to do it via share using my existing backup solution. I can manage backups from my PC and it takes the processing load off the server. The only issue I have with Jungle Disk is that every time I reboot my Windows PC I have to go into Jungle Disk and “test” the network connection, otherwise the backup won’t connect to the shares. It’s frustrating since I don’t actually have to change anything, just click the test button for one share and it fixes the problem for all shares.

I have Jungle Disk send me a daily status email with the backup results so if any backups are missing or show errors I can fix them.

Website Backups

In the past I had used Transmit to synchronize the files from the server to my iMac using SFTP. When my iMac went bad I switched to using WinSCP on my Windows 7 PC. The theory is the same on Windows as it is on my Mac, just different software. I already wrote about how I schedule the WordPress database backups on the server so the database gets backed up to my local PC too.

WinSCP syncs the files down to my Windows 7 PC. From there SyncToy is used to make a local copy to a Windows Home Server share. The Windows Home Server backup also backs them up as a part of the regular PC backup. This regular backup also keeps a year’s worth of archive copies.

Jungle Disk also includes the web files when it backs up the files from the PC, this gives me my offsite backup (although considering the web server itself is offsite this may be redundant – but I’m paranoid).

The Big Backup

My music and video libraries are just too big (over 6 TB) to backup over the Internet. So to back those up and keep them offsite I copy the files to older hard drives and keep them at the office.

I have an eSATA docking bay attached to my Windows PC. I plug bare SATA drives into this toaster-like dock and use robocopy to copy files to the drives. Once a video is added to my iTunes library it doesn’t change when I watch it, so these files only show as modified if I change the metadata or re-encode the video. I just created batch files that use robocopy to copy files within a date range that will fit on the drive. I have one batch file per drive, so when it comes time to refresh the backup I just rerun the file. An example robocopy command line is (all one line):

robocopy \\server\videos f: *.* /S /COPY:DAT /MINAGE:20081129 /MAXAGE:20080908 /PURGE /XA:H

This copies all files from my video share to the drive plugged into the dock. The /S copies subdirectories. The /COPY:DAT says to copy data, attributes and timestamps, which are supposed to be the defaults, but I had some issues in testing so I specify them. /MINAGE says not to copy files with this date or newer (so in the example, files dated Nov 29, 2008 or later are not copied). /MAXAGE says not to copy files older than this date (so in this example, files dated Sept 8, 2008 or later are copied). /PURGE will delete any files within the date range that are no longer on the server. The /XA:H switch says to exclude files with the hidden attribute, such as the Thumbs.db files Windows likes to create since I put DVD art in the directories.
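The date window those two switches define can be sketched like this (my reading of the robocopy semantics, using the dates from the example above):

```python
# Sketch of the /MAXAGE-/MINAGE window from the robocopy example: a file
# is copied only if its date falls on or after the /MAXAGE date and
# before the /MINAGE date.
from datetime import date

MAXAGE = date(2008, 9, 8)    # files older than this are skipped
MINAGE = date(2008, 11, 29)  # files this date or newer are skipped

def in_window(file_date):
    return MAXAGE <= file_date < MINAGE

assert in_window(date(2008, 10, 1))       # inside the range: copied
assert not in_window(date(2008, 11, 29))  # at /MINAGE: skipped
assert not in_window(date(2008, 9, 7))    # older than /MAXAGE: skipped
```

Each per-drive batch file just uses a different, non-overlapping window, so together the drives cover the whole library by date added.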

One thing I need to keep in mind is that I need to spin up the disks at least a couple of times a year. I started off with 500 GB drives for the backups since that’s mostly what I had as spares, or they had the best price per GB at the time I bought them. After removing the drive I start the backup process over again and back up to all the drives, which spins them up and rewrites the data. I put the new drive at the beginning of the backup so a drive isn’t overwritten until its files have already been backed up.

This process can take a couple of days. While the file copies do take a lot of time most of the delay is because the current drive never seems to finish when I’m around to swap it quickly. So there’s usually a few days when my backups are in the same location as the originals but there’s only so much I can do on a budget.

I keep the latest drive in the dock and backup new videos to it. At this point they aren’t offsite but I figure the risk is minor.

My music library gets backed up in a similar manner, although since it easily fits on a 500 GB disk, and I now have two of them available, I just keep swapping the drives between home and the office every month or so, more often if I’ve added a lot of new music.

Due to the size of my video library I only have one copy of the files locally, but that’s on my Windows Home Server with file duplication enabled. I’ve considered turning off file duplication and moving some of the drives to a different computer so I could have a true backup. But I’d need to build a second server (or have multiple locations) to handle 6 TB and a restore would take a long time. I’d rather have file duplication on so as to handle a bad drive, rather than have to do a file restore when a drive goes bad. Especially since I couldn’t be sure what files were on that bad drive. If the entire server craters the delay needed to get the files from the office is small compared to the amount of time to rebuild and restore. I’ve already had one bad drive which had minimal impact thanks to file duplication.

Misc Backups

My MSI Wind netbook runs Windows 7 and gets backed up using the Windows Home Server backup. I really don’t save files locally on the Netbook but if I do they get saved to my Drop Box folder so get synced to other PCs and saved in the Drop Box “cloud”.


Windows Home Server is the main component of my Backup Strategy. It handles the daily backups along with weekly and monthly archives. Although earlier versions were buggy the current version has never given me trouble when it’s time to restore a file. Jungle Disk handles my offsite backup needs.

I do have one hole in my backup strategy, namely I have no backup for my email which is through Google. At one time I simply copied information I needed and lost email meant nothing. Eventually I’ll have to plug that.

Backup Paranoia Pays Off

I’ve always been paranoid about backups; a file doesn’t exist unless it exists in two places. And then there are off-site backups, which add a third place for important or expensive (time & money) files. Things like RAID and file duplication (in WHS) don’t count as two places. If a RAID or disk controller failed, the files could be lost. RAID and file duplication are good against drive failures and allow you to keep working, but they aren’t a replacement for backups.

This week I got an object lesson that proved the paranoia was justified. I have over 3 TB of files on my Windows Home Server. Due to their size I don’t use file duplication for most things, but I do back everything up. I was adding external drives to get more space.

Long story short – that new external rack was bad, but not so bad it wouldn’t run, and I’m now restoring the entire server from backup. Even files that were duplicated were corrupted since both copies (probably) ended up on drives in that external rack. It appeared that files on the original 4 internal drives were OK but I decided to do a complete server rebuild anyway. I’d had so much corruption and other problems that I decided to flatten the server and restore the files. So, after about a day, I’m about 1/3 of the way through the restore. Luckily I had recently bought some cheap hardware to make the restore process a little faster than it would have been.

The irony of this was one reason I was adding the drives was to enable file duplication for more files and reduce the risk of having to do a full restore since it would be so time consuming. I was also replacing a dual USB drive since I was worried USB would be a little unreliable. So after trying to improve things I’m back to the USB which had been reliable for the month or so I was using it.

Now it’s time to get back to checking those file restores.