
Backup strategy for work environment - suggestions?

Joemonkey

Diamond Member
The person currently in charge of backups is moving on to another area, and I'm being moved into that role.

I'm told I can redo everything if I want as far as schedules and such, but it's been quite a while since I've messed with backups, and when I did I was only backing up something like 500GB total weekly. Currently we use BrightStor ARCserve, which seems like a PITA to me since I'm used to Commvault. Every day I check the daily log and put benchmarking information (MB/min, total size, total time, etc.) and which tapes were used into an Excel spreadsheet. I'd like to change this into a report I can run from the software, but the reporting is also a PITA to customize.
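In the meantime I've thought about scripting the spreadsheet step instead of typing it in by hand. Something like this sketch (Python; the log line in the comment is made up, I'd have to adjust the regex to whatever ARCserve actually writes):

```python
# Sketch only: parses a hypothetical daily log line like
#   JOB 14  FULL  MAIL01  523,412 MB  elapsed 04:12:33  tapes A00031,A00032
# Real ARCserve logs won't look like this; adjust the regex to match yours.
import csv
import re
from datetime import date

LINE_RE = re.compile(
    r"JOB\s+\d+\s+(?P<type>\w+)\s+(?P<server>\S+)\s+"
    r"(?P<mb>[\d,]+)\s*MB\s+elapsed\s+(?P<h>\d+):(?P<m>\d+):(?P<s>\d+)\s+"
    r"tapes\s+(?P<tapes>\S+)"
)

def parse_log(path):
    """Pull size, elapsed time, MB/min, and tape IDs out of one daily log."""
    rows = []
    with open(path) as f:
        for line in f:
            m = LINE_RE.search(line)
            if not m:
                continue
            mb = float(m.group("mb").replace(",", ""))
            minutes = int(m.group("h")) * 60 + int(m.group("m")) + int(m.group("s")) / 60
            rows.append({
                "date": date.today().isoformat(),
                "server": m.group("server"),
                "type": m.group("type"),
                "total_mb": round(mb, 1),
                "minutes": round(minutes, 1),
                "mb_per_min": round(mb / minutes, 1) if minutes else 0,
                "tapes": m.group("tapes"),
            })
    return rows

if __name__ == "__main__":
    rows = parse_log("daily_backup.log")   # log path is a placeholder
    if rows:
        with open("backup_benchmarks.csv", "a", newline="") as out:
            writer = csv.DictWriter(out, fieldnames=list(rows[0].keys()))
            if out.tell() == 0:
                writer.writeheader()
            writer.writerows(rows)
```

Even if I keep Excel, appending to a CSV like that would at least kill the daily copy/paste.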

We are using an LTO3 tape library with 2 drives and no compression turned on. The backup schedule has all email and application servers getting a full backup 7 days a week; they total about 1TB. The main file server gets a full backup on Friday and incrementals Mon-Thur. A full file server backup is about 2.5TB and takes ~45 hours to run.
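For reference, here's the rough math on that 45-hour full (quick sanity check, assuming the whole 2.5TB streams through both drives):

```python
# Rough numbers only; assumes the full 2.5 TB is spread across both LTO-3 drives.
full_backup_tb = 2.5
hours = 45

total_mb = full_backup_tb * 1024 * 1024          # ~2,621,440 MB
actual_mb_per_s = total_mb / (hours * 3600)      # ~16 MB/s overall

lto3_native_mb_per_s = 80                        # rated native speed of one LTO-3 drive
aggregate_rated = 2 * lto3_native_mb_per_s       # ~160 MB/s with both drives streaming

print(f"actual throughput : {actual_mb_per_s:5.1f} MB/s")
print(f"rated (2 drives)  : {aggregate_rated:5.1f} MB/s")
print(f"utilization       : {actual_mb_per_s / aggregate_rated:.0%}")
```

If that's anywhere close, the drives are sitting mostly idle, so I suspect the bottleneck is the file server's disks, the network, or tons of small files rather than the tape hardware itself.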

Every Monday I take the Friday full backups, which end up being ~8 tapes every week, to a vault in another building on campus. These tapes are rotated back in as a scratch set about every 5 weeks. Monthly tapes get taken offsite to another facility ~30 miles away and are never brought back.
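If it helps picture the rotation, here's a toy model of it (the set labels and the "first Friday of the month becomes the monthly set" rule are my own simplifications, not how the tapes are actually labeled):

```python
# Toy model of the rotation described above; labels and the monthly rule are made up.
from datetime import date, timedelta

WEEKLY_SETS = 5   # weekly sets come back from the campus vault as scratch after ~5 weeks

def plan(first_friday: date, weeks: int):
    for i in range(weeks):
        friday = first_friday + timedelta(weeks=i)
        label = f"WEEKLY-{i % WEEKLY_SETS}"
        monthly = friday.day <= 7            # crude "first full of the month" rule
        dest = "offsite, never returns" if monthly else "campus vault, back as scratch in 5 weeks"
        print(f"{friday}  {label}  -> {dest}")

plan(date(2008, 1, 4), 8)   # 2008-01-04 was a Friday; start date is arbitrary
```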

What would be the best way to increase backup efficiency? As far as things I can do that do not cost anything, I thought about turning on compression and running some benchmarks to see how much extra time it adds. Not much else I can think of.

As far as throwing $ at the problem, I was thinking of either getting another NAS server dedicated to backups, so everything gets backed up to disk first and then written to tape from there, or getting an LTO4 library and tapes, but that is talking lots of $$.
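To put rough numbers on the LTO4 option (published native specs, no compression, so take it with a grain of salt):

```python
# Published native (uncompressed) LTO specs; real capacity/speed depend on how compressible the data is.
lto = {
    "LTO3": {"capacity_gb": 400, "speed_mb_s": 80},
    "LTO4": {"capacity_gb": 800, "speed_mb_s": 120},
}

friday_full_gb = 3.5 * 1024   # ~1TB mail/app servers + ~2.5TB file server

for gen, spec in lto.items():
    tapes = -(-friday_full_gb // spec["capacity_gb"])               # ceiling division
    hours_one_drive = friday_full_gb * 1024 / spec["speed_mb_s"] / 3600
    print(f"{gen}: ~{int(tapes)} tapes, ~{hours_one_drive:.1f} h on one drive at rated speed")
```

So LTO4 would roughly halve the weekly tape count, but since we're nowhere near even LTO3's rated speed today, I'm not convinced faster drives alone would shrink the 45-hour window. That's part of what makes the back-up-to-disk-first idea appealing.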

Any suggestions would be greatly appreciated.

EDIT: Oh, and if this is the wrong forum, just move it; I wasn't sure which forum to post this in (we really need a network admin forum for things like GPOs, AD, backups, etc.). The only reason I chose this one was that I found this thread in here and it wasn't locked or moved.
 
Just got word from my boss that we have about $32k in the budget for backup system hardware upgrades. This does not include tapes as they are overhead.
 
Originally posted by: Joemonkey
What would be the best way to increase backup efficiency? As far as things I can do that do not cost anything, I thought about turning on compression and running some benchmarks to see how much extra time it adds. Not much else I can think of.

Backup efficiency really comes down to two things: reducing the amount of data you're backing up and increasing the speed at which you can back it up. If possible, try to find where your bottlenecks are on your network/SAN, and also look for ways to reduce the amount of data to back up through policy, schedule, software, etc. I do want to caution you against focusing too much on "backup efficiency," since the real goal of the whole DR process is "recovery/restore efficiency." Try to strike a good balance.

Turning on software compression (which usually runs on top of any hardware compression) has never helped me. It usually has very little impact, and in the absolute worst case it can theoretically even increase file sizes by a tiny bit.
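If you want to see that worst case for yourself, here's a quick test (Python's zlib standing in for whatever algorithm your backup software uses, so it's only illustrative):

```python
# Already-random (or already-compressed) data can't be squeezed further, and the
# compression framing adds a little overhead of its own; plain text compresses fine.
import os
import zlib

random_blob = os.urandom(1_000_000)              # stands in for pre-compressed data
text_blob = b"the same line over and over\n" * 40_000

for name, blob in [("random", random_blob), ("repetitive text", text_blob)]:
    out = zlib.compress(blob, 9)
    print(f"{name:15s}: {len(blob):>9,d} -> {len(out):>9,d} bytes ({len(out) / len(blob):.1%})")
```

Media files, zipped archives, already-compressed database dumps, and the like behave like the first case, which is why the gains are usually underwhelming.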
 
Originally posted by: RebateMonger

There was a discussion of this exact topic about two weeks ago on this same Forum.

Did you read my edit to the OP? I linked to that exact thread. The backup size, hardware, and budget make this thread a very different animal.

Originally posted by: RebateMonger
I'm curious.

What kind of process do you use to test the backups?

People delete things they need at least 1-2 times a week, so we do restores at least that often. As for a full disaster recovery, as in rebuilding a server from system state and such, we test that maybe twice a year.
 
This won't really solve your backup problem, but as for people deleting things they need on a regular basis, what about running a NAS with snapshots? That way you don't need to do restores unless someone deletes something and forgets about it until much later; for the most common cases, users can just reach back and grab the files themselves. Sounds like you're contemplating a NAS already, so you might want to add snapshots to your shopping list.
 
Originally posted by: alpineranger
This won't really solve your backup problem, but as for people deleting things they need on a regular basis, what about running a NAS with snapshots?
Windows Server 2003 and higher already have automatic "Previous Versions" snapshots of files on shared folders. Users can restore previous versions of files, including deleted and overwritten ones, right from their desktops. How far back you can go depends on how much disk space you devote to the snapshots (Volume Shadow Copies).
 
What kind of efficiency gains are you looking for? Can you do BMR (Bare Metal Recovery) currently? Are you looking to shrink your backup window?
 