Friday, July 31, 2009

DeDuplicating Data Backups

I have been evaluating disk-based backup solutions and did some fairly intense research on the products that meet my requirements.

Now, this is not a comparison of all the de-dupe products out there - only the ones I qualified as meeting my requirements. And, speaking of requirements, here they are:

  • Should be able to scale to at least 100 terabytes
  • Should be able to have a global de-duplication engine that looks over all of the 100+ TB
  • Should be able to integrate into Symantec's NetBackup OST - if you are not sure what OST is, I will discuss that in another post.
  • Should be able to replicate only the changed blocks with compression
  • Should be able to de-dupe daily backups of at least 50+ TB within hours and have that offsite within 24 hours
  • Should be able to make use of high density SATA drives as backend
  • Preferably be storage agnostic (not a major requirement)
  • Should be able to withstand appliance failures and still do the intended work
  • Able to replicate over any medium, i.e. FC or IP
  • No/minimal impact on restores, i.e. no performance penalty for re-hydrating (un-deduping) data
  • Dynamic capacity addition and maintenance
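Several of these requirements (replicating only changed blocks, re-hydrating on restore, a global index across all capacity) come down to the core idea of block-level de-duplication: store each unique block once, and keep only references for repeats. Here is a minimal sketch in Python - fixed-size chunks and SHA-256 fingerprints are my own assumptions for illustration, not any vendor's actual engine (commercial products typically use variable-size chunking):

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative fixed chunk size

def dedupe(data: bytes, store: dict) -> list:
    """Split data into fixed-size blocks; store each unique block once,
    keyed by its SHA-256 fingerprint. Returns the list of fingerprints
    (the 'recipe') needed to rebuild the original stream."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        fp = hashlib.sha256(block).hexdigest()
        if fp not in store:       # new block: keep the data
            store[fp] = block
        recipe.append(fp)         # duplicate: keep only a reference
    return recipe

def rehydrate(recipe: list, store: dict) -> bytes:
    """Reassemble the original stream from its fingerprints (a restore)."""
    return b"".join(store[fp] for fp in recipe)

# Two daily backups that share most of their content:
store = {}
day1 = b"A" * 8192 + b"B" * 4096
day2 = b"A" * 8192 + b"C" * 4096   # only the last block changed
r1 = dedupe(day1, store)
r2 = dedupe(day2, store)
# Six logical blocks were backed up, but only three unique blocks are stored,
# so only the single changed block would need to cross the replication link.
assert len(store) == 3
assert rehydrate(r1, store) == day1
assert rehydrate(r2, store) == day2
```

A "global" de-dupe engine is one where that fingerprint store spans every appliance and all 100+ TB, rather than each box keeping its own private index.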
I will add more to the post with the various vendors that I looked at, their features and more soon...
