March 14, 2012

The State of Storage in 2012

There is a great article in Information Week this week about the 2012 State of Storage that I wanted to comment on.  If you don’t have a subscription, that’s okay – the basic premise is that SSD costs are really starting to drop, and the idea of enterprises using SSD SANs (or higher numbers of SSDs in existing SAN technology) is starting to gain traction.  This is certainly true and will continue to deliver great performance improvements where IOPS are needed.  It’s telling that, to this day, many enterprises, including the five I work closely with, are stuck in traditional models of storage.  This isn’t their fault – these kinds of sea changes take time, and there are obvious risks to upending a trusted SAN solution.  But the writing is on the wall:  traditional massive storage arrays, for both performance applications and archival/compliance/storage requirements, are going to look very different in a few short years.

  1. Three-tiered storage is not going away.  Local SANs are always going to be needed for certain applications.  Particularly in this day of massive BI needs, the faster those IOPS, the better, and SAN solutions built exclusively with SSDs in Tier 1 are quickly going to become the norm.
  2. Structured and unstructured data, to paraphrase from the IW article linked above, are now neck and neck as the leading growth sources in Tier 1.  At Tier 2 and Tier 3, however, it’s heavily unstructured and getting more so daily.  This calls for a different paradigm for those solutions, because of the massive ‘gunk’ growing into the environment – data that you wish you didn’t have to keep, but you do.  So – instead of spending millions on Tier 2 and Tier 3 – let’s consider alternatives.
  3. Here it comes – wait for it… almost there… TO THE CLOUD! Ahhhh – I feel better:

The Cloud is a perfect repository for unstructured data, or data that has long retention policies around it.  It must be understood, however, that the security and integrity of your data are not negotiable.  Wherever your data sits, it must be safe, verifiable, and auditable.  But these constraints do not preclude the use of the cloud – quite the contrary, they call for the cloud – let me explain:

  1. The ‘cloud’ isn’t all about applications and development – it is also about infrastructure, and Microsoft’s cloud has some really robust infrastructure.  There are tools and technologies on the market today that can take Tier 2 and Tier 3 (and even Tier 1 if you wanna get really crazy!) into Azure without significant changes to your infrastructure.  One such example is a tool I’ve been learning about lately from StorSimple.  50 TB of cloud data for $50,000 – that’s pretty cost-effective infrastructure!
  2. Costs will go down and continue to do so – see above, but more: in the past six months, the cost of doing business in Azure has gone down three times.  The reason?  Every time Microsoft hits another one of their scalability targets, they can (and do) reduce prices for everyone.  I’ve never seen a company do that before – pretty impressive.
  3. Security goes up – your data in the cloud can be more secure than on-prem.  Yes, I said it.  Products like StorSimple have attained HIPAA compliance certifications (and that’s saying something!).  The Azure data centers also hold varying degrees of security certifications depending on the services used, including FERPA, ITAR, SAS 70, etc.  When was the last time your data center got all of those?
  4. Integrity and availability are critical – can you PROVE that in the event of a data center loss, your data is safe?  Again, the Azure data centers can – your data, encrypted both in flight and at rest depending on your solution, is stored in at least three separate data centers.  I suspect you can’t do that for $0.12/GB/month on your own (a back-of-envelope cost sketch follows this list).  A tool like StorSimple can also be attractive because the technology it uses behind the scenes can make your data immediately accessible to your secondary data center if the primary one is lost – and you don’t have to pay for that unless you need it.  Not too shabby.
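
To put those numbers in some perspective, here is a minimal back-of-envelope sketch (in Python) using only the $0.12/GB/month geo-redundant rate quoted above.  It deliberately ignores transaction, bandwidth, and appliance costs, and the 50 TB capacity is just the example figure from item 1 – plug in your own numbers.

    # Back-of-envelope cost sketch for Tier 2/Tier 3 data in geo-redundant cloud
    # storage, using the 2012 rate quoted in this post ($0.12/GB/month).
    # Transaction, bandwidth, and appliance costs are ignored - an illustration,
    # not a quote.
    GB_PER_TB = 1000  # decimal terabytes, for simplicity

    def monthly_cost(tb_stored: float, rate_per_gb_month: float = 0.12) -> float:
        """Monthly storage cost for tb_stored terabytes at the quoted per-GB rate."""
        return tb_stored * GB_PER_TB * rate_per_gb_month

    if __name__ == "__main__":
        tier_2_and_3_tb = 50  # the 50 TB example from item 1 above
        per_month = monthly_cost(tier_2_and_3_tb)
        print(f"Monthly:  ${per_month:,.0f}")       # -> $6,000
        print(f"Annually: ${12 * per_month:,.0f}")  # -> $72,000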

Information Week surveys aside – and they are saying the same thing – consider the cloud carefully for Tier 2 and Tier 3, because there are options for you: you can have enterprise data in the cloud that costs less, is more secure, is verifiable, and plugs directly into your existing infrastructure, obviating the need for Tier 2 and Tier 3 to live on-prem in some cases.
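
On the ‘verifiable’ point, here is a minimal sketch of what client-side integrity checking can look like – no particular cloud SDK is assumed, and the helper names are purely illustrative:

    # Minimal sketch of client-side integrity verification for data archived off-prem.
    # No specific cloud SDK is assumed; upload/download happen through whatever
    # transport your archive tooling provides. Helper names are illustrative only.
    import hashlib

    def sha256_digest(data: bytes) -> str:
        """Hex digest to record in your audit log before the data leaves the building."""
        return hashlib.sha256(data).hexdigest()

    def verify_round_trip(recorded_digest: str, retrieved: bytes) -> bool:
        """After retrieval (or a periodic audit read), confirm the archived copy is intact."""
        return recorded_digest == sha256_digest(retrieved)

    if __name__ == "__main__":
        archive_blob = b"quarterly compliance export"
        recorded = sha256_digest(archive_blob)  # store alongside retention metadata
        # ... archive_blob goes out to Tier 2/3 cloud storage and comes back later ...
        assert verify_round_trip(recorded, archive_blob)
        print(f"Recorded digest: {recorded}")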

It’s a good time to be in the cloud!