NetApp: The Originator of Software-Defined Storage (Who Knew?)

By Dr. Mark Bregman, SVP & Chief Technology Officer, NetApp

 

It is a real privilege for NetApp to join the Open Compute Project (OCP) program this year as its only storage and data management systems provider. It also completes a circle for NetApp as a leading voice for the role of software in an open hardware ecosystem.

 

As founder Dave Hitz has often noted, NetApp at its inception 25 years ago was originally a software company, and we invented what became known as Software-Defined Storage (SDS). At the time, we got into the business of hardware customization because, as we scaled our systems in performance, capacity, and reliability, we encountered limitations in standard servers, networks, and storage media. As a result, we got very good at both the hardware and software sides of the business, which enabled us to deliver highly reliable, fast, and simple storage solutions.

 

Remarkable innovations arose from our development of enterprise storage and data management systems, each an important contribution in its time. For example, managing vibration within a rack of spinning disks has been a big deal for our data center customers, and we happen to be especially good at it. Now, with the emergence of the all-flash data center, that issue is going away.

 

On the other hand, our accumulated expertise in storage virtualization, hypervisor optimization, cloud and hybrid cloud computing, hardware abstraction, and managing data wherever it resides is more relevant today than ever. That’s especially true in the context of designing and building optimized and sustainable data centers for the future.

 

I mentioned the introduction of all-flash storage in the data center, which is one important factor in data center design. Other factors and trends include changes in workloads and data types, the growing role of the cloud, and the emergence of data as currency in the digital economy. Amidst these trends, we’re witnessing the arrival of commodity hardware that’s now capable of fulfilling some of the needs of the modern data center.

 

With the gap closing between what the data center needs and what commodity hardware can deliver, companies like NetApp are free to provide value through the delivery of software and data management services. It’s an exciting time. While we continue to provide the best engineered systems on the market, our software advantages are coming to the fore.

 

This is especially true in the context of the OCP data center as originally conceived by Facebook and joined by Intel, Goldman Sachs, Rackspace, Microsoft, Apple, Cisco, Juniper Networks, Nokia, Lenovo, Google, and others. In an OCP environment, NetApp software runs better than anyone else’s. Our Data Fabric gives IT organizations ubiquitous control of all their data, no matter where it’s physically located and no matter what applications are creating or accessing it.

 

NetApp has several software products that are ideal for open computing environments. SolidFire Element OS software lets customers purchase the hardware for a shared-nothing scale-out storage system separately from the software that runs it. Likewise, ONTAP Select software functions separately from the underlying hardware. AltaVault and StorageGRID Webscale are specialized NetApp solutions that function at the Hardware Abstraction Layer (HAL) and are available in both integrated and software-only versions, as well as in hyperscale clouds.

 

With the ability of our software to enable the data-driven enterprise, we see the virtualized storage system as effectively a “data operating system.”

 

Evolution from Hardware-focused, to Software-defined, to Data-defined Storage

 


When we started NetApp, we quickly evolved to deliver engineered systems that combined commodity and custom hardware with innovative software. Our emphasis on software capabilities continues as data assumes an increasingly prominent role in enterprise IT.

 

As enterprise customers, hyperscale cloud providers like Azure and AWS, service providers, systems providers, and technology partners engage in designing and building more sustainable data center solutions, we at NetApp embrace the OCP standard. Our history, expertise, and the NetApp Data Fabric approach position us better than any other provider to enable the data-driven enterprise in the context of open computing.

Comments
Member

Well put draft. Thanks for writing.

Great post, Mark!

 

I think a key advantage of Data ONTAP over the years, and especially in terms of clustered Data ONTAP, is the manner in which all features work seamlessly with each other, and all support non-disruptive operations. Just three examples from the data protection portfolio alone:

 

  • A hardware upgrade for a system running MetroCluster synchronous replication can be performed non-disruptively.
  • SnapLock WORM storage fully supports storage efficiency at the block level. 
  • If you set up a data protection policy to replicate from one node to another in a cluster, you can move the source and target volumes during replication.
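 

On that last point, here is a minimal sketch of roughly what that can look like from the clustered Data ONTAP command line. The SVM, volume, and aggregate names are placeholders of mine, not anything from a real configuration:

    # Assumes vol_dst already exists on the destination as a data protection (DP) volume
    snapmirror create -source-path svm1:vol_src -destination-path svm1:vol_dst -type DP
    snapmirror initialize -destination-path svm1:vol_dst

    # Even while that relationship is replicating, the source volume can be moved
    # to a different aggregate without disrupting clients
    volume move start -vserver svm1 -volume vol_src -destination-aggregate aggr2

The volume move runs in the background while SnapMirror keeps replicating, which is exactly the non-disruptive behavior I was getting at.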

 

I actually wrote a 5-part series of blog posts on the subject back in 2013, when SDS was just slideware for everyone else:

 

Part 1: https://community.netapp.com/t5/Technology/Software-Defined-Been-There-Done-That-Part-1/ba-p/84431 

 

New Member

Nice post Mark, well done. While OCP is truly about draining as much cost as possible out of hardware, and therefore having software install on just about any x86 box, I'd contrast that with Software Defined. If you look at this year's VMworld keynote, they point back to the original quote on Software Defined from Raghu Raghuram, SVP of SDDC at VMware: "Increasingly, all the infrastructure components that we know of will be developed and deployed. Even more importantly, the control of this data center is entirely done by software. The data center is on its way to becoming programmable and automated." VMware, 2011.

 

So I gather from that the real story in Software Defined is capabilities, programmability (APIs), and automation. For some reason that takes a backseat to software installing on any old x86.