To Be or Not to Be – The Realities of Data in the Cloud Today

By David Gingell, Vice President of Marketing, EMEA, NetApp


Hamlet, Prince of Denmark, certainly experienced his fair share of angst and self-doubt. I am no Shakespeare scholar, but if the prince's famous soliloquy in Act III is anything to go by, Hamlet was certainly a little mixed up about what to do about himself and his uncle. Hesitation proved to be a killer in the end, and Hamlet's failure to take a sword to Claudius led to the tragedy that unfolded in the play. Perhaps his epitaph should be, "Hesitate and you die."


This could be the mantra of CIOs these days. Hesitation, prevarication and failure to act may prove fatal. Today, CIOs are under tremendous pressure to deliver innovation through IT to their internal clients, and to do so while cutting costs and operating in the most transformative period computing has seen since it began some 70 years ago.


CIOs are bombarded by vendors, consultants and integrators who present their products and services as transformational – usually through some form of cloud paradigm. Their line-of-business executives and their CEOs also demand to know what they are doing in the cloud and how they are going to harness its possibilities to stay one step ahead of their competitors.


If a CIO hesitates, prevaricates, or fails to wholeheartedly embrace the new architectural models offered by the cloud, he or she may be put to the metaphorical sword.


"Do you see yonder cloud..." (Hamlet, Act III, Sc. 2)


Yes, cloud computing holds great promise, and most CIOs embrace it in one or more of its common forms: private, public, or hybrid clouds. Some applications and workloads have so far proven better suited than others to delivery as a cloud service, both by service providers and within enterprises. The most frequent use cases we see include VDI, in-memory data analytics, SaaS environments and core file services.


A secure, multi-tenanted, virtualized infrastructure operated by a service provider offers a platform for many hundreds of customers and their workloads. These are no longer tier 2 applications but mission-critical ones, and they demand the same level of robustness and availability they would have had running in a proprietary, homogeneous data center on bare metal. The CIO expects this. Service providers must therefore address requirements such as nondisruptive infrastructure upgrades, security and data availability.


So the pressure is on, both for the enterprise CIO and in the highly competitive world of the service provider.


"How noble in reason! How infinite in faculty!" (Hamlet, Act II, Sc. 2)


Is moving to the cloud a leap of faith? The simple answer is NO.


Plenty of examples of PaaS, IaaS, and SaaS implementations show that. However, they do tend to be based on a specific architectural cloud model and to have a heavy data-gravity profile. That is, the workload operates in a defined cloud paradigm – public or private.


As a result, it is becoming increasingly obvious that the power of the cloud will be fully realized only if clouds are not silos in which workloads are trapped, but fluid, interoperable environments that allow data to flow seamlessly between them. This is what hybrid clouds are all about.


However, in the excitement of moving workloads from one architecture to another and back again (for example, private -> public -> private), the conversation about the data has largely been overlooked. Let me be specific. Data has gravity. Data in the cloud needs to be governed. Is this a difficult part of cloud computing? Simply put, it is rather problematic.


Moving vast amounts of data to support workloads from one cloud platform to another is quite complex. It requires a universal data platform that can speak multiple cloud protocols and interact with multiple cloud orchestration tools – a data management platform built to handle this, such as NetApp's newly updated clustered Data ONTAP v. 8.2.1 operating system.


For example, data management platforms take into account that as workloads move around from one cloud fabric to another and hybrid cloud technologies (such as VMware's vCHS or Cisco's InterCloud) emerge to facilitate this movement, the data needs to move seamlessly as well. The protocols, access controls and virtualization layers are often different between the different fabrics, and this is where clustered Data ONTAP comes in.


Clustered Data ONTAP allows scale-out operations and consistent data management across both public and private clouds, and it does so nondisruptively. It unifies SAN, NAS, and storage virtualization across the award-winning FAS systems. Alongside the latest version of the OS, we also announced the availability of the FAS8000, the new top-end storage platform, and FlexArray virtualization software. Together, these new solutions allow data to flow seamlessly from public to private clouds and back again.


"This above all: to thine own self be true..." (Hamlet, Act I, Sc. 3)


Improving efficiency and overall control of data is but one need CIOs and service providers must meet. Equally important is controlling cost. In days gone by, when a company looked to ascend to the cloud, a major investment was required to rip out existing infrastructure and replace it with the new systems necessary to operate in a hybrid world. Am I suggesting that all legacy cloud storage systems be ripped out and replaced? Not in all cases. In some, too much has already been invested in storage platforms; that investment needs protecting and the assets sweated further.


Thankfully, to help the CIO and service provider manage cost, FlexArray virtualization software enables existing legacy storage to be managed by Data ONTAP at the software layer. It brings Data ONTAP's advanced storage-efficiency and cloud-integration features to a modern cloud architecture. In fact, we can show a nine-month payback period on the management of some third-party arrays.


CIOs tasked with embracing and exploiting the opportunities afforded by cloud computing paradigms have certainly taken initial steps and have seen significant benefits in cost savings and reduced complexity. However, the big prizes will go to those who adopt hybrid models and can deliver the computing-resource flexibility that is needed today.


NetApp believes that the CIO's role is moving from being a builder and operator of data centers to becoming a broker of services. This will happen conclusively only when the flexibility afforded by hybrid clouds is realized. For this to take place, CIOs and service providers need a universal data platform that is highly scalable and totally resilient. Only NetApp delivers this platform today.

Now is not the time for CIOs to hesitate or prevaricate. The future is here, and CIOs need to embrace it – and avoid, metaphorically speaking, the fate that befell the Prince of Denmark.