Back to School Reading: Have You Ever Wondered Where the Technology of the Future Will Come From?

This summer, while you were watching a blockbuster movie, riding your favorite theme-park roller coaster, enjoying the beauty of the outdoors, or flying off for a long-weekend holiday, did you ever ask yourself: How does technology make this possible? Probably not. Most of the time, we take technology for granted. A decade ago, we would have laughed at the idea of what we’re able to do through today’s technology: instantly update friends and family anywhere in the world, stream news or a show from another country on a mobile device, and so on. But with any technology, an idea thought impossible today can become a reality given the right environment.


In an effort to better understand where the future of technology will come from, we talked with Al Lawlis, senior director of Engineering Services for NetApp, about NetApp’s recently inaugurated Global Dynamic Lab in Research Triangle Park, North Carolina, and how it will help make the future a reality.


The first thing we learned from talking with Lawlis was that data center labs are anything but conventional and that, without them, the future could be full of unwanted surprises. A surprise might be good for parties, but not for business.

“Gone are the days of in-house testing within the IT department or typical data center. Technology changes too fast and the future must be planned for, and, most importantly, engineered for,” explains Lawlis.  

“Moving forward, labs and innovation incubators that can change as quickly as the marketplace will play a critical role in creating the next generation of enterprise solutions. Ultimately, the key difference between a data center and a lab is that a traditional data center is generally static from an engineering point of view, whereas a lab is dynamic, fluid, and can be configured to meet any test and development challenge.”

It is with these points in mind that NetApp built the new state-of-the-art, energy-efficient test and development “Global Dynamic Lab,” or “GDL-2.”

For some context, organizations of all kinds are in transition today as business needs change and data growth continues. The advent and ongoing evolution of big industry trends such as the cloud, big data, and software-defined networking are creating new challenges for businesses and IT. For developers and vendors, this means a radical departure from traditional test and development and go-to-market strategies. Because of these sometimes-extreme pressures that businesses face today, the new lab needed to be able to scale and replicate extreme traffic and workloads, as well as support all kinds of future real-world environments.

Preparing for the Extreme

Murphy’s Law reminds us that anything that can go wrong will go wrong, so it pays to hope for the best and plan for the worst. That holds whether you were planning your summer vacation or testing a new solution. Planning and testing against extreme scale and conditions is critical for today’s R&D, and scale and extreme conditions are exactly what NetApp put into GDL-2.

The new 155,000-square-foot facility is built on a high-speed 40Gb/s Ethernet network backbone and includes 2,235 racks of data-processing equipment hosting applications on a shared, tiered virtual infrastructure. The lab employs state-of-the-art automation and self-service capabilities to allow engineers from all over the world to provision and deprovision development and test environments in minutes, dramatically accelerating the development and quality control process through consistent and automated testing. 
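
To make the self-service idea more concrete, here is a purely hypothetical sketch of what an engineer’s provisioning request might look like. The endpoint, template name, and fields are our own illustration of the workflow Lawlis describes, not a documented NetApp API.

```python
# Hypothetical self-service provisioning request: an engineer asks for a
# test environment from a template, and automation handles build and teardown.
# The URL, fields, and template name below are illustrative assumptions only.
import json
from urllib import request

LAB_API = "https://gdl.example.internal/api/v1/environments"  # hypothetical endpoint

payload = {
    "owner": "engineer@example.com",
    "template": "clustered-ontap-smoke-test",  # hypothetical environment template
    "virtual_machines": 64,                    # size of the requested test bed
    "ttl_hours": 8,                            # auto-deprovision after eight hours
}

req = request.Request(
    LAB_API,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(json.dumps(payload, indent=2))  # in a real lab, request.urlopen(req) would submit it
```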

The foundation comprises one of the largest FlexPod® installations in the world, combined with NetApp® clustered Data ONTAP®, Cisco® Unified Computing System servers, and Cisco Nexus® 9500 and 9300 switches. The scale of the foundation makes it possible to test and develop solutions across software-defined networks and application-centric infrastructure, which is important as IT becomes much more software defined.

From a performance perspective, the foundation and backbone of GDL-2 give NetApp the capability to simulate the most extreme test environments IT might face. GDL-2 can drive huge I/O loads through traffic-generation testing, allowing NetApp to quickly and easily test against 50,000–60,000 virtual machines in a system-agnostic environment. The lab can scale up to support environments comprising hundreds of thousands of virtual machines and thousands of NetApp controllers; coupled with the power of GDL-1, that number approaches one million virtual machines. This is possible because of a 144-point fiber connection with incredibly low latency, moving data from one facility to the other in under 300 microseconds, “or near the speed of light, effectively creating one lab,” adds Lawlis.
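
How far is 300 microseconds, really? A quick back-of-envelope check (our own assumptions, not NetApp’s published figures) shows why two nearby facilities can behave as one: light in optical fiber travels at roughly two-thirds of its speed in a vacuum.

```python
# Back-of-envelope: how far can a signal travel through optical fiber
# in 300 microseconds? All figures here are rough physical assumptions.
SPEED_OF_LIGHT_KM_S = 299_792  # speed of light in a vacuum, km/s
FIBER_FACTOR = 0.67            # light in fiber travels at roughly 2/3 of c
LATENCY_S = 300e-6             # the quoted one-way latency

max_distance_km = SPEED_OF_LIGHT_KM_S * FIBER_FACTOR * LATENCY_S
print(f"~{max_distance_km:.0f} km reachable in 300 microseconds over fiber")
# Roughly 60 km of reach, comfortably enough to link two campuses in the
# same metro area so that they behave as a single lab.
```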

To put some of this in more general terms, Lawlis says that “The combined might of the two facilities can handle the extreme, with enough storage to house about 4,000 years of HD television—more than a few lifetimes of binge-watching staycations—with a staggering 400 petabytes of storage.”
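
The television comparison holds up as a back-of-envelope figure: dividing 400 petabytes by 4,000 years implies an HD bitrate of roughly 25 megabits per second, which is in line with high-quality HD video. A minimal sketch of that arithmetic:

```python
# Sanity check of the "4,000 years of HD television in 400 petabytes" figure.
STORAGE_BYTES = 400e15                 # 400 petabytes (decimal petabytes)
YEARS = 4_000
SECONDS = YEARS * 365.25 * 24 * 3600   # seconds of continuous playback

implied_bitrate_mbps = STORAGE_BYTES * 8 / SECONDS / 1e6
print(f"implied HD bitrate: ~{implied_bitrate_mbps:.0f} Mb/s")
# ~25 Mb/s, consistent with high-quality HD video, so the comparison
# works as a rough order-of-magnitude illustration.
```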

Putting the Environment First

When designing GDL-2, extreme performance, scale, and flexibility were one side of the coin. The other side was enabling the new facility to live up to NetApp’s ongoing commitment to sustainability and to reducing data center power consumption through innovative design. After all, GDL-2 has to live up to GDL-1’s legacy of receiving the first ENERGY STAR rating for a data center in North America.

To continue this legacy, Lawlis told us about a number of unique design features that help curb energy consumption. A one-of-a-kind louvered-window system helps cool the racks with outside air, while the new chiller, HVAC, and cold-aisle containment systems cool equipment more quickly and efficiently than their predecessors. “In fact, the system is so powerful that it can inflate a Goodyear blimp in three seconds with the amount of air moving through it,” Lawlis quips.

The energy-efficient design gives GDL-2 a power usage effectiveness (PUE) ratio of 1.14, far better than the average data center’s PUE of 2.0 (with PUE, lower is better). In other words, Lawlis proclaims, “GDL-2’s energy-efficiency savings are equivalent to powering approximately 15,000 homes per year.”
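
For readers who want the math: PUE is total facility power divided by the power that reaches the IT equipment, so 1.14 means only 14% overhead for cooling, power distribution, and lighting, versus 100% overhead at a PUE of 2.0. The sketch below plugs in an assumed IT load and a rough figure for annual household electricity use; both are our own illustrative numbers, not published NetApp data.

```python
# Minimal sketch of the PUE savings math. PUE = total facility power / IT power,
# so the overhead avoided by running at 1.14 instead of 2.0 is the difference
# times the IT load. IT_LOAD_MW and HOME_KWH_PER_YEAR are assumptions.
PUE_GDL2 = 1.14
PUE_AVERAGE = 2.0
IT_LOAD_MW = 20.0            # assumed average IT load, not a published figure
HOME_KWH_PER_YEAR = 11_000   # rough annual electricity use of a typical US home
HOURS_PER_YEAR = 8_766

saved_mwh = (PUE_AVERAGE - PUE_GDL2) * IT_LOAD_MW * HOURS_PER_YEAR
homes = saved_mwh * 1_000 / HOME_KWH_PER_YEAR
print(f"~{saved_mwh:,.0f} MWh saved per year, roughly {homes:,.0f} homes' worth")
# With these assumptions the result lands in the same ballpark as the
# "15,000 homes" figure quoted above.
```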

Innovation Is a Journey

NetApp’s GDL-1 set an innovation benchmark in testing a decade and a half ago, a journey that GDL-2 is well on its way to continuing. “We built something that very few people can conceive of, and it will remain relevant even 10 to 15 years from now,” adds Lawlis. “We are effectively testing customer architectures of the future, which is key not only for our success but, more importantly, for our customers and what they want to achieve with our technology.”

While enjoying life’s simple pleasures or just texting friends, take a moment to consider how data and technology are shaping your fun.