When It Comes to NetApp, Big Data Has a ‘Big’ Impact

By Dave Ellis, NetApp Professional Services Consultant - HPC Architect, U.S. Public Sector, Technology Solutions Team

I recently attended the HPC User Forum in Dearborn, Mich., a two-day conference featuring presentations from government, education and industry representatives on specific HPC topic areas. While there, I was invited to participate in a Big Data and Analytics vendor panel titled “How Vendors See Big Data Impacting the Market and Their Products/Services.”

You’re probably no stranger to the fact that NetApp defines Big Data using the “ABCs” – Analytics, Bandwidth, Content. We have solutions and product innovations for all three of these workloads, and while we’re still determining which of them matters most to NetApp, I’m not sure we’re convinced that any one is more – or less – important than the others. We have a variety of solutions in the market now with Hadoop, Lustre, Video Surveillance and StorageGrid as the enabling components, and we continue working with partners to evolve these solutions’ capabilities for our customers.

But how do we set ourselves apart in the marketplace when it comes to pursuing these Big Data opportunities? NetApp is focused on delivering faster time to results through systems that are easy to manage, easy to use and easy to integrate. We give customers a single vendor capable of delivering the highest-performance, highest-density, best-in-class enterprise systems on the market today and in the future. In many cases, these are delivered by partners, ISVs, federal integrators and reseller channels as pre-configured, pre-tested environments that are easy to install and ready to use.

During the panel discussion, we were asked whether today’s Big Data hardware and software tools are sufficient to tackle the most important Big Data opportunities. The simple answer? No. From the storage subsystem perspective, NetApp is working with its partners to improve purpose-built file systems wherever concerns remain – for example, improving the performance and reliability of Hadoop environments in NetApp-provided solutions – and to address I/O infrastructure issues within HPC workloads.

The biggest question on the table, however, is where the future of Big Data lies. We’ve seen that the high-end commercial analytics market and the data-intensive HPC market are colliding, most notably as customers ask about deploying Lustre as the underlying file system for Hadoop analytics suites. Going forward, NetApp believes that every aspect of computing will become more data-intensive. The “keep everything, forever” approach will most likely continue, but tools may also be developed to remove the “noise” from these collections.
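
To make that Lustre-under-Hadoop idea a little more concrete, here is a minimal sketch – not a NetApp reference design or product configuration – of the general pattern: because Lustre presents a POSIX file system, a Hadoop application can point Hadoop’s standard file:// (local file system) connector at a shared Lustre mount instead of HDFS. The mount point /mnt/lustre and the class name below are hypothetical placeholders.

    // Hypothetical sketch: Hadoop's FileSystem API over a POSIX-mounted Lustre
    // file system, assumed to be mounted at /mnt/lustre on every node.
    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class LustreBackedHadoopSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Use the shared Lustre mount as the default file system instead of HDFS.
            conf.set("fs.defaultFS", "file:///mnt/lustre");

            FileSystem fs = FileSystem.get(URI.create("file:///mnt/lustre"), conf);

            // Write and read a small file through Hadoop's FileSystem abstraction;
            // analytics jobs would reference the same paths as their input/output.
            Path demo = new Path("/mnt/lustre/analytics/demo.txt");
            try (FSDataOutputStream out = fs.create(demo, true)) {
                out.writeUTF("hello from a Lustre-backed Hadoop job");
            }
            System.out.println("exists: " + fs.exists(demo));
        }
    }

In practice, MapReduce or other analytics frameworks would be configured the same way, letting the parallel file system carry the I/O that HDFS would otherwise handle.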

So what do you think? How have you seen Big Data impacting the market, and where do you see it five or even ten years from now? Let us know in the comments below, and take a closer look at NetApp’s Big Data solutions here.