By Brendan Wolfe, Sr. Marketing Manager, NetApp
Fall has been a whirlwind. I've just finished my busiest time of year: Insight Las Vegas in September, OpenStack Summit Barcelona in October, and finally Insight Berlin in November. I have spent the last three months traveling the world, talking to people about how they are automating their data centers, eliminating their IT ticketing systems, and adopting containers to deploy their applications.
Throughout these conversations, a clear trend has emerged: people are trying to figure out how much of their enterprise app ecosystem can be containerized. We have touched on this topic before, but we did not anticipate how quickly the industry would ramp up adoption in 2016 as organizations moved to cloud-based environments.
As people look to take advantage of containerized applications beyond short-term use cases, they need to figure out how to handle data persistence. Whether they are running traditional home-grown or monolithic applications migrating to cloud environments, databases looking to simplify deployment, or microservices-based applications, the data needs to be protected, well managed, and performant. We have built the NetApp portfolio to help you do exactly that.
We are emerging as the industry leader in container data management, and that's not just marketing jargon.
How do I learn more?
We have co-authored a whitepaper with Intel and Docker, titled How Intel and NetApp Bring Enterprise-Grade Storage to the Docker Ecosystem, that was published last week. This paper introduces the basics of building a container infrastructure architecture that hosts a variety of common enterprise applications, covers Docker fundamentals and Docker Datacenter, and explains NetApp's approach to container data persistence with the NetApp Docker Volume Plug-in (nDVP). I encourage you to read that whitepaper and brush up on the most disruptive technology to hit the datacenter since virtualization.
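To give a feel for what data persistence with a Docker volume plug-in looks like in practice, here is a minimal CLI sketch. It assumes the nDVP is already installed and configured with a driver registered under the name netapp; the volume name, size option, and MySQL image are illustrative examples only, and the exact driver name and supported options depend on your plug-in configuration and backend.

```shell
# Create a persistent volume backed by the storage driver
# (driver name "netapp" and the size option are illustrative):
docker volume create -d netapp --name mysql_data -o size=10G

# Attach the volume to a container; data written to /var/lib/mysql
# lives on the external storage, not in the container's own layer:
docker run -d --name mysql -v mysql_data:/var/lib/mysql mysql:5.7

# Remove the container; the volume and its data remain
# and can be attached to a replacement container later:
docker rm -f mysql
docker volume ls
```

The point of this pattern is that the container stays disposable while the data does not: the volume's lifecycle is decoupled from any single container.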