Immutable Containers – The New Era of Enterprise Computing


By Kelvin Lim, Team Lead, APAC Solution Architects, NetApp 


“The only thing that is constant is change.” – Heraclitus, Greek philosopher


Although Heraclitus was talking about life in general, he could very well have been talking about the world of IT, where technology is constantly changing, improving and evolving at breakneck speed. For instance, the advent of Cloud Computing and Software-Defined Services has changed the enterprise IT landscape and its operating model. Data center infrastructure that once took months to build can now be set up in the cloud within a few minutes. This is the world of infrastructure as code. This is the new reality of constant change.
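To make the idea concrete, here is a minimal infrastructure-as-code sketch in Python using the AWS boto3 SDK. The choice of AWS is purely illustrative, and the AMI ID and instance type below are placeholders, not recommendations – the point is simply that a server is now one API call away.

```python
# A minimal infrastructure-as-code sketch using the AWS boto3 SDK.
# Assumes AWS credentials are already configured in the environment.
import boto3

ec2 = boto3.client("ec2", region_name="ap-southeast-1")

# One API call provisions a server that once took months to procure.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="t3.micro",          # placeholder instance type
    MinCount=1,
    MaxCount=1,
)
print("Launched instance:", response["Instances"][0]["InstanceId"])
```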


The paradox of constant change in enterprise computing is that change can be a nightmare for systems administrators and programmers, who must manage version releases and keep track of the dependencies in a large application – all at the same time. Imagine changing the tires of a car… while you’re driving it. All too often, such changes result in human error and unplanned outages.


In my past articles[1], I wrote about the benefits of Containers and how they complement DevOps, microservices and Cloud Computing.


Today I would like to share another key attribute of Container infrastructure – Immutable Containers. In short, programmers can leverage containers to create an immutable application infrastructure module: once created, the module is never changed in place. When an upgrade is required, programmers simply build a new image containing the new code and deploy it as a new container. The old image can be kept in the container registry in case the application needs to be rolled back. For IT and businesses, Immutable Containers significantly reduce the complexity of rolling out new versions of applications.
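To illustrate, here is a minimal sketch of that workflow using the Docker SDK for Python (docker-py). The image name "myapp" and its version tags are hypothetical.

```python
# A sketch of the immutable-image workflow with the Docker SDK for Python.
# The image name "myapp" and the tags "1.0"/"2.0" are hypothetical.
import docker

client = docker.from_env()

# Upgrading means building a NEW image from the new code; the old image is
# never modified in place.
new_image, _ = client.images.build(path=".", tag="myapp:2.0")

# Deploy the new image as a fresh container.
client.containers.run("myapp:2.0", detach=True, name="myapp-v2")

# The previous image remains in the registry, so rolling back is simply
# running the old tag again:
# client.containers.run("myapp:1.0", detach=True, name="myapp-v1")
```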

In addition, with recent advances in container management tools such as Kubernetes, containers are now equipped with more enterprise capabilities in the areas of availability, scalability and infrastructure automation. One of the golden rules of system administration: automate whatever you can to minimize human error. With Containers, we are certainly moving in that direction.
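As a sketch of that automation, the snippet below drives a Kubernetes rolling update and rollback through kubectl. The deployment and container names ("myapp") and the image tag are hypothetical.

```python
# A sketch of an automated, immutable rollout on Kubernetes via kubectl.
# The deployment/container name "myapp" and the image tag are hypothetical.
import subprocess

def run(cmd):
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Point the deployment at the new immutable image; Kubernetes performs a
# rolling update, replacing old pods with pods running the new image.
run(["kubectl", "set", "image", "deployment/myapp", "myapp=myapp:2.0"])

# If the new version misbehaves, revert to the previous image in one step.
run(["kubectl", "rollout", "undo", "deployment/myapp"])
```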


Enhancing the Container environment with NetApp solutions


Existing NetApp customers will be glad to know that the NetApp Docker Volume Plug-in (nDVP) provides direct integration with the Docker[2] ecosystem for the NetApp ONTAP platform. With nDVP, IT and businesses can capitalize on the features of the NetApp storage portfolio and NetApp’s Data Fabric to fully embrace Containers for their applications.
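As a rough sketch, provisioning an ONTAP-backed volume through nDVP looks like the following. The driver name "netapp", the volume name "demo" and the "size" option reflect nDVP's documented defaults, but should be checked against your own plug-in configuration.

```python
# A sketch of creating an ONTAP-backed Docker volume via nDVP and mounting
# it into a container. The driver name and options are assumptions based on
# nDVP's documented defaults; verify against your plug-in configuration.
import subprocess

def run(cmd):
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Create a Docker volume backed by NetApp ONTAP storage.
run(["docker", "volume", "create", "-d", "netapp", "--name", "demo",
     "-o", "size=1g"])

# Mount the volume into a container; the data persists independently of
# the container's immutable image.
run(["docker", "run", "--rm", "-v", "demo:/data", "alpine", "ls", "/data"])
```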


NetApp’s vision of the Data Fabric is to make data available wherever and whenever the business needs it. Data is the most critical asset of any business, but it has mass, which makes it difficult to move at will. A Data Fabric abstracts this complexity and enables data to move securely to where it needs to be with little effort. With NetApp’s Data Fabric, organizations can increase efficiency, improve IT responsiveness, and ultimately accelerate innovation.


Notes: 

[1] http://community.netapp.com/t5/user/viewprofilepage/user-id/34531

[2] Docker is an open platform that automates the deployment of applications within Containers.  With Docker and Containers, developers and sysadmins can build, ship, and run distributed applications, whether on laptops, data center VMs, or the cloud.


Data Fabric is NetApp’s vision for the future of data management. It gives you the freedom to seamlessly manage your data across the hybrid cloud. You can move data where you need it most, innovate faster, and always make the best decisions for you and your organization.
