Since its release in 2021, the NetApp DataOps Toolkit has been quietly transforming the way data teams and organizations manage their enterprise storage. What started as a library to simplify provisioning JupyterLab workspaces and inference servers has quickly evolved into something much bigger. With every release, the toolkit has pushed its boundaries by making it effortless to clone massive datasets in seconds, snapshot entire environments, and provision storage at scale with just a few lines of Python.
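To give a feel for what "a few lines of Python" looks like, here is a minimal sketch using the toolkit's ONTAP-focused module. It assumes the `netapp-dataops-traditional` package is installed and that a connection config for your ONTAP cluster has already been created (for example with `netapp_dataops_cli.py config`); exact parameter names can vary by toolkit version.

```python
# Minimal sketch of the DataOps Toolkit's Python API for ONTAP.
# Assumes: pip install netapp-dataops-traditional, and an existing ONTAP config.
from netapp_dataops.traditional import (
    create_volume,
    create_snapshot,
    clone_volume,
    list_volumes,
)

# Provision a new 10 TB volume for a training dataset
create_volume(volume_name="train_dataset", volume_size="10TB")

# Capture a point-in-time snapshot of the dataset
create_snapshot(volume_name="train_dataset", snapshot_name="baseline")

# Clone the dataset near-instantaneously for an experiment
clone_volume(new_volume_name="train_dataset_exp1",
             source_volume_name="train_dataset")

# List all volumes visible to the toolkit
for vol in list_volumes():
    print(vol)
```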
The world of AI has been moving at breakneck speed. With the rise of AI Agents and agentic AI workflows, the way we build and orchestrate intelligent applications is changing. Enterprises are no longer content with just training a model or deploying an inference server. They want agents that can reason, act, and collaborate. Customers want orchestrated workflows that connect these agents into a decision-making system.
This is where NetApp is stepping in with something exciting: a custom MCP server built with the DataOps Toolkit that acts as the blueprint for orchestrating NetApp volumes. Think of it as your intelligent storage assistant, seamlessly integrated into an agentic workflow, ready to handle tasks through simple chatbot commands.
Let’s understand how this new approach works, why concepts like agents and agentic workflows matter, and how frameworks like LangGraph make it all possible.
Understanding the Shift: AI Agents vs Agentic AI Workflows
When the DataOps Toolkit was first introduced, its purpose was clear: simplify the lifecycle of managing data volumes, workspaces, and inference servers. And it did this brilliantly. But as AI systems matured, expectations grew. Could these operations be automated end to end? Could they be integrated into intelligent workflows? Could an entire ecosystem be built around them? The answer is yes, and that’s what the MCP server unlocks.
Before we dive into the orchestrator, let’s understand some key concepts:
- AI Agent: Think of it as an enhanced LLM that perceives inputs, reasons, and takes actions to achieve goals. It works as an individual worker.
- Agentic AI Workflow: Instead of a single AI Agent, it’s a structured orchestration of multiple agents (and sometimes humans) working together to solve a complex task. Think of it as an assembly line where each worker handles one stage of the pipeline. For example, one agent retrieves data, another analyzes it, and a third decides what action to take next.
Why do we need a framework?
These agentic workflows can get complex: they need orchestration, state management, and error handling. That’s where LangGraph comes in. LangGraph is a framework purpose-built for agentic workflows. It lets you design a graph (or flow) of interconnected agents that can:
- Retain memory across steps.
- Call external tools (like MCP Servers).
- Make branching decisions dynamically.
- Recover gracefully from errors.
In summary, LangGraph acts as a "workflow engine" specifically designed for AI-native orchestration.
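To make that concrete, here is a minimal LangGraph sketch: two nodes sharing state, a conditional edge for dynamic branching, and an in-memory checkpointer so the graph retains memory across steps. The node names and logic (`plan`, `act`) are illustrative placeholders, not NetApp's implementation.

```python
# Minimal LangGraph sketch: shared state, a conditional edge, and a checkpointer.
from typing import TypedDict
from langgraph.graph import StateGraph, END
from langgraph.checkpoint.memory import MemorySaver

class WorkflowState(TypedDict):
    request: str
    result: str

def plan(state: WorkflowState) -> WorkflowState:
    # Decide what to do with the incoming request (placeholder logic)
    return {"request": state["request"], "result": ""}

def act(state: WorkflowState) -> WorkflowState:
    # Call a tool (e.g. an MCP server) and record the outcome (placeholder logic)
    return {"request": state["request"], "result": f"handled: {state['request']}"}

def should_act(state: WorkflowState) -> str:
    # Branch dynamically: only run "act" if there is something to do
    return "act" if state["request"] else END

builder = StateGraph(WorkflowState)
builder.add_node("plan", plan)
builder.add_node("act", act)
builder.set_entry_point("plan")
builder.add_conditional_edges("plan", should_act)
builder.add_edge("act", END)

# The checkpointer gives the graph memory across invocations of the same thread
graph = builder.compile(checkpointer=MemorySaver())
print(graph.invoke({"request": "List my ONTAP volumes", "result": ""},
                   config={"configurable": {"thread_id": "demo"}}))
```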
MCP Server
Now comes the exciting part! NetApp has built a custom MCP (Model Context Protocol) server that leverages the DataOps Toolkit and integrates directly into LangGraph workflows. The MCP server provides the tools that enable the agent to orchestrate storage: it can receive high-level commands via a chatbot or workflow step and, using the DataOps Toolkit under the hood, perform tasks like provisioning volumes or creating snapshots.
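The sketch below shows the general pattern: an MCP server that wraps DataOps Toolkit functions as tools. It uses the FastMCP class from the official MCP Python SDK and is illustrative only; NetApp's actual server exposes its own set of tools and may be structured differently.

```python
# Illustrative sketch of an MCP server exposing DataOps Toolkit operations as tools.
# Assumes: pip install mcp netapp-dataops-traditional, plus an existing ONTAP config.
from mcp.server.fastmcp import FastMCP
from netapp_dataops.traditional import list_volumes, create_snapshot

mcp = FastMCP("netapp-dataops")

@mcp.tool()
def list_ontap_volumes() -> list:
    """List all ONTAP volumes managed by the DataOps Toolkit."""
    return list_volumes()

@mcp.tool()
def snapshot_volume(volume_name: str, snapshot_name: str) -> str:
    """Create a snapshot of the given ONTAP volume."""
    create_snapshot(volume_name=volume_name, snapshot_name=snapshot_name)
    return f"Snapshot '{snapshot_name}' created on volume '{volume_name}'."

if __name__ == "__main__":
    # Serve the tools over stdio so an agent framework can connect to them
    mcp.run()
```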
Building an Agentic System for Orchestrating NetApp Volumes
With this system, NetApp is bringing storage automation into the world of agentic AI workflows. At the heart of this innovation is a custom MCP server, tightly integrated with the NetApp DataOps Toolkit. Together, they demonstrate how storage operations can be exposed and automated through conversational AI.
What exactly does it do?
- Exposes storage functions as tools via MCP.
- Orchestrates these tasks through LangGraph-based workflows.
- Allows you to issue simple commands via chatbot, and the orchestrator handles the rest automatically.
Below is a simplified representation of the architecture: user → chatbot → LangGraph workflow → MCP server → DataOps Toolkit → ONTAP storage.

Steps in action:
- A user types into the chatbot: “List my ONTAP volumes.”
- The LangGraph chatbot takes the request and forwards it to the MCP server.
- The MCP server routes the incoming request to the relevant tool, which wraps the NetApp DataOps Toolkit.
- The chosen DataOps Toolkit function executes (in this case, listing ONTAP volumes).
- The results are returned and translated into a natural-language response for the user in the chatbot.
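A sketch of the chatbot side of this flow is shown below: a prebuilt LangGraph ReAct agent loads the MCP server's tools and answers a natural-language storage request. The server script name is hypothetical, the model choice is a placeholder, and the `langchain-mcp-adapters` API shown here varies between package versions, so treat this as a sketch rather than the definitive implementation.

```python
# Sketch of a LangGraph agent that connects to the MCP server and handles
# "List my ONTAP volumes." Assumes: pip install langgraph langchain-mcp-adapters langchain-openai.
import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

async def main():
    # Launch/connect to the MCP server over stdio (script name is hypothetical)
    client = MultiServerMCPClient({
        "netapp-dataops": {
            "command": "python",
            "args": ["netapp_dataops_mcp_server.py"],
            "transport": "stdio",
        }
    })
    tools = await client.get_tools()

    # Any LangChain-compatible chat model can drive the agent
    agent = create_react_agent(ChatOpenAI(model="gpt-4o"), tools)

    response = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "List my ONTAP volumes."}]}
    )
    print(response["messages"][-1].content)

asyncio.run(main())
```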
Why This Matters
- No need to run complex CLI commands; just chat.
- Workflows become seamless; storage is no longer a bottleneck.
- Storage operations integrate naturally into intelligent workflows, enabling automation at scale.
- Scalable and secure, backed by the NetApp DataOps Toolkit and ONTAP storage.
Conclusion
The era of static scripts and manual pipelines is ending. With agentic AI workflows, tasks can now be requested in plain language, orchestrated by agents, and executed automatically. Our example orchestrator, powered by the MCP server, NetApp DataOps Toolkit, and LangGraph, bridges AI-native decision-making with enterprise-grade storage automation.
This isn’t just about simplifying storage; it’s about simplifying decisions!
Up next:
We'll provide a detailed, step-by-step guide to deploying this in your own environment. Continue to Part 2!