Tech ONTAP Blogs

Step-by-Step Guide for Building an Agentic System for NetApp Volume Automation

Arpita_mahajan
NetApp

In the first part of this series, "AI Agents meet Storage: A Blueprint for NetApp Volume Automation," we explored the concept of agentic systems in enterprise storage and envisioned how AI Agents could interpret intent and translate it into intelligent automation. That post set the stage for the 'why' and 'what' behind agentic storage orchestration.

 

This blog focuses on the 'how': the bridge between vision and execution. Here, we'll walk through the practical steps to bring that blueprint to life, using LangGraph and the NetApp DataOps Toolkit (DOTK) as the foundation for AI-driven automation.

 

This guide will take you from concepts to hands-on implementation by covering: 

  1. Preparing the environment.
  2. Installing the NetApp DataOps Toolkit v2.6.
  3. Installing the LangGraph MCP client, which will act as an AI-powered chatbot.
  4. Bringing all the components together in a demonstration of intelligent storage automation.

 

Let's start!

 

Step 1: Prep your environment

 

  1. Python 3.8+ environment.
  2. LangGraph installed (use 'pip install langgraph').
  3. OpenAI-compatible API access.
  4. Kubernetes for containerized deployments.
  5. Trident installed and configured (version 20.07+).
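As a quick sanity check before moving on, the commands below (assuming a standard Linux shell; 'langgraph' is the lowercase PyPI package name) confirm the Python side of these prerequisites:

```shell
# Confirm the interpreter meets the 3.8+ requirement
python3 -c 'import sys; assert sys.version_info >= (3, 8), sys.version'

# Confirm pip is available for the installs in the next steps
python3 -m pip --version

echo "Python prerequisites look OK"
```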

 

Step 2: Installing the DataOps Toolkit (DOTK)

 

The NetApp DataOps Toolkit (DOTK) serves as the core engine powering ONTAP automation tasks. It offers a set of Python-based functions that let you create, clone, snapshot, and list storage volumes with ease. Since our setup runs in a Kubernetes-based environment, we'll install the Kubernetes flavor of the toolkit:

 

python3 -m pip install netapp-dataops-k8s

 

Once installed, the toolkit requires a valid kubeconfig file on the client, located at $HOME/.kube/config or at another path specified by the KUBECONFIG environment variable. Refer to the Kubernetes documentation for more information about kubeconfig files.
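The lookup order described above can be mirrored in a few lines of plain Python; this helper is purely illustrative and not part of DOTK:

```python
import os
from pathlib import Path

def resolve_kubeconfig() -> Path:
    """Return the kubeconfig path a client would use: the KUBECONFIG
    environment variable if set, otherwise the default location."""
    env_path = os.environ.get("KUBECONFIG")
    return Path(env_path) if env_path else Path.home() / ".kube" / "config"

print(resolve_kubeconfig())
```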

 

Step 3: Creating the MCP Client (Agent) using LangChain and LangGraph

 

Now that the foundational component is in place, it’s time to bring everything together into a conversational and intelligent interface: the AI Agent.

 

Using LangGraph and LangChain, we’ll design an agent capable of interpreting natural language inputs and executing corresponding actions on ONTAP through the MCP server. This integration transforms standard automation into intent-driven orchestration. To create your custom agent that can interpret natural language and call the tools, refer to the "Build an Agent" guide for detailed instructions.

 

Custom Agent: Sample Snippet

Here's a sample implementation of a custom AI Agent built using LangChain and LangGraph. 

 

from langchain_openai import ChatOpenAI
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain.agents import AgentExecutor
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain.agents.output_parsers.openai_tools import OpenAIToolsAgentOutputParser
from langchain.agents.format_scratchpad.openai_tools import format_to_openai_tool_messages
import asyncio
import nest_asyncio

nest_asyncio.apply()

def get_llm():
    """Initialize your LLM with your preferred provider and configuration"""
    return ChatOpenAI(
        model_name="gpt-5",  # or your preferred model
        openai_api_base="your_api_endpoint_here",
        openai_api_key="your_api_key_here",
        # Add any additional configuration
    )

async def get_mcp_tools():
    """Connect to multiple MCP servers and retrieve available tools"""
    client = MultiServerMCPClient({
        "primary_server": {
            "transport": "streamable_http",
            "url": "http://localhost:8000/mcp/"
        },
        "secondary_server": {
            "transport": "streamable_http", 
            "url": "http://localhost:8001/mcp/"
        }
    })
    return await client.get_tools()

async def create_mcp_agent(user_input):
    """Create and execute an MCP-powered agent"""
    # Get LLM and tools
    llm = get_llm()
    tools = await get_mcp_tools()  # Tools from multiple MCP servers
    llm_with_tools = llm.bind_tools(tools)

    # Define the agent prompt
    prompt = ChatPromptTemplate.from_messages([
        ("system", 
         "You are an intelligent assistant that can use various tools to help users. "
         "Always provide clear, structured responses with success/failure indicators. "
         "Format your responses with appropriate headings, bullet points, and tables "
         "when presenting results from tool calls."
        ),
        ("user", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ])

    # Create the agent chain
    agent = (
        {
            "input": lambda x: x["input"],
            "agent_scratchpad": lambda x: format_to_openai_tool_messages(x["intermediate_steps"]),
        }
        | prompt
        | llm_with_tools
        | OpenAIToolsAgentOutputParser()
    )

    # Execute the agent
    agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
    result = await agent_executor.ainvoke({"input": user_input})
    return result["output"]

# Example usage
async def main():
    response = await create_mcp_agent("List all available resources")
    print(response)

if __name__ == "__main__":
    asyncio.run(main())

 

Demo

 

To bring this workflow to life, we’ve included a short demonstration showcasing the AI Agent in action. 

 

 

This demonstration encapsulates the essence of intelligent automation, where a simple human request is seamlessly translated by the AI Agent into MCP-driven orchestration, executed through DOTK, and completed with a precise ONTAP response.
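To make that flow concrete, here is a deliberately toy, self-contained sketch of the request-to-response path. The tool name and the DOTK stub are illustrative stand-ins, not the real netapp_dataops API:

```python
def dotk_create_volume(pvc_name: str, size: str) -> str:
    """Stand-in for a DOTK volume-creation call (illustrative only)."""
    return f"PVC '{pvc_name}' ({size}) created successfully"

# The MCP layer would expose functions like this as callable tools.
TOOLS = {"create_volume": dotk_create_volume}

def handle_request(request: str) -> str:
    """In the real system the LLM selects the tool and its arguments;
    a simple keyword match stands in for that decision here."""
    if "create" in request.lower():
        return TOOLS["create_volume"]("project-data", "10Gi")
    return "No matching tool for this request"

print(handle_request("Please create a 10Gi volume for the project"))
```

In the actual agent, the LLM's tool-calling step replaces the keyword match, and the stub's body is replaced by a real DOTK invocation executed via the MCP server.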

 

Closing Thoughts

 

With the MCP Server as a bridge, DOTK as the foundation, and LangGraph as the workflow client, we’ve taken a decisive step toward agentic storage automation. This demonstration lays the groundwork for an AI-assisted infrastructure where NetApp's storage automation aligns seamlessly with agentic intelligence to create an ecosystem that is both adaptive and autonomous.

 

In the previous blog, we envisioned AI Agents capable of managing storage operations through natural language. Today, we have brought that vision to life, transforming automation into intelligent orchestration with your very own Storage Buddy!

 
