Tech ONTAP Blogs
To gain a deeper understanding of the challenges our solution addresses, consider reading our previous blog post. It explores the limitations of manual storage provisioning and provides valuable background for the AI-driven agent architecture presented below.
In the era of AI-driven transformation, the ability to manage large-scale data infrastructures with natural language is a game-changer. Imagine a world where a simple chat command can provision storage volumes, create snapshots, or clone environments. In this blog, we’ll walk through building such an agent using the NVIDIA NeMo Agent Toolkit, connecting it to the NetApp DataOps Toolkit via an MCP server, and ultimately managing volumes on an ONTAP storage system with simple, natural language.
At the heart of this solution is a tightly integrated, multi-layered architecture: a chat UI for natural-language input, the NeMo agent as the intelligence layer, an MCP server that exposes the NetApp DataOps Toolkit as tools, and the ONTAP storage system where the data operations ultimately run.
This modular approach ensures that the agent logic, the communication protocol, and the data operations are all handled by specialized components, creating a robust, secure, and scalable solution.
To get started, you’ll need access to an MCP server that exposes the functions of the NetApp DataOps Toolkit. The Model Context Protocol (MCP) enables AI agents to securely interact with external tools and data sources by abstracting the underlying API.
In this implementation, the MCP server hosts the NetApp DataOps Toolkit, defines functions that map to its capabilities, and exposes these functions as tools accessible to any compliant MCP client such as the NeMo agent.
The good news is that you don’t need to build this yourself: we’ve already set up and released a ready-to-use MCP server, and this example demonstrates how to leverage it.
Note: For this custom implementation, we used the NetApp DataOps Toolkit for traditional environments and host the MCP server with stdio as the transport mode, enabling seamless communication between the agent and the server.
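For reference, a server like this follows a simple pattern: it wraps DataOps Toolkit calls in ordinary functions and registers them as MCP tools. The sketch below is a minimal illustration using the official MCP Python SDK (FastMCP) and the DataOps Toolkit for traditional environments; the tool names match those used later in the workflow configuration, but the parameter names and return values are illustrative rather than a description of the released server.
# Minimal sketch of an MCP server wrapping the NetApp DataOps Toolkit.
# Assumes the official MCP Python SDK (pip install mcp) and the DataOps Toolkit
# for traditional environments; parameter names are illustrative.
from mcp.server.fastmcp import FastMCP
from netapp_dataops import traditional as dataops  # NetApp DataOps Toolkit

mcp = FastMCP("netapp-dataops")

@mcp.tool()
def CreateVolume(volume_name: str, volume_size: str) -> str:
    """Create a new ONTAP volume of the given size (e.g. '100GB')."""
    # Delegate to the DataOps Toolkit; connection details come from its config file.
    dataops.create_volume(volume_name=volume_name, volume_size=volume_size)
    return f"Volume '{volume_name}' ({volume_size}) created."

@mcp.tool()
def ListVolumes() -> list:
    """List all volumes visible to the DataOps Toolkit."""
    return dataops.list_volumes()

if __name__ == "__main__":
    # stdio transport: the NeMo agent launches this process and talks over stdin/stdout.
    mcp.run(transport="stdio")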
The next step is to build the custom NeMo agent, which will serve as the intelligence layer of our system. The process begins by cloning the NeMo Agent Toolkit repository, which provides a robust foundation and examples to build upon.
Once you have the toolkit, you can begin developing your custom agent. This involves defining the specific functions that will interact with your MCP server. For an in-depth guide on creating a new workflow, including custom agent creation, refer to the NVIDIA NeMo Agent Toolkit documentation.
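Before looking at the agent code, it helps to know that the function is driven by a small configuration class. The sketch below shows one plausible shape for it, following the toolkit's convention of deriving from FunctionBaseConfig and registering a workflow name; the import path and field defaults are assumptions to adapt to your toolkit version.
# Hypothetical configuration class for the custom agent function.
# Import path follows the NeMo Agent Toolkit layout and may differ by version.
from aiq.data_models.function import FunctionBaseConfig


class MyAgentFunctionConfig(FunctionBaseConfig, name="my_agent"):
    """Configuration for the NetApp storage agent workflow."""

    # Names of the MCP tools the agent may call (see tool_names in workflow.yaml).
    tool_names: list[str] = []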
Here’s a sample snippet from our custom AI agent implementation. In this example, we use an LLM proxy to connect to an external language model, such as GPT-5.
# Note: register_function, Builder, LLMFrameworkEnum, FunctionInfo, and
# MyAgentFunctionConfig are provided by the NeMo Agent Toolkit and your custom
# config module; import them according to your toolkit version.
@register_function(config_type=MyAgentFunctionConfig, framework_wrappers=[LLMFrameworkEnum.LANGCHAIN])
async def my_agent_function(
    config: MyAgentFunctionConfig, builder: Builder
):
    """
    NetApp Storage Agent function that uses LangChain to interact with NetApp storage tools.
    """
    # Import necessary LangChain components inside the function
    from langchain.agents import AgentExecutor
    from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
    from langchain.agents.output_parsers.openai_tools import OpenAIToolsAgentOutputParser
    from langchain.agents.format_scratchpad.openai_tools import format_to_openai_tool_messages
    from langchain_openai import ChatOpenAI
    from langchain_core.messages import HumanMessage, SystemMessage

    # Get tools from builder using config
    tools = builder.get_tools(tool_names=config.tool_names, wrapper_type=LLMFrameworkEnum.LANGCHAIN)

    # Initialize LLM (ChatOpenAI) with custom SSL settings
    import httpx
    sync_http_client = httpx.Client(verify=False)
    async_http_client = httpx.AsyncClient(verify=False)
    llm = ChatOpenAI(
        model_name="gpt-5",
        openai_api_base="YOUR_API_BASE_URL",  # <-- Replace with your own base URL
        openai_api_key="YOUR_API_KEY",        # <-- Replace with your own key!
        model_kwargs={'user': 'your_username'},
        max_retries=3,
        http_client=sync_http_client,
        http_async_client=async_http_client
    )

    # Bind tools to the LLM
    llm_with_tools = llm.bind_tools(tools)

    # Define the agent prompt
    prompt = ChatPromptTemplate.from_messages([
        ("system", "You are an advanced NetApp storage management assistant."),
        ("user", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ])

    # Create the agent chain
    agent = (
        {
            "input": lambda x: x["input"],
            "agent_scratchpad": lambda x: format_to_openai_tool_messages(x["intermediate_steps"]),
        }
        | prompt
        | llm_with_tools
        | OpenAIToolsAgentOutputParser()
    )

    # Create the AgentExecutor
    agent_executor = AgentExecutor(
        agent=agent,
        tools=tools,
        verbose=True,
        max_iterations=3,
        handle_parsing_errors=True,
    )

    # Define the main function that will be called
    async def _response_fn(input_message: str) -> str:
        """
        Process user input and generate response using NetApp storage tools.
        """
        import asyncio
        try:
            result = await asyncio.wait_for(
                agent_executor.ainvoke({"input": input_message}),
                timeout=90.0
            )
            if isinstance(result, dict) and "output" in result:
                return result["output"]
            else:
                return str(result)
        except asyncio.TimeoutError:
            return "Request timed out. Please try again."
        except Exception as e:
            return f"Error: {str(e)}"

    # Yield the function for integration
    yield FunctionInfo.create(single_fn=_response_fn)
The “workflow.yaml” file is the core configuration for the NeMo agent, telling it how to behave, which tools to use, and how to connect to external services.
# SPDX-FileCopyrightText: Copyright (c) 2024-2025, NVIDIA CORPORATION & AFFILIATES. All rights reserved.
# SPDX-License-Identifier: Apache-2.0

# Workflow configuration to connect to NetApp DataOps MCP Server
# This will automatically discover and use all tools from MCP server
general:
  use_uvloop: true

functions:
  # Load all tools and use the names and descriptions from the server
  netapp_tools:
    _type: mcp_client
    server:
      transport: stdio

workflow:
  _type: my_agent
  # List of all the tools that are needed for the workflow from the MCP server
  tool_names:
    - "ListVolumes"
    - "CreateVolume"
    - "CloneVolume"
    - "MountVolume"
    - "CreateSnapshot"
    - "ListSnapshots"
    - "CreateSnapMirrorRelationship"
    - "ListSnapMirrorRelationships"
  # Description: You are a helpful assistant that helps users manage NetApp storage resources using the NetApp DataOps platform.
The YAML file does the following: it enables uvloop for faster event-loop handling, registers the netapp_tools MCP client so the agent can discover the tools exposed by the DataOps MCP server over stdio, and wires the my_agent workflow to the list of tool names it is allowed to call.
With the custom agent and configuration in place, you can now launch the agent and connect to the UI.
NeMo Agent Toolkit UI (Chatbot)
Updating NeMo Agent Toolkit Settings
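Once the agent is up and the UI is pointed at it, you can also exercise the workflow programmatically. The snippet below is a minimal smoke test that posts a natural-language request to a locally served agent; the URL, the /generate route, and the input_message payload field are assumptions about a default local deployment, so adjust them to match how your workflow is actually served.
# Minimal smoke test against a locally served agent.
# The URL, route, and payload field below are assumptions; adjust them to
# match how your NeMo Agent Toolkit workflow is actually served.
import requests

AGENT_URL = "http://localhost:8000/generate"  # hypothetical local endpoint

payload = {"input_message": "Create a 100GB volume named ai_training_data"}
response = requests.post(AGENT_URL, json=payload, timeout=120)
response.raise_for_status()

# The response body should contain the agent's natural-language reply.
print(response.json())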
By combining the power of NVIDIA NeMo Agent Toolkit, NetApp DataOps Toolkit (via MCP), and ONTAP storage, you can revolutionize data management with conversational AI. This architecture not only simplifies complex operations but also empowers users to interact with storage infrastructure in a natural, intuitive way.
Whether you’re a storage admin, data scientist, or developer, this solution can be tailored to your needs. The modular design allows for easy extension—add new intents, integrate more APIs, or enhance the UI for a richer experience.