This post walks you through running an MCP (Model Context Protocol) server in LM Studio using powerful functions from the NetApp DataOps Toolkit. This setup lets you invoke custom storage actions directly from your LLM-powered workspace on your laptop or workstation, enabling data management for AI and ML workflows.
For more information on MCP servers with the NetApp DataOps Toolkit, refer to this post.
Prerequisites
To run an MCP server in LM Studio for the NetApp DataOps Toolkit, you need the following:
- Download and install LM Studio here. Ensure you have enough disk space and memory to run a model locally.
- Install uv.
- Create a DataOps Toolkit config file using the command shown here.
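Before moving on, it can help to confirm the prerequisites are in place. The short Python sketch below checks that the uv launchers are on your PATH and that a DataOps Toolkit config file exists; the config path it uses (~/.netapp_dataops/config.json) is an assumed default location, so adjust it if you created the file elsewhere.

```python
import shutil
from pathlib import Path

def check_prereqs():
    """Return a dict mapping each prerequisite to True/False.

    The DataOps Toolkit config path below is an assumption (a commonly
    used default location); adjust it if your config file lives elsewhere.
    """
    # 'uv' and 'uvx' are both installed by the uv installer
    status = {tool: shutil.which(tool) is not None for tool in ("uv", "uvx")}
    status["dataops_config"] = (Path.home() / ".netapp_dataops" / "config.json").exists()
    return status

if __name__ == "__main__":
    for name, ok in check_prereqs().items():
        print(f"{name}: {'OK' if ok else 'missing'}")
```

Anything reported as missing should be installed or created before you configure LM Studio.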
Steps
Before you jump into the implementation, you need to add or modify the mcp.json file in LM Studio; you can find it under 'Integrations'. Below is an example mcp.json config file.
{
  "mcpServers": {
    "netapp_dataops_ontap_mcp": {
      "command": "uvx",
      "args": [
        "--from",
        "netapp-dataops-traditional",
        "netapp_dataops_ontap_mcp.py"
      ]
    },
    "netapp_dataops_k8s_mcp": {
      "command": "uvx",
      "args": [
        "--from",
        "netapp-dataops-k8s",
        "netapp_dataops_k8s_mcp.py"
      ]
    }
  }
}
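A quick way to sanity-check the file before restarting LM Studio is to parse it and print the command line each server entry resolves to. The sketch below embeds the example config from this post as a string; for your own setup, read the real mcp.json from disk instead.

```python
import json
import shlex

# The example mcp.json from this post, embedded as a string for the check
MCP_JSON = """
{
  "mcpServers": {
    "netapp_dataops_ontap_mcp": {
      "command": "uvx",
      "args": ["--from", "netapp-dataops-traditional", "netapp_dataops_ontap_mcp.py"]
    },
    "netapp_dataops_k8s_mcp": {
      "command": "uvx",
      "args": ["--from", "netapp-dataops-k8s", "netapp_dataops_k8s_mcp.py"]
    }
  }
}
"""

def server_commands(raw):
    """Parse an mcp.json string and return {server_name: full command line}."""
    config = json.loads(raw)  # raises ValueError on malformed JSON
    return {
        name: shlex.join([spec["command"], *spec["args"]])
        for name, spec in config["mcpServers"].items()
    }

for name, cmd in server_commands(MCP_JSON).items():
    print(f"{name}: {cmd}")
```

If the JSON is malformed or a server entry is missing its "command" or "args" key, this script fails loudly, which is easier to debug than a silently broken integration inside LM Studio.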
The following video is a walkthrough of running an MCP server in LM Studio using NetApp DataOps Toolkit functions.
Using this minimal setup, you can implement an MCP server with the NetApp DataOps Toolkit and run it locally on your machine.
Refer to the NetApp DataOps Toolkit GitHub repository for more information.