Discrepancy between available IOPS and consumed IOPS (OCUM)

Hello all,


I have a customer who is using OCUM to monitor their two ONTAP 9.2 A700 HA clusters. In OCUM they see that the reported available IOPS are consumed at a much higher rate than one would expect based on their consumed IOPS. That is to say, a small increase in consumed IOPS leads to a disproportionate depletion of available IOPS. See the attached image for a clearer picture.




If you take a look at what happens right after the Thursday label on the x-axis, you'll see that their available IOPS dropped from around 150k to 75k, while the actual delta in consumed IOPS is at most 10k.
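
To make the discrepancy concrete, here is a quick back-of-the-envelope check using the approximate values read off the chart (the numbers below are eyeballed from the graph, not exact counter values):

```python
# Approximate values read off the OCUM chart around the Thursday mark
# (eyeballed from the graph, not exact counters).
available_before = 150_000   # available IOPS before the drop
available_after = 75_000     # available IOPS after the drop
consumed_delta_max = 10_000  # largest plausible increase in consumed IOPS

available_drop = available_before - available_after  # 75,000 IOPS lost

# If available IOPS tracked consumed IOPS one-for-one, the drop in
# available IOPS should roughly equal the increase in consumed IOPS.
ratio = available_drop / consumed_delta_max
print(f"Available IOPS dropped by {available_drop:,}, "
      f"at least {ratio:.1f}x the increase in consumed IOPS")
```

So even with the most generous reading of the graph, the depletion of available IOPS is at least 7.5x the actual increase in load, which is the part I cannot explain.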


The customer is asking why this is, and I cannot come up with an answer myself. I am hoping someone in the community knows why this could be.