OneCollect, when installed on a Mac, gives a bunch of "unidentified developer" errors. I tried to allow them one at a time, but it seems to be never-ending. How can I overcome this?
Customer told me this; has anyone seen it? Get-NcEfficiency should output "Used", "Returns", and "Effective Used", where Used = used data capacity, Returns = storage efficiency savings, and Effective Used = Used + Returns. Instead, the output for Returns is what should be the output for Effective Used. The actual Effective Used output does add Used + Returns, which means we can get the data we want with PowerShell, but using the wrong fields. This is most apparent with vol_xxx_Transfer, which consumes 1 GB and has no savings from storage efficiency, yet reports 1012.7 MB Returns and 2.0 GB Effective Used.

This was their problem: the customer is trying to track the amount of data they are loading into NetApp storage and then continue to accurately report on that data size as data is added or removed. The numbers they are trying to report on should ignore any deduplication or compression by NetApp. They're planning to use the PowerShell DataONTAP module to automate reporting.

They are seeing these numbers in the web portal (screenshot attached):
- Before [hypothetical]: Data Space Used: 29.59 TB
- After: Data Space Used: 14.93 TB

And they see these numbers from the PowerShell DataONTAP module (screenshot attached):
- Used: 15.5 TB
- Returns: 29.6 TB

They're not sure which number(s) they should be using. Some of the numbers appear to contradict each other, so they're just trying to get clarity around what the numbers mean and how they can report accurately on the current size of their customer's data.
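For what it's worth, a minimal PowerShell sketch of the workaround implied above: since Effective Used appears to genuinely equal Used + savings even when the Returns field is wrong, the real savings can be derived as EffectiveUsed minus Used. The property names (Name, Used, Returns, EffectiveUsed) and the controller address are assumptions from memory; verify against Get-Help Get-NcEfficiency before relying on this.

```powershell
# Hypothetical sketch -- assumes Get-NcEfficiency returns objects with
# Used, Returns, and EffectiveUsed properties as described in the post.
Import-Module DataONTAP
Connect-NcController cluster1.example.com   # hypothetical cluster address

Get-NcEfficiency | ForEach-Object {
    [PSCustomObject]@{
        Volume        = $_.Name
        Used          = $_.Used           # raw used data capacity
        # Work around the apparent field swap: derive the real savings
        # arithmetically instead of trusting the Returns field directly.
        ActualReturns = $_.EffectiveUsed - $_.Used
    }
}
```

This reproduces the vol_xxx_Transfer example: 2.0 GB Effective Used minus 1 GB Used yields roughly 1 GB of "savings", consistent with the suspicion that the fields are crossed.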
My goal is to produce a PowerShell script that quickly looks at an HA pair and provides a historical report of storage growth over time, by storage type (SAS, SATA, SSD). This helps greatly in refreshes.
I have looked at, and tested, some of the REST APIs, but it seems I may have to go into each AutoSupport and parse the aggr section to get the size. I will then have to get the storage or aggr section and parse that as well to determine what type of disks are in it, with a big table of obscure model numbers to determine the disk type.
Is there an easier way?
I have installed Eclipse/BIRT in LOD, but I don't see any historical tables, just current info. How do you see the historical capacity tables for vols and aggrs?
I need to group many volumes together as an application and, due to the complexity, I need to do this with Access. This is because I need to group them and produce a historical report on each group. I ran the dfm export command and have a list of the views. When I put them into Access (after an extensive PowerShell scripting exercise to massage the data to fit), I found the view that seems to house the historical capacity data (DataSpacemetricView). I am trying to somehow link that back to the VolumeView, but I don't see how the connection is made. Can anyone help me with this? Or is there a better way to get my report?
While I don't see this in the IMT, are there any plans to test Fusion-io or SSD drives with Edge? Would it work? Would anything prevent it from working?
I am an SE working with a large enterprise customer. I am working with our PS guy to develop a script using FPolicy. We have an issue where SnapLock autocommit cannot finish because the customer has hundreds of millions of files in a volume. The autocommit scanner cannot complete, and the application does not turn on the read-only attribute. The only real solution we see is an FPolicy script to go back and lock files on the file-close event. After talking with Product Management, they will not have this feature anytime soon. As this customer is a very large healthcare provider, they have medical records that are not locked, which is bad for NetApp.
How can we get information on how to write this FPolicy server? NetApp Professional Services will develop and maintain this script. We had originally looked at RRE, but the script will not be needed long enough to justify the investment.
We are trying to develop a short-term workaround for rapid cloning a VM in XenServer/PVS. We have found scripts that do this, but they use a file copy to create the new D: drive. We would like to replace this with the same rapid cloning logic that is in VSC. Is there an example PowerShell script of this, or something like it? I need to quickly thin-clone a file within an NFS file system.
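In case it helps frame the question, here is a minimal sketch of what I'm after using the DataONTAP PowerShell Toolkit. The cmdlet name (Start-NaClone), its parameters, and all paths below are assumptions from memory, not a confirmed example; verify with Get-Command *Clone* -Module DataONTAP before using.

```powershell
# Hypothetical sketch: sub-file cloning on a 7-Mode controller so the new
# VM disk is a thin clone inside the NFS volume rather than a full copy.
# Cmdlet name and parameters are assumptions -- check the toolkit help.
Import-Module DataONTAP
Connect-NaController filer1.example.com   # hypothetical controller address

# Clone the golden PVS disk image in place instead of copying it.
Start-NaClone -SourcePath      /vol/xen_nfs/gold/disk0.vhd `
              -DestinationPath /vol/xen_nfs/vm01/disk0.vhd
```

Essentially, I want whatever VSC calls under the covers to do this, exposed as a scriptable step.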
Recently, I saw an email that described the 8.1 simulator availability. Where can I get this? I notice it is not available on the standard simulator download page.