When trying to save files through the SMcli on Windows, you have to put a backslash (\) in front of the beginning and ending double quote marks ( " ) around the path. You also had an extra space between the closing quote mark and the semicolon.
So I get a unique file name indicating the time the command started. Windows Task Scheduler runs daily, weekly, or monthly, so I'll likely duplicate the script contents four times within the script to get 24 hours of collections. I'm testing a single run with 3600 iterations from Task Scheduler tonight; then I'll make the necessary tweaks to the iteration count so I can get a 24-hour collection of data from one script.
My testing is complete. I've set an hourly task in Task Scheduler to collect array performance. An iteration count of 3600 failed to produce a file with data. Using 600 produces just under an hour's worth of data, so I'll use that for now and increase the iterations as necessary to get a complete hour.
###### begin bat script ######
For /f "tokens=2-4 delims=/ " %%a in ('date /t') do (set mydate=%%c-%%a-%%b)
For /f "tokens=1-2 delims=/:" %%a in ("%TIME%") do (set mytime=%%a%%b)
set mytime=%mytime: =0%
set Logname="c:\logs\e-series_%mydate%_%mytime%.csv"
rem Echo. >>%Logname% 2>>&1
rem Echo.=================== >>%Logname% 2>>&1
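The script presumably continues with the SMcli call itself. A sketch of what that line might look like, tying together the escaped quotes and iteration count discussed above (the controller IP, interval value, and literal output path here are placeholders, and the exact session parameter names should be checked against your SANtricity CLI version):

```bat
rem Hypothetical SMcli invocation -- 192.168.1.10, the interval, and the
rem literal path are placeholders (the script would substitute %Logname%).
rem Note the \" escaping around the file path and no space before the ;
SMcli 192.168.1.10 -c "set session performanceMonitorInterval=6 performanceMonitorIterations=600;save storageArray performanceStats file=\"c:\logs\perf.csv\";"
```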
Do collections from each controller. Data about LUNs not owned by a controller is all zeros. Also, each query reports controller data only for the controller being queried, and the array-level stats are the same as that controller's, so you need both controllers' data (ideally collected at the same time) and must merge it afterwards with some pretty simple scripts.
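Those merge scripts could be as simple as the sketch below: for each LUN, keep the row from whichever controller's CSV actually has data. This assumes each row is a LUN name followed by numeric stat columns, which is a guess at the layout, not the actual performanceStats format; you'd load each file with `csv.reader` first.

```python
def merge_controller_stats(rows_a, rows_b):
    """Merge per-LUN rows from two controllers' collections.

    Each row is (lun_name, stat1, stat2, ...). A controller reports
    all zeros for LUNs it doesn't own, so for each LUN we keep the
    row whose stats are not all zero.
    """
    merged = {}
    for name, *stats in list(rows_a) + list(rows_b):
        stats = [float(v) for v in stats]
        # Take this row if the LUN is new, or if it has real data
        # and the row we already have is all zeros.
        if name not in merged or (any(stats) and not any(merged[name])):
            merged[name] = stats
    return merged
```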
I'll eventually load this data into a DB.
Also, a web services proxy is available which provides REST API-like functionality.
I'd definitely recommend using the REST API for this functionality in the future. We take care of a lot of the heavy lifting of collecting the data and performing analysis on it. Things like collecting data from both controllers are handled on the server, so the user doesn't have to do it.
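A minimal sketch of what consuming that REST API might look like: build the statistics URL for a storage system and sum per-volume IOPS from the JSON the proxy returns. The hostname, endpoint path, and `combinedIOps` field name are assumptions from memory, so verify them against your Web Services Proxy's API documentation before relying on them.

```python
import json

# Hypothetical proxy host/port; substitute your own deployment.
BASE = "https://proxy.example.com:8443/devmgr/v2"

def volume_stats_url(system_id):
    # Endpoint name is an assumption; check your proxy's API docs.
    return f"{BASE}/storage-systems/{system_id}/analysed-volume-statistics"

def total_iops(stats_json):
    """Sum per-volume IOPS from a (hypothetical) stats payload:
    a JSON list of objects each carrying a 'combinedIOps' field."""
    return sum(v.get("combinedIOps", 0.0) for v in json.loads(stats_json))
```

Because the proxy already merges both controllers' views, there is no per-controller zero-filtering step like the one needed with raw SMcli output.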