Microsoft Virtualization Discussions

Connecting and Issuing Commands to Multiple Storage Controllers Simultaneously

elee

Hi All,

I have a script that connects to all my filers serially to gather information (volume space, aggregate space, snapshots, etc.).

The script takes a very long time to execute, and I want to reduce the execution time by running it in parallel, with one thread per controller.

My problem is that the code in each thread executes until it tries to run the code that connects to a storage controller. From that point forward the thread stays in a perpetual Running state.

Q: Is it possible to have multiple threads, each connected to a different storage controller, running my script? If so, what am I doing wrong?

TIA

Eugene

Here's what I have created: two different script blocks, each with code to connect to a different controller.

Pseudo code below....

-----------------------------------

$thread1 = { connect to controller1 ; get all volnames ; dump volnames to log file1 }

$thread2 = { connect to controller2 ; get all volnames ; dump volnames to log file2 }

Start-Job -Scriptblock $thread1

Start-Job -Scriptblock $thread2

Wait for all Jobs to complete......

-----------------------------------------

connect to controller code.....

$filer = "storagecontrollername"
$password = ConvertTo-SecureString "password" -AsPlainText -Force   # plain-text password; for testing only
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "root",$password
Connect-NaController $filer -Credential $cred
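For what it's worth, a job that appears stuck in the Running state can be inspected while it runs; a minimal debugging sketch using the standard job cmdlets, assuming $thread1 holds one of the script blocks above:

$job = Start-Job -ScriptBlock $thread1
Start-Sleep -Seconds 30
Get-Job -Id $job.Id | Format-List State, HasMoreData   # overall job state
Receive-Job -Job $job -Keep                            # read buffered output without consuming it
$job.ChildJobs[0].JobStateInfo                         # state of the underlying pipeline
$job.ChildJobs[0].Error                                # errors raised inside the job, if any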

1 ACCEPTED SOLUTION

elee

Hi All,

I created the launch.ps1 script below to launch filer-specific scripts (each script gathers data from only one controller), so that I can collect data from all my storage controllers at the same time and write to an individual log file for each controller.

My issue right now is that the scripts launch but then stall (no output to my log files) when calling the Connect-NaController cmdlet. From that point on nothing else executes.

What can I do to debug this issue?  Is thread safety an issue here?

TIA

Eugene

# --------------------------------------------------------------------------

# PS script to launch all my scripts so they run in parallel

"Starting"

Get-Job | Remove-Job

$job1 = Start-Job -Filepath d:\scripts\filer1.ps1

$job2 = Start-Job -Filepath d:\scripts\filer2.ps1

$job3 = Start-Job -Filepath d:\scripts\filer3.ps1

$exit = $false

while ($exit -eq $false)

{

$a = @(Get-Job -State Running).count

"$a jobs are currently running."

if ($a -eq $true){$exit = $true}

sleep -Seconds 60

}

"Finished"

}
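As an aside, the polling loop can be replaced with Wait-Job, which simply blocks until the listed jobs finish; a minimal sketch:

$jobs = $job1, $job2, $job3
Wait-Job -Job $jobs | Out-Null
Receive-Job -Job $jobs   # surfaces any output or errors from the child scripts
"Finished"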

# ---------------------------------------------------------------------

# An example of the self contained scripts that are launched

Import-Module DataONTAP

function Connect2Filer
{ #code }

function GetDriveCountInformation
{ #code }

# MAIN
$filer = "ztop.somewhere.com"
$cs = Connect2Filer $filer
$dci = GetDriveCountInformation
$dci | Out-File -FilePath c:\temp\ztop.log -Append
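For illustration only, the elided function bodies might look something like the following; the credential handling mirrors the connect snippet from the first post, and using Get-NaDisk as the drive-count source is an assumption:

function Connect2Filer ($Filer)
{
    # assumption: same plain-text credential pattern as the first post (testing only)
    $password = ConvertTo-SecureString "password" -AsPlainText -Force
    $cred = New-Object System.Management.Automation.PSCredential ("root", $password)
    Connect-NaController $Filer -Credential $cred
}

function GetDriveCountInformation
{
    # assumption: report the disk count on the currently connected controller
    $disks = Get-NaDisk
    "{0}: {1} disks" -f (Get-Date), @($disks).Count
}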

OK guys, I could not get the above code to work (XP and W2K3) and did not have time to figure out why. As a workaround I replicated my script for each of my storage controllers and used a batch file to kick them all off so that each runs in its own PowerShell process. Not pretty, but it works.

kickoff.bat

---------------------------------------------------------------------------

start "controller 1"  powershell .\myscript.controller1.ps1

start "controller 2"  powershell .\myscript.controller2.ps1

start "controller 3"  powershell .\myscript.controller3.ps1

start "controller 4"  powershell .\myscript.controller4.ps1

---------------------------------------------------------------------------
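For reference, the same kickoff can be done from a single PowerShell script without copying myscript per controller; a sketch that assumes myscript.ps1 is reworked to accept the controller name as a -Filer parameter (it does not in the original):

# launch one detached PowerShell per controller
$controllers = "controller1", "controller2", "controller3", "controller4"
foreach ($c in $controllers)
{
    Start-Process powershell -ArgumentList ".\myscript.ps1 -Filer $c"
}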


4 REPLIES

joseconde

I haven't tried it, but what if you dumped (Out-File) each script block's code into a separate .ps1 file and then ran Start-Job on each .ps1 file instead?
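A quick sketch of that idea, reusing the $thread1/$thread2 script blocks from the original post (paths are placeholders; note the DataONTAP import would still need to happen inside each script):

# persist each script block's body to a .ps1, then run it as a job
$thread1.ToString() | Out-File -FilePath d:\scripts\thread1.ps1
$thread2.ToString() | Out-File -FilePath d:\scripts\thread2.ps1
Start-Job -FilePath d:\scripts\thread1.ps1
Start-Job -FilePath d:\scripts\thread2.ps1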

beam

Hi Eugene,

I can't tell if you are doing this from the pseudo code, but you need to import the DataONTAP module in each of the script blocks.  I tried this and got the expected results:

PS C:\Users\Administrator> $thread1 = { ipmo Dataontap; connect-nacontroller 10.61.169.28; get-navol | out-file C:\log1.txt }
PS C:\Users\Administrator> $thread2 = { ipmo Dataontap; connect-nacontroller 10.61.169.29; get-navol | out-file C:\log2.txt }
PS C:\Users\Administrator> Start-Job -ScriptBlock $thread1 ; Start-Job -ScriptBlock $thread2

The file log1.txt contained the volume listing for the first controller and log2.txt contained the volume listing for the second controller.
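If more controllers get added, the duplicated script blocks can also be folded into a single parameterized block; a sketch along the same lines, using the addresses and log paths from Steven's example:

# one script block, parameterized by controller address and log file
$work = {
    param($addr, $log)
    ipmo DataONTAP
    Connect-NaController $addr
    Get-NaVol | Out-File $log
}
Start-Job -ScriptBlock $work -ArgumentList "10.61.169.28", "C:\log1.txt"
Start-Job -ScriptBlock $work -ArgumentList "10.61.169.29", "C:\log2.txt"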

Hope that helps!

-Steven

paleon

You can simplify the code for connecting to the NetApp controllers by using the Add-NaCredential cmdlet.


If you run this script once:

$filer = "storagecontrollername"
$password = ConvertTo-SecureString "password" -AsPlainText -Force
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "root",$password
Add-NaCredential -Controller "storagecontrollername" -Credential $cred

You can then run this script successfully (from the same Windows host while logged in as the same Windows user)

Connect-NaController -Name "storagecontrollername"
