Active IQ Unified Manager Discussions

Requesting some clarification - Custom data source from CSV

Gag_Halfront
8,281 Views

I am trying to integrate another vendor's storage product into our existing WFA implementation.  The other vendor does not have any kind of existing automation, so this is our best path forward.  We have some commands created for WFA that will do basic provisioning tasks, dictionary entries, and so on.  We have a Perl script that can be run from the WFA server.  The script reaches out to the management server for the storage device, gathers data, and writes it into a number of CSV files - one for each table in the schema.  Where I'm having a problem is getting WFA to execute the script and then import the CSV data as part of a data source acquisition.

From the command line, I can do C:\path\to\perl.exe myscript.  This runs myscript and creates the .csv files in the current directory.  I've looked and haven't been able to find any good documentation on how to turn this .csv-generating Perl script into a working data source.  Where do I put the script?  Where does it need to create the .csv files?  Do I just put "C:\path\to\perl.exe C:\path\to\myscript" in the script box of the Data Source Type?  I've tried this and it doesn't work.  Either WFA isn't running the script, or the script is putting the files in an unanticipated place.

Any leads or links to docs would be great.


12 REPLIES

RLEHRHART
8,217 Views

I don't have WFA in front of me any longer, but as I recall, you can add a PowerShell script to a data source.

Regards,

   - Rick -

abhit
8,217 Views

Hi David:

This video might help you.

https://communities.netapp.com/videos/3351#comment-17935

Regards

Abhi

sinhaa
8,281 Views

David,

First, I would strongly recommend watching this short video on building your first script-based custom data source using a file. It's very informative: https://communities.netapp.com/videos/3351

@how to turn this .csv-generating Perl script into a working data source.  Where do I put the script? Do I just put "C:\path\to\perl.exe C:\path\to\myscript" in the script box of the Data Source Type?  I've tried this and it doesn't work.

------

The CSV-generating Perl script can be placed anywhere on the WFA server. Let's assume you have it at C:\fakepath\test.pl.

To execute this Perl script from the WFA data source, write the following line in the script block:

Start-Process Perl C:\fakepath\test.pl -NoNewWindow -Wait

If Perl is not in your PATH variable, you can give the full path to perl.exe instead. The -Wait switch matters: without it, Start-Process returns immediately and the rest of the script block can run before the Perl script has finished writing its files.
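If you prefer to be explicit, here is a minimal sketch using the full path and named parameters (the perl.exe location below is an assumption; adjust it to your installation):

Start-Process -FilePath 'C:\Perl64\bin\perl.exe' -ArgumentList 'C:\fakepath\test.pl' -NoNewWindow -Wait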

@Where does it need to create the .csv files?

------

The CSV files can be created anywhere on your WFA server, but they ultimately need to end up in the data source's working directory, which is <install_location>\WFA\jboss\standalone\tmp\wfa. So let your CSV-generating Perl script create the files anywhere you want, say C:\my_csv, and then copy them all to the right place with the following command at the end:

Copy-Item C:\my_csv\* ./ -Force

This is also suggested in the video.
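If you want to confirm that your script block is really executing in the data source's working directory, a quick sanity check you can drop into it (plain PowerShell, nothing WFA-specific) is:

=====

# Print the working directory; it should end in \WFA\jboss\standalone\tmp\wfa
Get-Location

# List the CSV files that the acquisition will import from here
Get-ChildItem -Filter *.csv

=====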

So the script block of the data source should contain the code below.

=====

# Execute the Perl script at C:\fakepath\test.pl to generate CSV files in the C:\my_csv folder.
# -Wait ensures the script has finished writing before we copy anything.
Start-Process Perl C:\fakepath\test.pl -NoNewWindow -Wait

# Now copy all the CSV files into the PWD.
Copy-Item C:\my_csv\* ./ -Force

=====
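If you want the block to be a bit more defensive, a variant I'd suggest (my own sketch, not from the video) fails loudly when the Perl script produces nothing, so a failed acquisition points at the real cause:

=====

# Run the Perl script and wait for it to finish.
Start-Process Perl C:\fakepath\test.pl -NoNewWindow -Wait

# Stop with a clear error if no CSV files were produced.
if (-not (Test-Path C:\my_csv\*.csv)) { throw "Perl script produced no CSV files in C:\my_csv" }

# Copy the CSV files into the data source's working directory.
Copy-Item C:\my_csv\*.csv ./ -Force

=====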

sinhaa




Gag_Halfront
8,217 Views

Thank you for the detailed response.  Part of my problem was that I couldn't figure out the environment the script was running in.  When Perl opened a file for output, I had no idea where it was putting it, so I didn't know where to go get it with Copy-Item.  When running from the command line, it just dumps the files in the current directory.  I had been assuming that the files were being created somewhere and I just didn't know where.  I modified the script to put the files in a known location (C:\tmp).  When I re-ran the acquisition, it still didn't work.  The files were not created, so I don't think the Perl script is even being called.  The WFA acquisition fails with a message of "Table hnas.viviol does not exist".
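In hindsight, even a single line like this in the data source's script block would have told me where the script was actually executing (assuming C:\tmp exists and is writable):

(Get-Location).Path | Out-File C:\tmp\wfa_datasource_pwd.txt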

So I guess I'm still a little confused about how this works.  It seems like a catch-22: I have to run the acquisition to populate the database, but the acquisition won't run because a table doesn't exist.

Gag_Halfront
8,217 Views

OK... in a different data source that I created for the sole purpose of testing this Perl script, I ran it two ways.  When I had the data source script use Start-Process Perl "C:\path\to\my stupid\perl\script", it failed, and when I checked the C:\tmp directory, there were no files.  I got the same result when I used the full path to Perl.  When I changed the script to use $output = C:\path\to\perl "C:\path\to\my stupid\script", I _did_ get files in the C:\tmp directory.
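For the record, the form that worked is essentially PowerShell's call-operator pattern; something like this, with placeholder paths:

$output = & 'C:\path\to\perl.exe' 'C:\path\to\my stupid\script.pl'

The & runs the executable synchronously in the current directory and captures its standard output in $output, which is probably why the files land where I expect.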

So we'll stick with $output = for now.  At least it's doing something.  What I can't fathom is the catch-22 on the new schema, though.  Even with the $output approach, it never even tries to run the script.  Instead it fails with a message that the table doesn't exist.  The whole point of running the data source acquisition is to create the table.  That would be like the power button on my TV not working because the TV wasn't turned on.

There has to be something I'm missing here.  I've watched the video a bunch of times, and the presenter never pre-creates the tables.  He doesn't have to prime the pump.

Do the files have to already exist the first time you run the acquisition?  Is it even possible to create the first set of tables from scratch using the script in the data source type definition?

sinhaa
8,217 Views

@When I had the data source script use Start-Process Perl "C:\path\to\my stupid\perl\script", it failed, and when I checked the C:\tmp directory, there were no files.  I got the same result when I used the full path to Perl.

-------

But I didn't put any double quotes around the path to the Perl script file, did I? Quoted and unquoted arguments do not mean the same thing to this cmdlet; it isn't failing, it's doing something you didn't intend. If you added the quotes because, to complicate things, you have placed the Perl script in a folder with a <space> in its name, quotes are not the way.

For that, do this, and exactly this:

Start-Process perl test.pl -WorkingDirectory 'C:\my path'
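Alternatively, if you don't want to change the working directory, passing the quoted path through -ArgumentList should also work (a sketch; the folder name is just an example):

Start-Process perl -ArgumentList '"C:\my path\test.pl"' -NoNewWindow -Wait

The inner double quotes keep the path together as a single argument to perl.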

@ Even with the $output thing, it never even tries to run the script.

------

Which script didn't it try to run? The Perl script? The PowerShell script in the code block? Kindly elaborate.

@Instead it fails with a message about table doesn't exist.  The whole point of trying to run the data source acquisition is to create the table.

-------

There are multiple points where it can fail, and I can't pinpoint the cause because you haven't provided enough information. Kindly post a screenshot of the failure message. Also download the AutoSupport data file and send it to me. You won't want to post it here, so send it by mail to sinhaa at netapp dot com.

My intent is to see what you are trying to do, because I haven't been able to get a clear picture.


ktim
7,090 Views

In my experience, the tables are not created until you perform a "reset scheme" on that particular schema; that is what creates them.

Acquisition will populate the tables, but it will not create them.

Regards,

Tim

sinhaa
7,090 Views

For a newly added dictionary item and data source type, no scheme reset should be required. I think there was a bug that required you to reset the scheme after a restore operation.


Gag_Halfront
7,090 Views

Actually, the reset did get it past the original problem.  Thanks, Tim!

After I did the reset, it got to the point where it was complaining about not being able to find the .csv files.  That's where sinhaa's advice helped.  It took both of you to get this thing working.

I appreciate all the help.

adaikkap
8,218 Views

Hi David,

Would you mind telling us which other vendor's storage you are trying to automate with WFA?

Regards

adai

Gag_Halfront
8,217 Views

Adai,

I am working on integrating Hitachi HNAS into my environment.  A Hitachi developer helped create the components for the integration.  Now we're just trying to figure out why they work in his lab but not in mine.  We have a script that contacts the HNAS management server and gathers information about connected HNAS clusters, similar to what DFM collects for NetApp servers.  We then have a parser that takes the output of that script and converts it to the CSV files required for importing into WFA.  They also prepared the dictionary entries and a few basic commands using a gateway tool that talks to the HNAS.  The rest will be up to me.

adaikkap
8,217 Views

Thanks, David, for the information.

Regards

adai
