2012-01-25 10:08 AM
I am working on creating a couple flows, one that will create an environment and then a second that will tear it down. I want to be able to use the tear down flow in the case that a failure occurs in the creation script, basically so it does not leave a partially built environment in place. Has anyone setup a way to call for an update of DFM and then WFA to allow for a smooth cleanup? Thanks!
2012-01-25 01:13 PM
In the 1.0.2 release this is not possible: DFM needs to refresh its data, and then the WFA data source needs to be acquired to pass that information to WFA.
1.1 introduces the notion of Reservations, which should go a long way toward addressing the DFM caching issue.
2012-01-26 10:54 AM
I also have a couple of comments.
Hope this helps,
2012-01-26 11:22 AM
I'd like to see DFM offer well-defined services to update its database directly, so that a command could create the volume and then supply DFM with the information about what it created, saving the need to poll/refresh.
2012-01-26 12:13 PM
This would really be a good thing to have.
It would simplify work a great deal: if there's an important task to run and you want to be sure you have the right data, you could do it with something simple.
It would be nice to have something like this:
1) a command to ask DFM to update its data (and wait for it to finish)
2) a command to refresh the cached data in WFA (and wait for it to finish)
It would be nice if the second one could poll any data source, taking the data-source name as a parameter. That would ensure that important jobs get the latest data.
2012-01-26 01:15 PM
I hope this helps,
2012-01-26 04:32 PM
Hi Kevin, the first command is clear (and probably useful mostly for people developing workflows)... I do not understand exactly what the Find OnCommand Info command does.
Regarding the Refresh WFA Cache: I know that with 1.1 every object changed from within WFA will work and will be "known" to WFA, but there will always be external objects that are needed yet not created by WFA. Snapshots are the first thing that comes to mind: they can be created by backup software and may be needed in a workflow that clones a set of volumes to create a new test instance or a DR copy. The same is true for vCenter objects, which can be created outside WFA but could be needed by some workflows.
In that case, having a Refresh WFA Cache that can run on a specific data source would help ensure that certain workflows which must not fail (Disaster Recovery workflows) can actually run.
This could also be really useful for giving normal operators a way to be sure the workflow will run. Operators don't have access (at least in 1.0.2) to data sources to refresh them manually.
Take the example of a clone from a backup set done via SnapCreator (or any other backup software), with a DBA user who has access to WFA to run the workflow. He would have to wait at least 30 minutes (the DFM refresh interval for snapshots) plus the interval at which WFA acquires the data (and he would have to know this) before running the workflow that does the cloning. Putting the refresh step into the same workflow will not work, because the discovery will start but the data will not be refreshed in WFA until the next scheduled refresh.
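The wait described above can be made concrete with a back-of-the-envelope sum. The 30-minute snapshot refresh interval comes from the post; the WFA acquisition interval here is a hypothetical example value, not a product default.

```python
# Worst-case staleness before the DBA can safely run the cloning workflow:
# the snapshot must first appear in DFM, then be acquired by WFA.
DFM_SNAPSHOT_REFRESH_MIN = 30  # DFM refresh interval for snapshots (from the post)
WFA_ACQUISITION_MIN = 15       # hypothetical WFA data-source acquisition interval

worst_case_wait_min = DFM_SNAPSHOT_REFRESH_MIN + WFA_ACQUISITION_MIN
print(worst_case_wait_min)  # 45 under these assumed intervals
```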
A method to get data into WFA (not only DFM data) would probably be a "nice to have" addition for a future version (if nothing better makes it in; I see that many nice things are in the next version).