Data Infrastructure Management Software Discussions


Dynamically updating DFM and WFA?

I am working on creating a couple of flows: one that will create an environment, and a second that will tear it down.  I want to be able to use the teardown flow in case a failure occurs in the creation script, basically so it does not leave a partially built environment in place.  Has anyone set up a way to trigger an update of DFM and then WFA to allow for a smooth cleanup?  Thanks!


Re: Dynamically updating DFM and WFA?

In the 1.0.2 release this is not possible; the DFM data needs to refresh, and then the WFA datasource needs to update to pass that information to WFA.

1.1 introduces the notion of Reservations, which should go a long way toward addressing the DFM caching issue.


Re: Dynamically updating DFM and WFA?

I also have a couple of comments.

  • Better update of information in WFA?  Rich has the right answer that this is not possible in WFA 1.0.2; however, WFA 1.1 (due out in a few weeks) will cache all WFA actions (performed or scheduled) in addition to the information pulled from DFM.

  • Refreshing the DFM data.  Or, more precisely, forcing a refresh of the DFM data.  This would be another WFA command.  I've actually put together a couple of commands that help facilitate this.  I will be cleaning them up and polishing them... and will post them as WFA Command Examples ASAP.  One thing to bear in mind is that there are really two steps here.
    • refresh the OnCommand information for a particular controller
    • 'poll' to confirm that OnCommand can now see the new storage object
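As a rough illustration of those two steps, here is a minimal Python sketch. The `dfm host discover` and `dfm volume list` invocations are real DFM CLI commands (assumed to be on the path of the host running this); the function names, timeout values, and the injectable `run`/`check` parameters (there so the polling logic can be exercised without a live DFM server) are my own and not from the commands Kevin describes.

```python
import subprocess
import time

def refresh_oncommand_info(hostname, run=subprocess.run):
    """Step 1: ask DFM to rediscover one controller via 'dfm host discover'."""
    run(["dfm", "host", "discover", hostname], check=True)

def object_visible(object_name, run=subprocess.run):
    """True if DFM already lists the storage object (a volume, in this sketch)."""
    result = run(["dfm", "volume", "list"], capture_output=True, text=True)
    return object_name in result.stdout

def wait_for_object(object_name, timeout=600, interval=30,
                    check=object_visible):
    """Step 2: poll until the object shows up in DFM, or give up at timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if check(object_name):
            return True
        time.sleep(interval)
    return False
```

A workflow command would call `refresh_oncommand_info` right after creating the storage object, then `wait_for_object` before any later command that depends on DFM seeing it.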

Hope this helps,



Re: Dynamically updating DFM and WFA?

I'd like to see DFM offer well-defined services to update its database directly, so that a command could create the volume and then supply DFM with the information about what it created, saving the need to poll/refresh.


Re: Dynamically updating DFM and WFA?

This would really be a good thing to have 🙂

It would simplify things considerably: for an important task where you want to be sure you have the right data, you could get it with something simple.

It would be nice to have something like this:

1) a command to ask DFM to update its data (and wait for it to finish)

2) a command to refresh the cached data in WFA (and wait for it to finish)

The second would be even nicer if it could refresh any datasource, taking the datasource name as a parameter. That would assure that important jobs get the latest data.
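A sketch of how those two commands might chain together, in Python. Step 1 uses the real `dfm host discover` CLI; step 2 has no public interface in WFA 1.0.2, so `refresh_wfa_datasource` below is purely hypothetical, a placeholder for the capability requested above, and both steps are injectable so the ordering can be tested without live servers.

```python
import subprocess

def refresh_dfm(hostname, run=subprocess.run):
    """Step 1: ask DFM to rediscover the controller (real 'dfm' CLI)."""
    run(["dfm", "host", "discover", hostname], check=True)

def refresh_wfa_datasource(ds_name):
    """Step 2 (hypothetical): trigger acquisition of one WFA datasource
    by name.  No such command exists in WFA 1.0.2; this stub stands in
    for whatever mechanism a future release might expose."""
    raise NotImplementedError(f"no public API to refresh datasource {ds_name!r}")

def refresh_all(hostname, ds_name, dfm_step=refresh_dfm,
                wfa_step=refresh_wfa_datasource):
    """Run both refreshes in order: DFM first, then the WFA datasource."""
    dfm_step(hostname)
    wfa_step(ds_name)
```

The ordering matters: refreshing the WFA datasource before DFM has rediscovered the controller would just re-acquire stale data.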


Re: Dynamically updating DFM and WFA?


To clarify:

  1. I have a couple of command examples that I hope to post soon.
    • Refresh OnCommand Info - this initiates the refresh of OnCommand information for a particular controller.  The command will initiate a 'dfm host discover' for a storage system and then complete.
    • Find OnCommand Info - This is a separate command.  I did this as a separate command because I have created several storage objects on the controller(s), but I don't need to issue a refresh for every one.  I do want to ensure the newly created storage objects are visible, however.
    • Expectations: I expect people will be able to use the commands I post as is, though some small modifications may be required depending on the use case.
  2. Refresh WFA Cache.  We don't have this command, and with WFA 1.1 we will not need it.  I agree that in some scenarios it would be nice to have now, but WFA 1.1 is just around the corner.

I hope this helps,



Re: Dynamically updating DFM and WFA?

Hi Kevin, the first command is clear (and probably useful mostly for people developing workflows)... I do not understand exactly what Find OnCommand Info does.

Regarding the Refresh WFA Cache: I know that with 1.1 every object changed by WFA will be "known" to WFA, but there will always be external objects that are needed yet were not created by WFA. Snapshots are the first thing that come to mind: they can be created by backup software and then be needed by a workflow that clones a set of volumes to create a new test instance or DR copy. The same will be true for vCenter objects that can be created outside WFA but could be needed by some workflow.

In that case, having a Refresh WFA Cache that can run on a specific datasource would help ensure that workflows which must not fail (Disaster Recovery workflows, for example) can run.

This would also be really useful for giving normal operators a way to be sure a workflow will run. Operators don't have (at least in 1.0.2) access to the datasources to refresh them manually.

Take the example of a clone from a backup set done via SnapCreator (or any other backup software), with a DBA user having access to WFA to run the workflow. The DBA would have to wait at least 30 minutes (the DFM refresh time for snapshots) plus the interval at which WFA acquires the data (and would have to know this) before running the workflow that does the cloning. Putting the refresh into the same workflow will not work, because the discovery will start but the data will not reach WFA until the next datasource acquisition.
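The worst-case wait in that example is easy to quantify: it is simply the sum of the two polling intervals. A trivial sketch, using the 30-minute DFM snapshot refresh time from the post and an assumed (per-datasource configurable) WFA acquisition interval:

```python
def worst_case_staleness_minutes(dfm_interval=30, wfa_interval=15):
    """A snapshot created just after a DFM monitoring pass can wait a
    full DFM interval before DFM sees it, then a full WFA acquisition
    interval on top of that before WFA's cache picks it up from DFM."""
    return dfm_interval + wfa_interval
```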

A method to get the data into WFA (not only DFM data) would probably be a "nice to have" addition for a future version (if nothing better makes it in; I see that many nice things are going into the next version).

