I'm developing a migration workflow. I'd like to use a loop to iterate through some objects (VMs) and migrate them. However, in order to make the best decision about the destination, WFA needs up-to-date data. So, I'm considering calling the REST API to update some data sources after each migration. However, this will only work if the finder gets executed on each loop iteration, rather than once before all loop iterations.
Q: Will Filters and Finders use updated WFA cache information during a WFA workflow execution?
A: No. Filters and Finders are used during the workflow planning and pre-execution phases.
I have a couple comments on what you're attempting to do for your migration workflow:
With the WFA datasource acquisition schedule, and WFA Reservations (which keep track of WFA actions until the datasource is updated), have you run into issues where the environment information is not sufficiently up to date?
You may need a two-workflow approach: one that initiates the re-acquisition of information, and another that processes your migration activities... with an appropriate pause of a couple of minutes in between to allow the acquisition to complete.
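The "trigger re-acquisition, then pause" step could be sketched as below. Note this is a minimal illustration only: the endpoint path, base URL, and field names are assumptions, not the documented WFA REST API, so check them against your WFA version's REST documentation. The pause is written as an injectable polling helper rather than a fixed sleep, so it can be tuned (or tested) without a live WFA server.

```python
import time

# Assumed base URL for the WFA REST API -- replace with your server's.
WFA_BASE = "https://wfa-host/rest"

def acquisition_url(base, datasource_id):
    """Build the (assumed) URL that would trigger re-acquisition
    of a single data source. The path is hypothetical."""
    return f"{base}/data_sources/{datasource_id}/acquire"

def wait_for(poll, timeout=300, interval=10, sleep=time.sleep):
    """Poll `poll()` until it returns True or `timeout` seconds elapse.

    In the two-workflow setup, `poll` would issue a GET against the
    acquisition status endpoint; it is injectable here so the pause
    logic stands alone. Returns True if `poll` succeeded in time.
    """
    waited = 0
    while waited < timeout:
        if poll():
            return True
        sleep(interval)
        waited += interval
    return False
```

The first workflow would POST to `acquisition_url(...)` (e.g. with an HTTP client of your choice), and the second would call `wait_for` with a status check before starting the migration steps, replacing the fixed couple-of-minutes pause with "wait until the acquisition actually finished".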
1. Yes, I have run into issues where the environment is not sufficiently up to date. For example, where an OnCommand alarm triggers a workflow. This is a separate topic, of course. In this case, we have a finder which selects a datastore based on performance and over-provisioning characteristics, using a weighted algorithm. It would be useful to execute the finder (and thus the decision) during each loop iteration, in order to make the best decision along the way. As it stands, the workflow selects the desired VMs to move and then attempts to dump them all into the first, best datastore. So the alternative will be to select one VM per workflow execution and move it individually.
2. Yes, a workflow to update all of my datasources is long overdue! Great idea.