I suggest you combine two filters like the following in a finder and then use it: "Filter aggregates by disk type" and "Filter aggregates by delegation to Storage Virtual Machine". Regards adai
Hi Dave, Any workflow run has two phases, namely planning and execution. Irrespective of Execute Immediately or Later, the planning always happens immediately; it's only the execution of the planned activity that can be run either immediately or later. So what you are seeing is the expected behavior. Please wait a little longer: once WFA 3.0 is released, you will have the ability to include reservation in custom commands as well. This will solve your problem. Regards adai
Hi Nigel, The upcoming release of OCUM is supported on non-VMware environments as well, such as a standalone RHEL installer.

Now before I answer, it's good to understand the advantages of having datasources in WFA. The power of WFA comes from its decision-making, or resource-selection, logic that determines which resource should be selected in order to create/delete/migrate a volume/SVM etc. To do that, WFA needs data about the complete NetApp datacenter so it can make intelligent selections based on attributes like inventory, capacity and performance data. You can use these attributes either in isolation or in combination before creating a volume or SVM. For example, if you want to create an SVM that will serve CIFS data, then the cluster on which you create it should be licensed for CIFS and the service should be up and running. This kind of selection can be done using finders/filters in WFA, which in turn use the datasource information cached in the WFA MySQL database.

Similarly, if you want to create a volume on an SVM, you need to choose an aggregate. The WFA datasource cache can help you not just in selecting an aggregate that has enough space for the volume to be created, but also in choosing one that has sufficient days, or even years, before it is full, so that you need not keep moving the volume off that aggregate. It can also help you choose based not just on the size, type and growth characteristics of the aggregate, but also on performance.

If you don't want any of this, then you can make all the resource-selection logic in WFA into user inputs and still run the workflows. Regards adai
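The aggregate-selection logic described above can be sketched in a few lines. This is only an illustration of what a WFA finder/filter does against the cached datasource; the `Aggregate` fields and the days-to-full heuristic are assumptions for the example, not WFA's actual schema.

```python
# Hedged sketch of WFA-style resource selection: pick an aggregate that has
# room for the new volume AND enough growth headroom before it fills up.
from dataclasses import dataclass

@dataclass
class Aggregate:
    name: str
    free_gb: float
    daily_growth_gb: float  # observed growth rate (illustrative)

def days_to_full(aggr: Aggregate) -> float:
    """Rough headroom estimate: free space divided by daily growth."""
    if aggr.daily_growth_gb <= 0:
        return float("inf")
    return aggr.free_gb / aggr.daily_growth_gb

def select_aggregate(aggrs, volume_size_gb, min_days_to_full=365):
    """Return the candidate with the most headroom, or None if nothing qualifies."""
    candidates = [
        a for a in aggrs
        if a.free_gb >= volume_size_gb and days_to_full(a) >= min_days_to_full
    ]
    return max(candidates, key=days_to_full, default=None)
```

In a real WFA setup this filtering is expressed as SQL filters combined in a finder; the sketch just shows the shape of the decision.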
Hi Dave, Not that I know of. But let me check with the engineering team to see if there are any hacks. Regards adai Message was edited by: Adaikkappan Arumugam
Hi Norman, What you are really trying to do is resource selection based on the location of the user who invokes the workflow. For this you will somehow need to do a mapping from location to user to resource. In such situations we generally recommend using the playground database for a persistent mapping. If the mapping is mostly static, then you can use the playground DB to store the mapping from location to cluster name, and then do one of the following:
- Use a function to return the cluster name given the location, then use that to load the "Cluster by key" in a no-op command. After that, the rest of the workflow can use the cluster variable, which will be properly filled, i.e. cluster.name.
- Make the first user input the location ($Location); once it is chosen, the next locked, query-based user input ($ClusterName) is driven by $Location. (This is query based, but will always have a single item in the drop-down.)
Solution courtesy of Shailaja. Regards adai
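The static location-to-cluster mapping above can be sketched as follows. WFA's playground database is MySQL; sqlite3 is used here only so the example is self-contained, and the table and column names are assumptions for illustration.

```python
# Hedged sketch of a playground-DB mapping table from location to cluster,
# and the lookup that a query-based $ClusterName input (or a function feeding
# "Cluster by key") would perform.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the WFA playground MySQL DB
conn.execute(
    "CREATE TABLE location_cluster (location TEXT PRIMARY KEY, cluster_name TEXT)"
)
conn.executemany(
    "INSERT INTO location_cluster VALUES (?, ?)",
    [("london", "cluster-uk-01"), ("raleigh", "cluster-us-02")],
)

def cluster_for_location(location):
    """Return the cluster mapped to a location, or None if unmapped."""
    row = conn.execute(
        "SELECT cluster_name FROM location_cluster WHERE location = ?",
        (location,),
    ).fetchone()
    return row[0] if row else None
```

Because the mapping is keyed on location, a query-based user input built on this table will always resolve to a single cluster for the chosen $Location.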
Hi Dave, In the meantime, use the refresh command to get your data in sync and avoid delays in your workflow. https://communities.netapp.com/message/124437#124437 BTW, could you let us know some ballpark numbers on what you mean by high change rates? Like how many workflows you will run in an hour, in a day, etc.? Regards adai
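The "refresh, then wait until the cache is current" pattern can be sketched as a simple polling loop. This is a generic illustration: `is_cache_current` stands in for whatever check your environment provides (it is not a WFA API), and the timeouts are arbitrary.

```python
# Hedged sketch: after triggering a datasource acquisition/refresh, poll until
# the cache reflects the change (or a timeout expires) before running the next
# workflow step against it.
import time

def wait_for_cache(is_cache_current, timeout_s=600, poll_s=15):
    """Return True once the cache is current, False if the timeout expires."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if is_cache_current():
            return True
        time.sleep(poll_s)
    return False
```

A bounded wait like this is preferable to a fixed sleep between workflows: it proceeds as soon as the data is current and fails visibly when it never becomes current.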
Hi Roger, Unfortunately, there isn't a way. This is being worked on for an upcoming release. Until then, please reach out to your account team to get a workaround, as we may not be able to share the root password for the WFA DB, even though it's unique for each instance of the WFA server. Regards adai
Thanks Aaron for stepping in. I was about to reply the same to Dave. We will not be able to give you the date in this public forum, but rest assured the release is not very far off. Aaron should be able to get you the exact timelines. Regards adai
Hi, As such there is no support for WFA backup with SnapDrive or the like. The best you can do is take periodic backups of WFA using PowerShell.
- Documentation/details around the backup PS cmdlet are in the "Installation and Setup Guide", under "Backing up the WFA database using PowerShell script". This PS cmdlet is implemented using a REST API for taking backups on the WFA server.
- WFA also takes nightly backups into a WFA-Backups folder on the server at 2:00 AM every night.
- When migrating the backup to a new WFA installation on the DR site, ensure you follow the instructions in the "Installation and Setup Guide", under "Migrating the OCWFA installation". This is absolutely necessary to ensure credentials work fine on the DR WFA server.
Thanks, Adai
Hi Roger, Certifying your command will not solve the issue. For now you will have to handle it either by adding sufficient wait times between workflows so that the cache is updated, or by using a refresh command in your workflow to keep the data current. BTW, the upcoming release of WFA will allow you to take care of reservation even in the case of custom commands that aren't certified. Regards adai
Hi Muru, Why don't you use your Excel sheet as a datasource for your workflow, so that the name/increment is taken care of as part of the workflow itself? Regards adai
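The name/increment logic such a datasource enables can be sketched as below. In practice WFA would read the spreadsheet through a datasource into its cache; here a plain list stands in for the spreadsheet's name column, and the `vol` prefix and three-digit width are illustrative assumptions.

```python
# Hedged sketch: derive the next volume name from names already recorded,
# so the increment happens inside the workflow rather than by hand.
import re

def next_volume_name(existing, prefix="vol", width=3):
    """vol001, vol002, ... -> first unused number after the current maximum."""
    pattern = re.compile(rf"^{re.escape(prefix)}(\d+)$")
    numbers = [int(m.group(1)) for n in existing if (m := pattern.match(n))]
    return f"{prefix}{(max(numbers, default=0) + 1):0{width}d}"
```

Names that don't match the prefix pattern are simply ignored, so unrelated rows in the sheet don't break the increment.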
Hi Patrick, Can you please open a case with NetApp Support? BTW, this is OCUM 5.2 for 7-Mode, right? An ESP against bug 835921? You are facing this issue due to a stale or duplicate entry in your OCUM database. Regards adai
Hi, As Sinhaa said, you can use SNMP traps to trigger workflows using the WFA REST API. BTW, what version of OCUM are you using? If it's 5.x, you can even call a script in the context of an alarm that is tied to an event; this script in turn can call the WFA workflow using the REST API. BTW, could you explain more about what kind of trigger you are looking for? Regards adai
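A script invoked by an alarm would trigger the workflow with an authenticated POST to WFA's REST API. This is a hedged sketch: the endpoint path, the workflow UUID, and the XML element names below are illustrative assumptions, so check your WFA version's REST documentation for the exact contract.

```python
# Hedged sketch: build the URL, XML body, and auth header for a
# workflow-execution POST against a WFA server.
import base64
from xml.sax.saxutils import escape

def build_execution_request(base_url, workflow_uuid, user_inputs):
    """Return (url, xml_body) for executing a workflow with the given inputs."""
    url = f"{base_url}/rest/workflows/{workflow_uuid}/jobs"  # assumed path
    entries = "".join(
        f'<userInputEntry key="{escape(k)}" value="{escape(str(v))}"/>'
        for k, v in user_inputs.items()
    )
    return url, f"<workflowInput><userInputValues>{entries}</userInputValues></workflowInput>"

def basic_auth_header(user, password):
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}", "Content-Type": "application/xml"}

# The actual call from the alarm script would be something like:
#   import urllib.request
#   url, body = build_execution_request("https://wfa-host", "<uuid>", {"VolumeName": "vol042"})
#   req = urllib.request.Request(url, data=body.encode(), headers=basic_auth_header("admin", "secret"))
#   urllib.request.urlopen(req)
```

Keeping the request-building separate from the send makes the payload easy to log and inspect when wiring the alarm script up for the first time.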
Hi Scott, As rightly pointed out by Sinhaa, it's a known issue, fixed in the WFA 2.2GA release. Take a look at bug 802745 for more details on the same. Regards adai
Hi, If I understand your question right: in the same workflow, you want the option to attach a QoS policy and the option to enable (or not) snapshots for the volume. All you have to do is add the command "Create QoS Policy Group" and make the IOPS an Enum input with values 250, 500 & 1000, and in the volume command make Snapshot a Boolean input with TRUE or FALSE for snapshot activation. Hope this helps. Regards adai
Hi Sven, Can you please open a case with NetApp Support and add it to bug #829439? That would give you the timeline for the fixed version, workarounds if any, etc. Regards adai Message was edited by: Adaikkappan Arumugam
That's a very nice workaround, Abhi. But in the long term, I think we should expose some of these tables to end users for compliance, data analytics or reporting purposes. If not tables, at least some views. Regards adai
Hi Sheel, Before I come to your specific questions, let me see if I understood what you are trying to achieve. Are you trying to automate dataset creation on your 8 DFM/Protection Manager servers from WFA, so that you don't have to log in to each individual DFM/PM server? If that's the right understanding, then the simple answer to all your questions is that it's possible, but not completely out of the box. You may have to write some custom commands, for which there are samples out of the box as well as in the community. There may also be a need to add a playground database with your 8 DFM servers for the resource selection. Now coming to your specific questions, please find my responses below.

Do we have any workflow available which can do something like this?
Not out of the box, but there are samples in the communities.

We have approx. 8 DFM servers acting as Protection Manager servers. I am looking for a possibility to query the DFM servers added as a datasource, so that I can select the DFM server where I would like to create a dataset and add volumes to it, and be able to list the available protection policies which can be attached to a dataset.
First we have to see if these are cached in WFA from DFM; if not, either write a custom datasource or, since much of this information is pretty static, we may even use the playground database.

I cannot find any item in the dictionary regarding DFM components, to be able to add or remove objects on the DFM server.
There are dictionary items like Resource Groups, Resource Group Members, Resource Pools and Dataset. You can extend any of these or build new ones as well. There are also the following commands:
- Create Dataset
- Add volume to dataset
- Create Resource Groups
- Add members to Resource Groups
Regards adai
Please take a look at this thread for the pipeline variable: https://communities.netapp.com/thread/36168 Also, an RFE for conditional approval points has been opened. Please open a case with NetApp Support and add it to RFE # 735785. Regards adai