How to Automatically Backup a WFA Database

by Frequent Contributor on ‎2014-08-28 06:56 AM

As WFA becomes an important part of your infrastructure, you may find that automated backups are important to your organization. Here is one method for performing these backups.

A backup.ps1 script exists within the C:\Program Files\NetApp\WFA\bin folder for completing these backups. However, I found that a few customizations helped make the backup process bulletproof.

First, create a copy of the script named backup_cust.ps1. This will be the script you modify.

Modify the defaults in the parameter section. This will allow the script to run without additional switch parameters. I have included the password directly in this script; if that is not acceptable, you may want to export the credentials to a file instead.
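One common way to keep the password out of the script (a sketch, not part of the stock backup.ps1; the file path and variable names are examples) is Export-Clixml, which encrypts the password with DPAPI so only the account that created the file can read it back:

```powershell
# One-time step: run as the account that will run the scheduled task.
# You will be prompted for the WFA user name and password.
Get-Credential | Export-Clixml -Path "C:\WFA_Backups\wfa_cred.xml"

# Then, inside backup_cust.ps1, load the credential instead of hard-coding it:
$cred     = Import-Clixml -Path "C:\WFA_Backups\wfa_cred.xml"
$User     = $cred.UserName
$Password = $cred.GetNetworkCredential().Password
```

Note that the exported file can only be decrypted by the same user on the same machine, so create it under the account the scheduled task will run as.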

Open the script with notepad or a similar text editor.

The changes start at line 3 and run through line 11 of the script.

The backup user must be a valid WFA user name and password.



    [parameter(mandatory=$false, HelpMessage='The user name about to perform the backup')]
    [string]$User = "backup",                 # default added; use a valid WFA user name

    [parameter(mandatory=$false, HelpMessage='The password of the user about to perform the backup')]
    [string]$Password = "backup_password",    # default added; example value only

    [parameter(mandatory=$false, HelpMessage='The full path of the directory to which the backup should be saved')]
    [string]$Path = "C:\WFA_Backups"          # default added; example path only



The script writes to a single backup file by default. However, it may be desirable to maintain a history; modify the following lines to add a date-time code to the file name.

    Try {
        # Create the entire directory structure if not already present, suppress unnecessary output
        New-Item $Path -type directory -force | Out-Null

        # Build the time-stamped file name once so the download and the message agree
        # (HH gives a 24-hour clock; hh would collide AM/PM backups)
        $backupFile = $Path + "/$(Get-Date -f yyyy-MM-dd-HH-mm)_wfa_backup.sql.gz"

        $client.DownloadFile("http://localhost:" + $httpPort + "/rest/backups", $backupFile)

        Write-Output "Backup created at location : "
        Write-Host -BackgroundColor "Green" -ForegroundColor "Black" $backupFile
    }
    # (the script's existing Catch block follows here unchanged)


Save the file and test the modified script from a PowerShell window.

&"C:\Program Files\NetApp\WFA\bin\backup_cust.ps1"

Once validated, create a scheduled task that runs:

powershell -file "C:\Program Files\NetApp\WFA\bin\backup_cust.ps1"
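The task can be registered from an elevated PowerShell prompt. This is a sketch: the task name, schedule, and run-as account are all assumptions you should adjust for your environment (these cmdlets require Windows Server 2012 or later; on older systems use schtasks.exe instead):

```powershell
# Run the backup script daily at 11 PM as SYSTEM (example schedule and account)
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument '-NoProfile -File "C:\Program Files\NetApp\WFA\bin\backup_cust.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 11pm

Register-ScheduledTask -TaskName "WFA Backup" -Action $action -Trigger $trigger `
    -User "SYSTEM" -RunLevel Highest
```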

That's it! Monitor the backup location to confirm you are getting regular backups.

[Extra Credit] A cleanup script may be needed at some point to reduce the total number of backups.
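As a starting point, a retention script could be as simple as the sketch below (the directory, file pattern, and retention count are assumptions; test with -WhatIf first):

```powershell
# Hypothetical cleanup: keep the newest 14 time-stamped backups, delete the rest
$Path = "C:\WFA_Backups"   # assumed backup directory from the examples above

Get-ChildItem -Path $Path -Filter "*_wfa_backup.sql.gz" |
    Sort-Object LastWriteTime -Descending |
    Select-Object -Skip 14 |
    Remove-Item
```

Appending -WhatIf to Remove-Item on the first run shows which files would be deleted without actually removing them.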

adaikkap Former NetApp Employee

Nice work. Also, by default WFA takes a daily backup at around 2 AM and retains the latest two.

abraker Former NetApp Employee

Cheers for that post, Olson.

This might be slightly off topic, but I'm looking at an OCUM 6.1/WFA 2.2 integrated setup and so far have been very impressed with the non-trivial ONTAP protection workflows you can create. Anyhoo, I'm looking at the best way to provide some DR capabilities to this setup. The best practice document for OCUM 6.1 says to snapshot (VM level) the OCUM appliance and replicate the underlying storage (just like any VM guest).

But what about the WFA that OCUM integrates with? The install doco for WFA mentions there are automatic WFA DB backups, as Adai has mentioned. Could I just replicate that backup to the DR site? In a DR scenario, you could set up a new WFA installation and import the backup? BUT what is OCUM going to think about this? I read that the WFA DB backup doesn't include the WFA DB key.

I could script the export of the WFA DB key too (there is a wfa.cmd command to do that) and replicate it along with the WFA database. In the DR event, create a new WFA installation and import the key and DB. Hopefully OCUM would be happy with that.

Thoughts Adai and Olson?

Alternately, I thought about installing WFA on its own VM server and replicating that VM to DR. In the DR event, just boot up the WFA VM from the most recent VM-consistent snapshot (and worst case, import the WFA DB backup that is also located on the VM).


