Active IQ Unified Manager Discussions

When to develop with the ONTAP API vs. the WFA API

arnout

Hi,

I have a customer who is programming an OpsCode Chef integration with the ONTAP API.

I am trying to understand when it is appropriate to program against the ONTAP API and when the WFA API makes more sense.

Thanks,
Arnout

3 REPLIES

rle
NetApp Alumni

Hi Arnout,

It all depends on what you want to do.

WFA is an excellent tool for doing a series of steps.  Let's say you want to repeatedly create a volume with a set of specific characteristics and 3 LUNs.  You could code the API invocations yourself (volume-create, volume-modify, lun-create, etc.), or you could create a WFA workflow and invoke that workflow.  However, if you want to monitor volume or aggregate space utilization, the ONTAP and OnCommand APIs are more appropriate.
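To make the first option concrete, here is a minimal sketch of coding the invocations yourself with the NMSDK Python bindings; the hostname, credentials, SVM, volume parameters, and API version are illustrative assumptions, not values from this thread:

```python
# Minimal sketch: direct ONTAP API calls via the NMSDK Python bindings.
# Hostname, credentials, SVM, and volume parameters are illustrative.
from NaServer import NaServer
from NaElement import NaElement

s = NaServer("cluster1.example.com", 1, 21)  # API major/minor version
s.set_server_type("FILER")
s.set_transport_type("HTTPS")
s.set_port(443)
s.set_style("LOGIN")
s.set_admin_user("admin", "password")
s.set_vserver("svm1")  # tunnel the call to the owning SVM (clustered ONTAP)

api = NaElement("volume-create")
api.child_add_string("volume", "vol_chef_01")
api.child_add_string("containing-aggr-name", "aggr1")
api.child_add_string("size", "100g")

out = s.invoke_elem(api)
if out.results_status() == "failed":
    print("volume-create failed:", out.results_reason())
# ...then repeat for volume-modify, lun-create, etc., wrapping each call
# in your own validation and error handling.
```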

WFA currently relies on OnCommand Unified Manager to collect information from ONTAP.  Some management software vendors don't like this architecture, so they use the ONTAP APIs directly.  Using the ONTAP APIs means more coding and testing, but there is less indirection.

ONTAP and OnCommand APIs use an SDK. WFA APIs use REST.

I hope this helps and others will chime in,

   - Rick - 

mgoddard

Hi Arnout,

One distinction I use is whether it's a point-product solution or a shared solution, with shared solutions favoring a shared storage provisioning point.

If the storage controller is for use by a single project, or if Chef will control all or most of the provisioning on the storage controller for a group of related projects, then coding directly against the ONTAP API is probably a good solution.

Where you have a shared storage environment, and Chef, manual setup, general storage services, or another storage provisioning component also serve other users off the same hardware, WFA can be useful to ensure standard storage layouts are provisioned and to avoid race conditions between multiple provisioning methods competing with each other, since they likely cannot be coordinated. Say, for instance, the last 1 TB of usable space on Tier X could be consumed by both Chef and a manual configuration: both check for free space first, then both do the actual provisioning. The second would fail without thin provisioning, or over-provision too far with it, when it perhaps should have gone to a different aggregate (see the sketch below). Similar problems can happen quite easily with name generation too (e.g. volume names must be unique within a vserver, vservers must be unique within a cluster, etc.).
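As a toy illustration of that race (hypothetical numbers and function names, not a real ONTAP call sequence):

```python
# Toy illustration of the check-then-provision race described above.
# Numbers and function names are hypothetical, not a real ONTAP sequence.
aggr_free_tb = 1.0  # the last 1 TB of usable space on "Tier X"

def check_free_space(size_tb):
    return aggr_free_tb >= size_tb

def provision_volume(size_tb):
    global aggr_free_tb
    aggr_free_tb -= size_tb

# Both provisioners check first, seeing the same snapshot of free space...
chef_ok = check_free_space(1.0)     # True
manual_ok = check_free_space(1.0)   # also True

# ...then both provision. Without thin provisioning the second create
# fails; with it, the aggregate is over-provisioned further than intended.
if chef_ok:
    provision_volume(1.0)
if manual_ok:
    provision_volume(1.0)

print(aggr_free_tb)  # -1.0: both consumed the same last terabyte
```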

WFA also has a fair amount of capability built in that would otherwise need to be replicated, including sequential evaluation and 'reserving' elements that have yet to be provisioned, which allows a degree of concurrency (and with it performance headroom); the sketch below shows the idea.
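Roughly, the reservation idea looks like this (a hypothetical sketch of the pattern, not WFA's actual implementation):

```python
# Hypothetical sketch of the 'reservation' pattern: planned-but-unprovisioned
# space is deducted up front, so concurrent plans see a consistent view.
import threading

class AggregatePlanner:
    def __init__(self, free_tb):
        self.free_tb = free_tb
        self.reserved_tb = 0.0
        self._lock = threading.Lock()

    def reserve(self, size_tb):
        """Atomically reserve space at planning time, before provisioning."""
        with self._lock:
            if self.free_tb - self.reserved_tb >= size_tb:
                self.reserved_tb += size_tb
                return True
            return False  # caller should plan against another aggregate

    def commit(self, size_tb):
        """Called once the volume is actually created on the controller."""
        with self._lock:
            self.reserved_tb -= size_tb
            self.free_tb -= size_tb

planner = AggregatePlanner(free_tb=1.0)
print(planner.reserve(1.0))  # True: the first plan gets the space
print(planner.reserve(1.0))  # False: the second plan is redirected
```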

For any cluster or storage controller (7-Mode), it's usually a good idea to ensure there is only one method of provisioning storage.

Hope that helps!

cheers,

Michael.

hill

Hi,

I'll chime in as well, even though Rick and Michael provided the same answer I would, with a slight twist.  I'll sum it up in two different ways:

  1. It depends on how badly you want to 'roll your own'
  2. It depends on how much you want to use "certified/supported" automated processes.

So... for the first one, this is where you'd basically do everything via Chef: making your own front end, deciding how it should look/feel to your customers, and coding any storage or end-to-end manageability processes you want... every single line of automation, validation, error collection, and so on.

     Advantages:

    • You own everything

     Disadvantages:

    • You own everything... down to the last nut and bolt.  This is not to say NetApp will not deal with a direct ONTAP issue, but the inputs, timing, validation, and error correction from the UI initiating against ONTAP will definitely be suspect.


For the second method, you could simply use Chef for your front end and identify the WFA workflows to call via REST... and in this way, use WFA as your engine to initiate your storage, or your end-to-end automation tasks (see the sketch after the lists below).  There are limitless ways to divide up what WFA does or does not do... that choice is totally up to you.

     Advantages:

    • You don't own everything.
    • You create your own look/feel and call an engine for automation.
    • Your processes follow the same validations & error corrections.
    • You leverage certified/supported content available within the 'engine' (i.e. able to get support).
    • You can pull in data/information to facilitate intelligent automation.
    • You can automate beyond storage if necessary.

     Disadvantages:

    • You depend on a vendor product for your automation.
    • Some of the certified content might not meet your needs... so you need to create your own (community-supported) content.
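As a sketch of that second method, the REST call from your front end might look like the following; the server name, workflow name, and input keys are illustrative, and the exact resource paths and XML shapes should be checked against your WFA version's REST documentation:

```python
# Sketch: execute a WFA workflow over REST from a custom front end.
# Server, credentials, workflow name, and input keys are illustrative;
# verify paths and XML shapes against your WFA version's REST docs.
import requests
import xml.etree.ElementTree as ET

WFA = "https://wfa.example.com/rest"
AUTH = ("wfa_user", "password")

# 1. Look up the workflow by name to get its UUID.
resp = requests.get(WFA + "/workflows", params={"name": "Create Volume"},
                    auth=AUTH, verify=False)
uuid = ET.fromstring(resp.text).find("workflow").get("uuid")

# 2. Execute it by POSTing the user inputs as XML to its /jobs resource.
body = """<workflowInput>
  <userInputValues>
    <userInputEntry key="VolumeName" value="vol_chef_01"/>
    <userInputEntry key="SizeGB" value="100"/>
  </userInputValues>
</workflowInput>"""
resp = requests.post(WFA + "/workflows/" + uuid + "/jobs", data=body,
                     auth=AUTH, verify=False,
                     headers={"Content-Type": "application/xml"})
print(resp.status_code)  # the job resource to poll comes back in Location
```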

So... while I am heavily weighted toward using WFA as an engine (which I do today... with a different front end), the choice is totally up to you.  There are many advantages to utilizing the WFA database and its 'cache' of information pulled from a variety of datasources.  That said, I've worked with several customers who identified reasons they "can't" use WFA and decided to roll their own... typically with a variety of Python scripts leveraging the NMSDK.

Just trying to provide a different context.  WFA is very, very useful and should meet your needs... but if you decide to 'roll your own', you have that option as well.

Good luck in your decision!

Kevin

