Active IQ Unified Manager Discussions

Cannot get Datasets to be in conformance with OnCommand Host Package and VMware

dgary

I have the OnCommand Host Package installed in vCenter. I'm trying to create a backup job of a datastore. I create a policy and a dataset and select the datastore I want to back up as a resource. It all seems correct. Whenever it runs I get a conformance error of:

Communications failure with host service ONCOMMAND (1339). Exception: Push operation (1448,1339) failed to push updates to HS. Reason: Communications fault in update of Host Service: SOAP 1.2 fault: SOAP-ENV:Sender[no subcode] "Exception of type 'NetApp.HostServices.SmException' was thrown.-Value does not fall within the expected range.-An error occurred when calling create job web service-An error occured while processing AppDataset,Policy update-Exception of type 'NetApp.HostServices.SmException' was thrown.-Value does not fall within the expected range.-An error occurred when calling create job web service" Detail: 00Exception of type 'NetApp.HostServices.SmException' was thrown.-Value does not fall within the expected range.-An error occurred when calling create job web service-An error occured while processing AppDataset,Policy update0001-01-01T00:00:000Unknown

Push dataset New Dataset (1448) configuration to host service ONCOMMAND
What am I doing wrong? This worked great with VSC, which has been uninstalled because of the new OnCommand installation.
Please advise; this error message could not be more cryptic.
17 REPLIES

kennethostnes

Hi,

Did you find a solution to this problem? I have the same problem.

adaikkap

Can you give more details about your environment, like:

dfm version

Host Package version

vSphere version

and the exact error message?

Regards

adai

kennethostnes

Hi,

We are running the following versions:

- dfm version: OnCommand 5.02

- Host Package version: 1.1

- vSphere/vCenter version: 5 with the latest updates

See the attached jpg file for the error message.

Regards

Kenneth

svijay

Hi,

Greetings!

These are a few things you may want to try:

1. Your HS service might be down or unavailable. You may want to check whether the HS service is up (a quick check is sketched below).

2. If you are hitting this error when you are trying to create a monthly schedule, you could try creating a weekly schedule and check whether you are able to conform.

Also, if possible, please provide the output of the following command:

# dfpm policy schedule contents
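
For the check in point 1, a rough sketch from the DFM server console (sample output of dfm hs list appears further down in this thread):

# dfm hs list                      <<<< the Status column for the host service should read "up"
# dfm hs diag <HS Id or Name>      <<<< optional deeper check of connectivity between DFM and the host service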

Regards,

Vijay

kennethostnes

Hi,

Thanks,

Answers to your questions:

1. The HS service seems fine and is up and running. See my earlier reply.

2. We are hitting this error on all schedules. I have tried the following:

- Local backups to primary with built-in policies and schedules (templates).

- Local backups to primary with my own policies (based on templates) and schedules.

- Local backups to primary and backup to secondary using a storage service.

Regards

Kenneth

kryan

This screenshot appears to have been taken from the NMC; however, virtual datasets are managed in the Web UI.

Was this virtual dataset created in that UI? 

In addition, the protection policy is assigned via a storage service.  Here is an example:

1. If there is a need to protect the dataset with more than a local VM-style policy:
- Using the NMC, create a storage service that has the appropriate Protection Policy attached. The transfer schedule and retention settings are attached to the policy.
- NMC->Policies->Storage Services. Click on Add. Select the appropriate options, resource pools, provisioning policies, etc.

*** Use the OnCommand Console UI for the remaining steps ***


2. Create a dataset (Datasets -> Create -> pick the dataset type) and attach the storage service created in step #1.  Select the Local Policy as appropriate.  This dataset will now be visible in the OnCommand Console.
NOTE: It is not possible to add Virtual Servers to the dataset via the NMC.

3. The dataset will show as type undefined. Edit the dataset and select "vmware".

4. Allow conformance to initialize the dataset.

5. Resolve any errors that arise (a quick CLI sanity check is sketched below).
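
As a rough sanity check after step 5 (exact output will vary with your DFM version), the dataset and host service can also be confirmed from the DFM server CLI:

# dfpm dataset list                <<<< the new dataset should appear in the list
# dfm hs list                      <<<< the host service that manages vCenter should show status "up"
# dfm hs diag <HS Id or Name>      <<<< checks connectivity between the DFM server and the host service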

kennethostnes

Hi,

That is correct; the picture was taken from the NMC console. Here is one from the OnCommand web interface.

The dataset was created from the OnCommand web interface, not from the NMC.

The intention is to get the dataset up and running with backup to secondary storage, using a storage service with an assigned protection policy. But at the moment I am only testing with a local policy, since I got this error.

When I used a storage service with an assigned protection policy, it initialized the dataset (established the SnapVault relationship to secondary), and after the initialization finished, the same error message came up.

Regards

Kenneth

kennethostnes

Hi again,

This is the error message from the dfm.txt log on the DFM server.

Jul 24 12:34:07 [dfmserver:ERROR]: [1824:0x159c]: The operation failed on host service 12VMM01 (421).

Reason:

Communications fault in update of Host Service: SOAP 1.2 fault: SOAP-ENV:Sender[no subcode] "Could not create task, please check if name of the task is missing--An error occurred when calling create job web service-An error occured while processing AppDataset,Policy update-Could not create task, please check if name of the task is missing--An error occurred when calling create job web service"

Detail:

<SMFault xmlns="http://www.netapp.com/hostservices/v1/common" xmlns:i="http://www.w3.org/2001/XMLSchema-instance"><MessageId>0</MessageId><LogLevel>0</LogLevel><Source i:nil="true"/><Category i:nil="true"/><MessageDetail>Could not create task, please check if name of the task is missing--An error occurred when calling create job web service-An error occured while processing AppDataset,Policy update</MessageDetail><TimeStamp>0001-01-01T00:00:00</TimeStamp><ResultCode>0</ResultCode><ResolutionGuidance i:nil="true"/><Detail/><Messages i:nil="true"/><OperationId/><OperationStatus>Unknown</OperationStatus></SMFault>

Regards

Kenneth

pukale

Hi,

Please check the storage mapping for the VM, i.e. that the filer credentials for the VM datastore are set in DFM. To check this, run the commands below.

1. dfm hs list - the HS status should be "up". Here is an example:

[root@boxhole-vm2 release_ibm]# dfm hs list
Id         Host Name                                Host Address         Version    Status                 Timezone
---------- ---------------------------------------- -------------------- ---------- ---------------------- ----------------------------------------
1192       boxhole-vm1                              10.60.231.60         1.2.0.1518 up                     GMT+4:00(4 hours West of UTC).

2. dfm hs controller list <HSID> - the controller on which your VM is present should have status "good". Here is an example:

[root@boxhole-vm2 release_ibm]# dfm hs controller list 1192
Id         Controller Name                          Access Protocol Login Status (Host Service) Login Status (Server)
---------- ---------------------------------------- --------------- --------------------------- ---------------------
151        sin.rtp.netapp.com                       https           good                        good

If the credentials are not set, you can set them using the commands below. Example:

# dfm host set hostLogin=root hostPassword=xxx sin.rtp.netapp.com
# dfm hs controller setlogin 1192 151                      <<<< where 1192 is the HS id and 151 is the filer id

thanks

santosh

kennethostnes

Hi,

The storage mapping for the VM is fine and the filer credentials are set.

- dfm hs list - shows the id (421), the correct host name, the correct host IP address, and version (1.1.0.1512) with status up.

- dfm hs controller list 421 - shows all three storage controllers with https access and login status good/good.

Regards

Kenneth

svijay

Hi Kenneth,

I would suggest you open a case with NetApp Support if you haven't already done so.

Regards,

Vijay

kennethostnes

Hi Vijay,

I have not opened a case yet, but I am going to do that.

Regards

Kenneth

kjag

Hi Kenneth,

Could you please paste the output of 'dfm hs diag <HS Id or Name>'?

-KJag

kennethostnes

Hi,

This is the result of the dfm hs diag command.

(I have replaced the IP addresses with x.x.x.x.)

C:\>dfm hs diag 421

Network Connectivity

IP Address                    x.x.x.x
FQDN                          12VMM01.xxxx
Admin Port                    8699
HTTPS                         Passed (30 ms)
Plugin Reachable              Yes

DataFabric Manager server configuration

Port                          8488
IP Address/DNS                x.x.x.x
DFM Reachable                 Yes

According to:                 DataFabric Manager server              Host Service

Management Port               8799                                   8799
Host Service version          1.1.0.1512                             1.1.0.1512

Plugin Information

Plugin version                1.1.0.0                                1.1.0.0
Plugin Type                   OnCommand Host Service VMware Plug-in  OnCommand Host Service VMware Plug-in

Regards

Kenneth

kennethostnes

Hi,

I am still struggling with this problem. I have upgraded to OnCommand Core 5.1 and OnCommand Host 1.2. I also opened a case with NetApp in July.

Regards

Kenneth

adaikkap

Can you please pass on the case #?

kennethostnes

The case number is xxxxxxx, but we are going to close the case. The problem was fixed by installing a new vCenter/vSphere server and moving all the ESX hosts and VMs to the new server. All datasets now conform and work fine with local backups (local policy) and remote backups (storage service).

Restore, mount, and unmount of dataset backups have been tested.

/Kenneth
