ONTAP Discussions

unable to SSH to Cluster mgmt - ONTAP 9

gbcob77

Hello,

 

I have set up a two-node cluster across two VMs. When I attempt to SSH into the mgmt interface via PuTTY, I receive "Error: Connection Refused". I am able to ping the address assigned to the mgmt interface. I am using the admin account, and as far as I know, SSH access is enabled by default. Is there something else I must do in order to SSH to this?

 

 

any help/advice is appreciated.

 

Thanks


11 REPLIES

Sahana

Hi,

 

SSH is enabled by default. You need to use the cluster management IP address. Refer to http://docs.netapp.com/ontap-9/index.jsp?topic=%2Fcom.netapp.doc.dot-cm-sag%2FGUID-8377119E-2B13-42B5-A041-0D0E1B82CBB9.html
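To double-check that SSH really is enabled for the admin account, you can also look at the login entries from the CLI (a quick sanity check; output omitted here):

::> security login show -user-or-group-name admin -application ssh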

If this post resolved your issue, help others by selecting ACCEPT AS SOLUTION or adding a KUDO.

sivakarthik

We have recently installed three clusters and faced the same problem. Sometimes SSH access is not enabled by default for "admin". Log in to the cluster GUI using https://cluster_mgmt_IP.

 

 

Go to Configurations --> Users --> select 'admin' --> click Edit --> Add --> select SSH in the Application drop-down list --> select admin in the Role drop-down list --> click OK --> Modify.
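Roughly the same thing can be done from the CLI, assuming the built-in admin role is what you want (a sketch, adjust to your environment):

::> security login create -user-or-group-name admin -application ssh -authentication-method password -role admin
::> security login show -user-or-group-name admin -application ssh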

 

This should resolve your access issue.

Rinku02Bansal

Hello,

 

I have set up a two-node cluster across two VMs. When I attempt to SSH into the mgmt interface via PuTTY, I receive the error below. I am able to ping the address assigned to the mgmt interface and am also able to SSH to the node interface address, but not to the cluster mgmt address. Is there something else I must do in order to SSH to this?

 

You are accessing ViPR. By using this system you consent to the owning organization's terms and conditions.   <-- every time it says "Access denied"
Using keyboard-interactive authentication.
Password:
Access denied
Using keyboard-interactive authentication.
Password:

 

Configuration of the cluster:

 

            Logical    Status     Network            Current       Current Is
Vserver     Interface  Admin/Oper Address/Mask       Node          Port    Home
----------- ---------- ---------- ------------------ ------------- ------- ----
Cluster
            cluster90-01_clus1
                       up/up      169.254.3.43/16    cluster90-01  e0a     true
            cluster90-01_clus2
                       up/up      169.254.3.53/16    cluster90-01  e0b     true
            cluster90-02_clus1
                       up/up      169.254.102.78/16  cluster90-02  e0a     true
            cluster90-02_clus2
                       up/up      169.254.102.88/16  cluster90-02  e0b     true
cluster90
            cluster90-01_mgmt1
                       up/up      192.168.32.65/24   cluster90-01  e0c     true
            cluster90-02_mgmt1
                       up/up      192.168.32.66/24   cluster90-02  e0c     true
            cluster_mgmt
                       up/up      192.168.32.64/24   cluster90-01  e0d     true
7 entries were displayed.

 

 

any help/advice is appreciated.

 

Thanks

MarcSchindler

Did you find a solution, or does the issue still exist?

I would first check the firewall policy on the mgmt LIF:

::> net int show -vserver <vserver> -lif svm_lif_mgmt -fields firewall-policy,address

vserver    lif           address  firewall-policy
---------- ------------- -------- ---------------
<vserver>  svm_lif_mgmt  <IP>     mgmt    <-- be aware, a policy of "data" is wrong for this use case
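If the policy turns out to be wrong, it can be put back to mgmt along these lines (vserver/LIF names are placeholders):

::> network interface modify -vserver <vserver> -lif cluster_mgmt -firewall-policy mgmt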

drayfus

 

 

 

MarcSchindler

@drayfus wrote:

 

 

 


Any question? I see only an empty post.

 

Brgds, Marc

JuliX

Hello everyone,

 

I have a similar problem to the one described before. My setup is a two-node cluster. If the cluster mgmt IP is hosted on the e0M port of the first controller, I am able to ping the IP address but cannot log in via SSH; I get the error message "connection refused...". If I migrate the interface to the e0M port of the second controller, I am able to log in via SSH.

The node mgmt IP address on the first node does not work either. The strange thing is that I am able to log in via SSH to the service processor of the first node, which uses the same physical port (e0M).
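For reference, this is roughly how I move the LIF between the two e0M ports (cluster and node names are placeholders):

::> network interface migrate -vserver <cluster_name> -lif cluster_mgmt -destination-node <node-02> -destination-port e0M
::> network interface revert -vserver <cluster_name> -lif cluster_mgmt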

Does anyone here have the same problem?

 

kind regards

JuliX

MARIOWEISE

Hello,

 

I have the same problem with a cluster (part of a 2-node MetroCluster) running ONTAP 9.3P5 as well. The firewall policies are the same on both MCC clusters, the mgmt LIFs have the correct firewall policy (mgmt) attached, and the 2nd cluster works as expected.

 

What I tried so far:

SSH connection from different Clients -> SSH does not work on any client

ping to cluster_mgmt and node_mgmt -> works

https to system manager via cluster-mgmt -> works

ssh to both interfaces (cluster_mgmt and node_mgmt) -> doesn't work

migrating the LIFs away from e0M to a VLAN tagged ifgrp -> doesn't help

bringing the interfaces down and up again -> doesn't help

disable firewall on node -> doesn't help (commands shown below the list)
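The firewall part from the list above was just this, only for testing (node name is a placeholder):

::> system services firewall show
::> system services firewall modify -node <node_name> -enabled false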

 

Opened a support ticket with NetApp.

 

Regards

Mario

drayfus
I was only able to track down this issue after going into the systemshell
from diag mode and poking around in the log files. It turned out that the
underlying user id was wrong and different on the separate nodes. One node
had the user id as X and the other node had it as Y. Once I corrected
that, ssh worked properly.
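Getting in there goes roughly like this (diag privilege required; the node name is a placeholder, and I don't recall exactly which files under /mroot/etc/log/mlog I ended up checking):

::> set -privilege diag
::*> systemshell -node <node_name>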

Doug

netappnic

Hi Mario,

 

I've got exactly the same problem as you..

 

Have you any solution from netapp support?

 

Thanks & best regards,

Nicolas

MARIOWEISE

Found the case and the solution:

 

"Hello,

Thanks for the logs.

We have been able to identify the known issue : https://mysupport.netapp.com/site/bugs-online/product/ONTAP/BURT/1118890

Our recommendation is to perform an upgrade to a fixed version."

 

So an update it is ...
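For anyone checking where they stand before planning the upgrade, the running version is just:

::> version
::> cluster image show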

 

Regards

Mario
