SSH between nodes problem
I have been running into problems with SSH lately. Please find the logs attached to this topic; they show one serious problem: the SSH connection is refused. Here is what I do:
1. start-domain, change-admin-password, enable-secure-admin, restart-domain
2. Generate DSA and RSA key pairs and put id_dsa.pub and id_rsa.pub into /.ssh/ (I am running as root, so user.home is /) in a template virtual machine that will later be cloned
3. When the cloned virtual machine boots, I execute a script that does this:
cat /mnt/cust/id_dsa.pub | tee -a /.ssh/known_hosts > /dev/null
cat /mnt/cust/id_dsa.pub | tee -a /.ssh/authorized_key > /dev/null
cat /mnt/cust/id_dsa.pub | tee -a /.ssh/authorized_key2 > /dev/null
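For comparison, here is a sketch of how that clone-boot step would usually be written with the standard OpenSSH filenames. The `install_keys` helper and the demo paths are hypothetical stand-ins for /mnt/cust and / from the script above; note that sshd reads user public keys from `authorized_keys` (the `authorized_keys2` name is only a legacy fallback), while `known_hosts` stores *host* keys, so user public keys do not belong there.

```shell
#!/bin/sh
# install_keys is a hypothetical helper; src stands in for /mnt/cust
# and home for / (root's home directory in the post).
install_keys() {
  src=$1; home=$2
  mkdir -p "$home/.ssh" && chmod 700 "$home/.ssh"
  # sshd reads user public keys from authorized_keys;
  # authorized_keys2 is only a deprecated fallback name, and
  # known_hosts holds host keys, so user keys do not go there.
  cat "$src"/id_dsa.pub "$src"/id_rsa.pub >> "$home/.ssh/authorized_keys"
  chmod 600 "$home/.ssh/authorized_keys"
}

# Demo against a throwaway directory instead of / and /mnt/cust,
# with placeholder key material.
tmp=$(mktemp -d)
mkdir -p "$tmp/cust"
echo "ssh-dss AAAAB3... root@template" > "$tmp/cust/id_dsa.pub"
echo "ssh-rsa AAAAB3... root@template" > "$tmp/cust/id_rsa.pub"
install_keys "$tmp/cust" "$tmp/home"
wc -l < "$tmp/home/.ssh/authorized_keys"   # one line per key
```

This is just a sketch of the conventional layout, not a claim that the filenames alone explain the failure ("Connection refused" happens before any key exchange).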
So basically, I have private and public keys, and the public keys are distributed to all created nodes; nevertheless, "Connection refused" is thrown. If I ssh from the command line, everything is OK.
EDIT: I noticed a double '/' in the logs in the id_rsa key path:
[#|2012-07-07T10:28:47.991+0200|WARNING|44.0|javax.enterprise.system.tools.admin.com.sun.enterprise.v3.admin.cluster|_ThreadID=21;_ThreadName=Thread-2;|Could not connect to host 10.0.2.18 using SSH.: There was a problem while connecting to 10.0.2.18:22: Connection refused: host=10.0.2.18 port=22 user=root password=null keyFile=//.ssh/id_rsa keyPassPhrase=null authType=null knownHostFile=/.ssh/known_hosts|#]
Could that be causing the problem?
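For what it's worth, repeated slashes inside a pathname collapse to a single slash under POSIX pathname resolution, so `//.ssh/id_rsa` should name the same file as `/.ssh/id_rsa`. (A path starting with exactly two slashes is technically implementation-defined, but Linux treats it like one slash.) A quick check, using /etc as a stand-in path:

```shell
# On Linux the leading double slash is collapsed, so this
# canonicalizes //etc back to /etc.
readlink -f //etc
```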