
I am currently trying to get into Ansible, and for that use case I have set up a cluster of 3 VMs using VirtualBox and Vagrant. My VM setup looks like this:

Vagrantfile

$inline_m1 = <<SCRIPT
yum -y update

yum install -y git
yum install -y ansible

SCRIPT

$inline_n1_n2 = <<SCRIPT
yum -y update

yum install -y git

SCRIPT

Vagrant.configure(2) do |config|
 config.vm.define "master1" do |conf|
    # conf.vm.box = "peru/my_centos-7-x86_64"
    # conf.vm.box_version = "20181211.01"
    conf.vm.box = "centos/7"

    conf.vm.hostname = 'master1.vg'
    conf.vm.network "private_network", ip: "192.168.255.100"
    conf.vm.provider "virtualbox" do |v|
        v.memory = 6144
        v.cpus = 2
    end
    conf.vm.provision "shell", inline: $inline_m1
    conf.vm.provision "file", source: "./etc.hosts", destination: "~/etc/hosts"
    conf.vm.provision "file", source: "./master1/etc.ansible.hosts", destination: "~/etc/ansible.hosts"
 end

 config.vm.define "node1" do |conf|
    conf.vm.box = "centos/7"
    conf.vm.hostname = 'node1.vg'
    conf.vm.network "private_network", ip: "192.168.255.101"
    conf.vm.provision "file", source: "./etc.hosts", destination: "~/etc/hosts"
    conf.vm.provision "shell", inline: $inline_n1_n2
 end

 config.vm.define "node2" do |conf|
    conf.vm.box = "centos/7"
    conf.vm.hostname = 'node2.vg'
    conf.vm.network "private_network", ip: "192.168.255.102"
    conf.vm.provision "file", source: "./etc.hosts", destination: "~/etc/hosts"
    conf.vm.provision "shell", inline: $inline_n1_n2

 end
end

So it is one master and two nodes. The master is supposed to have Ansible installed and access the nodes via SSH. All machines are up and running, and I can connect to my master using

vagrant ssh master1

I have also modified /etc/hosts so I can reach master1.vg, node1.vg, etc.
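
For reference, matching the IPs and hostnames from the Vagrantfile above, that file presumably contains something like:

192.168.255.100 master1.vg
192.168.255.101 node1.vg
192.168.255.102 node2.vg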

But there is one problem: I am supposed to connect to the nodes via SSH from inside the master, but

ssh node1.vg

will not work; permission is denied after asking for a password. According to the documentation the default password should be "vagrant", but that is not the case here (I guess because access is already set up to use SSH with a key). I have googled quite a bit, as I thought this would be a common question, but found no satisfying answers. Do you have any idea how to make an SSH connection from the master1 VM to one of the node VMs?

I've also uploaded the config to a repo (https://github.com/relief-melone/vagrant-ansibletestingsetup).

relief.melone
  • Just to clarify if you try for example `ssh vagrant@127.0.0.1 -p {port your vagrant machine is on}` do you still receive a permission denied? – Kevin Dec 30 '18 at 22:54
  • Yep, if I'm on the master I'll get a connection refused with `ssh vagrant@127.0.0.1 -p ...` and also with `ssh vagrant@192.168.255.101 -p ...`. When I just use `ssh 192.168.255.101` I get "The authenticity of host ... can't be established. ECDSA key fingerprint is ... Are you sure you want to continue ..."; I confirm and then get "Permission denied (publickey,gssapi-keyex,gssapi-with-mic)". – relief.melone Dec 30 '18 at 23:13
  • Great question, I am stumped as well. Try tracking down where the connection is refused by using `ssh -vvv` when trying to connect. – Kevin Dec 30 '18 at 23:21
  • I'll put this in a different comment: what I was able to do from outside the VM is connect without using Vagrant. For that I run `vagrant ssh-config node1`, note [Port], [User] and [IdentityFile], and what then works is `ssh -i [IdentityFile] -l [User] -p [Port] localhost` – but only with localhost; 127.0.0.1 or 192.168.255.101 fail with connection refused. And even with localhost I get the message that the authenticity of the host could not be established and have to confirm that I still want to connect. – relief.melone Dec 30 '18 at 23:21

1 Answer


OK, I solved it. Vagrant generates a private key for each machine; you need to get those keys into your master VM with the correct permissions, and you also need to set up your network correctly. So let's tackle the network part first.

Your /etc/hosts has to be set up. In my setup it looks like this:

/etc/hosts

192.168.255.100 master1.me.vg
192.168.255.101 node1.me.vg
192.168.255.102 node2.me.vg
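
To check that the names actually resolve, something like the following works on any of the boxes:

getent hosts node1.me.vg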

Your private keys are stored in ./.vagrant/machines/nodeX/virtualbox/private_key. You need the keys of all the nodes you want to access from your master, which leaves us with the following:

Vagrantfile

Vagrant.configure(2) do |config|

   config.vm.define "node1" do |conf|
      conf.vm.box = "centos/7"
      conf.vm.hostname = 'node1.me.vg'
      conf.vm.network "private_network", ip: "192.168.255.101"

      conf.vm.provision "file", source: "./etc.hosts", destination: "~/etc.hosts"

      conf.vm.provision "shell", path: "./node/shell.sh"

   end

   config.vm.define "node2" do |conf|
      conf.vm.box = "centos/7"
      conf.vm.hostname = 'node2.me.vg'
      conf.vm.network "private_network", ip: "192.168.255.102"

      conf.vm.provision "file", source: "./etc.hosts", destination: "~/etc.hosts"

      conf.vm.provision "shell", path: "./node/shell.sh"

   end
   config.vm.define "master1" do |conf|
      conf.vm.box = "centos/7"

      conf.vm.hostname = 'master1.me.vg'
      conf.vm.network "private_network", ip: "192.168.255.100"

      conf.vm.provider "virtualbox" do |v|
          v.memory = 6144
          v.cpus = 2
      end
      conf.vm.provision "file", source: "./etc.hosts", destination: "~/etc.hosts"
      conf.vm.provision "file", source: "./master1/etc.ansible.hosts", destination: "~/etc.ansible.hosts"
      conf.vm.provision "file", source: "./.vagrant/machines/node1/virtualbox/private_key", destination: "~/keys/node1"
      conf.vm.provision "file", source: "./.vagrant/machines/node2/virtualbox/private_key", destination: "~/keys/node2"

      conf.vm.provision "shell", path: "./master1/shell.sh"
   end

end
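
If you are unsure which key Vagrant generated for a machine, vagrant ssh-config prints it; typical output looks roughly like this (host name, port, and paths will differ on your system):

vagrant ssh-config node1

Host node1
  HostName 127.0.0.1
  User vagrant
  Port 2200
  IdentityFile /path/to/project/.vagrant/machines/node1/virtualbox/private_key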

Finally, you have to set the permissions on the private keys, as SSH will later reject a key whose permissions are too open. My shell files look like this:

./master1/shell.sh

yum -y update

yum install -y git
yum install -y ansible

cp /home/vagrant/etc.hosts /etc/hosts
cp /home/vagrant/etc.ansible.hosts /etc/ansible/hosts

chmod 600 /home/vagrant/keys/*
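
The contents of etc.ansible.hosts are not shown here (they are in the repo); a minimal inventory sketch that points Ansible at the uploaded keys could look like this (the group name "nodes" is just an example):

[nodes]
node1.me.vg ansible_user=vagrant ansible_ssh_private_key_file=/home/vagrant/keys/node1
node2.me.vg ansible_user=vagrant ansible_ssh_private_key_file=/home/vagrant/keys/node2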

./node/shell.sh

yum -y update

yum install -y git

cp /home/vagrant/etc.hosts /etc/hosts

After all that is done

vagrant up

should run smoothly, and you can get onto your master VM using

vagrant ssh master1

In that master you can now connect to, for example, the node2 machine using

ssh -i ~/keys/node2 vagrant@node2.me.vg
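
If you want a plain "ssh node2.me.vg" to work without the -i flag, you could additionally put a ~/.ssh/config on the master along these lines (a sketch, not part of the repo):

Host node1.me.vg
  User vagrant
  IdentityFile ~/keys/node1

Host node2.me.vg
  User vagrant
  IdentityFile ~/keys/node2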

As this setup involves quite a few files, I have also put it into a repo, which can be found here:

https://github.com/relief-melone/vagrant-ansibletestingsetup/tree/working-no-comments

relief.melone