
I've been working on a project whose goal is to manage installs and updates on a fleet of Ubuntu nodes using Ansible.

What is the issue?

To connect to other nodes with Ansible, I need SSH access to them, which means I need to distribute my public key. I would like to automate that process as part of a brand-new workstation setup.

What have I tried so far?

I've thought about modifying the initial .iso installer (using Cubic) to add an admin_user. This way I could connect to the managed node with that admin account and configure the SSH key, roughly as sketched below. However, the user created during the installation process gets overwritten by my baked-in user: the machine ends up with only the admin_user, and the account created by the installer is ignored.
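For context, the follow-up step I have in mind looks roughly like this; a minimal sketch, assuming the admin_user already exists on the node (the group name `new_workstations` and the key path are placeholders):

```yaml
# Push the controller's public key to freshly installed nodes,
# authenticating as the admin_user baked into the image.
- hosts: new_workstations
  remote_user: admin_user
  become: true
  tasks:
    - name: Install the controller's public key for admin_user
      ansible.posix.authorized_key:
        user: admin_user
        state: present
        key: "{{ lookup('file', '~/.ssh/id_ed25519.pub') }}"
```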

Thanks in advance

  • It really depends on how you want to do your initial deployment. For example, https://ubuntu.com/server/docs/install/autoinstall leans on cloud-init. Cloud-init has many ways to configure/adjust newly installed systems: it lets you easily set up a custom user for Ansible, make it a member of the `sudo` group to grant sudo rights, and add your public key to that user's `~/.ssh/authorized_keys` file. See these [examples](https://cloudinit.readthedocs.io/en/22.4.2/topics/examples.html) and the sketch after these comments. – diya Jan 11 '23 at 15:57
  • I have read about cloud-init and discovered that it lets VMs autoinstall, and thus lets you configure a sudo user. However, I haven't found any info about real workstations (which are the end goal). Does it apply to physical machines as well? If not, are there other solutions? Does it work on Ubuntu 22.04? Thanks! – Mr. Folder Jan 13 '23 at 17:05
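To make the autoinstall suggestion concrete, here is a minimal, untested sketch of an autoinstall `user-data` file; the hostname, usernames, password hash, and public key are placeholders. The `identity` section creates the regular user (as the interactive installer would), while the pass-through `user-data` section adds the dedicated Ansible admin on first boot, so the two accounts no longer clobber each other:

```yaml
#cloud-config
autoinstall:
  version: 1
  # `identity` creates the regular workstation user, like the
  # interactive installer would, so that account is not lost.
  identity:
    hostname: ws-01
    username: localuser
    password: "$6$replace.with.a.crypted.hash"
  # Everything under `user-data` is plain cloud-init config applied
  # on first boot; here it adds a dedicated Ansible admin account.
  user-data:
    users:
      - default   # keep the default/identity user in the user list
      - name: ansible_admin
        groups: [sudo]
        sudo: "ALL=(ALL) NOPASSWD:ALL"
        shell: /bin/bash
        ssh_authorized_keys:
          - ssh-ed25519 AAAA...replace-with-your-public-key ansible@controller
```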

0 Answers