I have a folder of files and subfolders, including some template files, that needs to be deployed to a remote machine. Before deploying these files and folders, I need to back them up on the control machine (where Ansible is running). The number of files to be deployed is not fixed and may change. Is it possible to do this in Ansible?
Below are my Ansible playbook and global variables file, which work up to the extract step.
---
#Global vars
aws_access_key: xxx
aws_secret_key: xxx
aws_bucket: s3-artifacts
aws_bucket_download: /deployment/trial/deployment.tar.gz
local_download: deployment.tar.gz
current_dir: /root/tools/deployment/trial/
local_extract_dir: extracted
deployment_usr: root
deployment_grp: root
mode: "0775"   #quoted so Ansible treats it as an octal file mode rather than a plain number
...
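(For context, these variables are pulled into the play below with vars_files; here is a minimal sketch of that wiring — the file name global_vars.yml is just a placeholder, not my actual path:)

---
#Sketch only: load the global variables file into the play.
#"global_vars.yml" stands in for wherever the vars file above actually lives.
- hosts: web
  vars_files:
    - global_vars.yml
  tasks:
    - name: Check that the variables are available
      debug: msg="Will extract to {{ current_dir }}/{{ local_extract_dir }}"
...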
---
- hosts: web
  tasks:
    - name: Download from S3
      s3: aws_access_key="{{ aws_access_key }}" aws_secret_key="{{ aws_secret_key }}" bucket="{{ aws_bucket }}" object="{{ aws_bucket_download }}" dest="{{ current_dir }}/{{ local_download }}" mode=get overwrite=true
    #Change permissions
    - name: Change permissions
      file: path="{{ current_dir }}/{{ local_download }}" owner="{{ deployment_usr }}" group="{{ deployment_grp }}" mode="{{ mode }}"
    #Extract
    - name: mkdir local_extract_dir
      file: path="{{ current_dir }}/{{ local_extract_dir }}" state=directory mode="{{ mode }}"
    - name: Extracting deployment folder
      unarchive: src="{{ current_dir }}/{{ local_download }}" dest="{{ current_dir }}/{{ local_extract_dir }}"
    - name: Give permissions
      file: path="{{ current_dir }}/{{ local_extract_dir }}" owner="{{ deployment_usr }}" group="{{ deployment_grp }}" mode="{{ mode }}"
    #Backup
    #- name: Synchronize
    #  synchronize: src="{{ current_dir }}/{{ local_extract_dir }}"
...
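For the commented-out Synchronize step, this is roughly what I was thinking: a minimal sketch using synchronize in pull mode, so the extracted directory is copied from the remote host back to the control machine. backup_dir is a placeholder variable I have not defined yet, and as far as I know rsync needs to be installed on both machines for synchronize to work:

    #Backup (sketch): with mode=pull the dest path is on the control machine.
    #"backup_dir" is a placeholder, e.g. /root/tools/deployment/backup on the control node.
    - name: Backup extracted files to the control machine
      synchronize: mode=pull src="{{ current_dir }}/{{ local_extract_dir }}/" dest="{{ backup_dir }}/{{ inventory_hostname }}" recursive=yes

If I understand the module correctly, with mode=pull the src is read on the remote host and dest is written on the machine running Ansible, which is the direction I need.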
I also tried running the (cd extracted; find .) | cut -c 3- command, which gives the list of extracted files and folders, but I need help taking a backup of these files from the remote machine.
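In case going through the file list is the better approach, this is the kind of thing I had in mind (only a sketch: the shell/fetch combination and the backup destination path are my own guesses, and I restricted find to regular files because, as far as I know, fetch only copies individual files, not directories):

    #Sketch: build the list of extracted files on the remote host, then fetch each one.
    #The dest path on the control machine is just an example.
    - name: List extracted files
      shell: cd {{ current_dir }}/{{ local_extract_dir }} && find . -type f | cut -c 3-
      register: extracted_files
    - name: Fetch each file back to the control machine
      fetch: src="{{ current_dir }}/{{ local_extract_dir }}/{{ item }}" dest="/root/tools/deployment/backup/{{ inventory_hostname }}/{{ item }}" flat=yes
      with_items: "{{ extracted_files.stdout_lines }}"

Is one of these approaches (synchronize in pull mode vs. fetch over the file list) the recommended way to do this in Ansible?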