I am trying to copy large sets of files (hundreds of them) to my remote server. However, with the 'copy' module this task takes a considerably long time. After searching a bit, I understood that 'synchronize' is a good fit for this. Unfortunately, my current remote servers do not have 'rsync' installed, so I am not able to use the 'synchronize' module either.

As a workaround, I wanted to zip the folder on the Ansible host and then transfer it to the remote server using the 'unarchive' module.

- name: Archive the folder
  shell: zip -r <dest-zip-path> <path-to-folder>
  delegate_to: localhost

However, when doing this I get the following error: "module_stderr": "sudo: a password is required\n"

Is there a simpler way to zip the folder locally on the Ansible host before transferring?

akash

2 Answers


Based on the solution posted by Zeitounator, this was the Ansible code I used to solve the issue:

- name: Archive the files
  archive:
    path: <path-to-folder>
    dest: <dest-zip-path>
    format: zip
  delegate_to: localhost
  become: false
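
To complete the workaround described in the question, the locally created zip can then be pushed and extracted on the remote server with the 'unarchive' module. A minimal sketch, reusing the question's placeholder paths ('<remote-folder-path>' is an assumed extraction directory, not from the original):

- name: Transfer and extract the archive on the remote server
  unarchive:
    src: <dest-zip-path>        # zip created locally by the archive task
    dest: <remote-folder-path>  # assumed destination directory on the remote host

By default, 'unarchive' copies 'src' from the controller to the remote host before extracting, so no separate 'copy' task is needed.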
akash
  • I'm in a similar situation: I need to zip a file before uploading it to AWS S3... I understand the task, but I'm confused about the need for delegate_to. In my case, my inventory file already contains only [localhost]; would I still need the delegate_to? – FiNaR Feb 28 '23 at 05:10

You are probably using become: true on your play. This also applies when delegating to localhost, but there it requires a sudo password on your local machine.

Since you probably don't need privilege escalation for this task, simply apply become: false to it. Otherwise, you will have to configure privilege escalation on localhost or provide a become password.
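
In practice, that looks like the following; sketched here with the question's placeholder paths:

- name: Archive the folder
  shell: zip -r <dest-zip-path> <path-to-folder>
  delegate_to: localhost
  become: false   # do not escalate privileges for this local task

Alternatively, keep become enabled and supply the password at run time, e.g. with ansible-playbook --ask-become-pass.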

Moreover, you should consider using the 'archive' module rather than 'shell'.

Zeitounator