I use the Ansible fetch module to download a large file, say 2 GB. I then got the following error message. Ansible seems to be unable to deal with large files.

    fatal: [x.x.x.x] => failed to parse:
    SUDO-SUCCESS-ucnhswvujwylacnodwyyictqtmrpabxp
    Traceback (most recent call last):
      File "/home/xxx/.ansible/tmp/ansible-tmp-1437624638.74-184884633599028/slurp", line 1167, in <module>
        main()
      File "/home/xxx/.ansible/tmp/ansible-tmp-1437624638.74-184884633599028/slurp", line 67, in main
        data = base64.b64encode(file(source).read())
      File "/usr/lib/python2.7/base64.py", line 53, in b64encode
        encoded = binascii.b2a_base64(s)[:-1]
    MemoryError
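
For reference, the kind of task involved looks roughly like this (the paths are placeholders, not taken from the original question):

    # Illustrative only: a fetch task like this on a multi-GB file can hit the
    # MemoryError above, because the whole file is read and base64-encoded in memory.
    - name: Download large file from remote host
      fetch:
        src: /remote/path/bigfile.bin
        dest: ./downloads/
      become: yes
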
VincentHuang

3 Answers

https://github.com/ansible/ansible/issues/11702

This is an Ansible bug which has been fixed in newer versions.

VincentHuang

Looks like the remote server you're trying to fetch from is running out of memory during the base64 encoding process. Perhaps try the synchronize module instead (which will use rsync); fetch isn't really designed to work with large files.
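
For illustration, a pull-mode synchronize task could look something like this (the src/dest paths are placeholders, not taken from the question):

    # Sketch: copy a large remote file back to the control machine via rsync,
    # which streams the transfer instead of base64-encoding it in memory.
    - name: Pull large file from remote with rsync
      synchronize:
        src: /remote/path/data.zip
        dest: ./data/data.zip
        mode: pull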

nitzmahone

Had the same issue with a DigitalOcean droplet (1 GB RAM). Fixed it by increasing the swap size.

Here is the Ansible task to fetch the data:

    - name: Fetch data from remote
      fetch:
        src: "{{ app_dir }}/data.zip"
        dest: "{{ playbook_dir }}/../data/data.zip"
        flat: yes
      become: no
      tags:
        - download

I used this playbook to do the swap increase with Ansible.
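
The link to that playbook isn't preserved here, but a swap-file setup play typically contains tasks along these lines (the 2G size and /swapfile path are assumptions):

    # Sketch: create and enable a swap file; size and path are illustrative.
    - name: Create a 2G swap file (skipped if it already exists)
      command: fallocate -l 2G /swapfile
      args:
        creates: /swapfile
      become: yes

    - name: Restrict swap file permissions
      file:
        path: /swapfile
        mode: "0600"
      become: yes

    - name: Format and activate the swap file
      shell: mkswap /swapfile && swapon /swapfile
      become: yes
      # Not idempotent as written; guard it with a check on existing swap in real use.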

Levon