Assume we have limited free space on a Linux server. The task is to create an archive of a folder (for example /var/www), encrypt it, and upload it to a remote location, keeping the original unencrypted file in place. Everything works fine for relatively small files, but once the archive exceeds 50% of the free space, there is not enough room left for the encrypted copy. The upload is done via the provider's SDK and it is not possible to pipe the output of gpg directly into the upload. All of the above is done by a self-written bash script.
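For context, the streaming approach that would normally sidestep the space problem is ruled out here, because the upload tool only accepts a path to an existing file. (`provider-cli` and the recipient key below are placeholders, not the actual SDK tool or key.)

```bash
# Not possible in this setup: the uploader cannot read from stdin.
tar -czf - -C / var/www \
  | gpg --encrypt --recipient backup@example.com \
  | provider-cli upload -
```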
Therefore, the actions step by step (with example numbers; a sketch of the script follows the list):
- Free space in /opt/ is (for example) 100 GB
- Compress /var/www into /opt/www.tar.gz (file size: 60 GB)
- Somehow produce /opt/www.tar.gz.gpg (it would need another 60 GB or more, which does not fit)
- Upload /opt/www.tar.gz.gpg using the provider's CLI tool
- After all of the above is done, /opt/www.tar.gz should remain on disk
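A minimal sketch of the current script, under the assumption of public-key gpg encryption; `provider-cli` and the recipient key are placeholders for the real SDK tool and key:

```bash
#!/usr/bin/env bash
set -euo pipefail

SRC_DIR=var/www                      # archived relative to /
ARCHIVE=/opt/www.tar.gz              # kept after the run (~60 GB in the example)
ENCRYPTED="$ARCHIVE.gpg"             # temporary copy; this is what no longer fits
RECIPIENT=backup@example.com         # placeholder GPG key ID

# Step 1: compress the source tree.
tar -czf "$ARCHIVE" -C / "$SRC_DIR"

# Step 2: encrypt; this needs roughly another 60 GB on /opt,
# which fails once the archive exceeds half of the free space.
gpg --encrypt --recipient "$RECIPIENT" --output "$ENCRYPTED" "$ARCHIVE"

# Step 3: upload via the provider's CLI (placeholder command).
provider-cli upload "$ENCRYPTED"

# Step 4: remove the encrypted copy; the plain archive stays in place.
rm -f "$ENCRYPTED"
```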
Is there a solution to this problem?