You can use the exec command to redirect output from the rest of the script, just as you would on an individual command. So adding something like

exec >>~/testbackupnoon.log 2>&1

at the beginning of the script will append all output (both stdout and stderr) to testbackupnoon.log in the current user's home directory. If you want the output sent to both the file and the terminal, use something like

exec > >(tee -a ~/testbackupnoon.log) 2>&1

instead.
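Here's a minimal sketch of the append-only variant, using a throwaway log path (/tmp/exec_demo.log is just for illustration, not part of your script):

```shell
log=/tmp/exec_demo.log
rm -f "$log"

(
  # from this point on, everything in this subshell is appended to the log
  exec >>"$log" 2>&1
  echo "normal output"
  echo "an error message" >&2
)

# both lines ended up in the file
grep -c "" "$log"    # counts the lines now in the log
```

The subshell keeps the redirection from affecting the rest of the demo; in your real script, the exec goes at the top level so it applies to everything that follows.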
Assuming you're going to be appending to the file (as I suggest above), it would probably also be good to log some general information, like timestamps for when the operation started and finished, and a blank line between runs. I'd also add some error checking in the body of the script, so that if, for example, the copy to S3 fails, it doesn't go ahead and delete the local backups. So, something like this (warning: this has NOT been properly tested):
#!/bin/zsh
# start logging
exec > >(tee -a ~/testbackupnoon.log) 2>&1
# %n prints a blank line between runs, then the start timestamp
date -u +"%n%Y-%m-%dT%H:%M:%SZ backup to S3 started" >&2
/usr/local/bin/aws s3 cp /Library/FileMaker\ Server/Data/Backups/S3/noon_db_* s3://testbackupnoon || {
    echo "Copy to S3 failed; local backups will be left in place" >&2
    exit 1
}
# the copy succeeded, so it's safe to remove the local backups
rm -rf /Library/FileMaker\ Server/Data/Backups/S3/*
date -u +"%Y-%m-%dT%H:%M:%SZ backup to S3 completed" >&2
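The important piece is the || { ... } after the copy: if the copy command exits nonzero, the block runs and exits the script before the rm is reached. A sketch of the same pattern, with false standing in for a failing copy command (this stand-in is just for illustration):

```shell
(
  false || {                     # the "copy" fails here
    echo "copy failed; keeping local backups" >&2
    exit 1                       # leave before any cleanup runs
  }
  echo "cleanup runs only when the copy succeeded"
)
echo "exit status was $?"
```

Because the block ends with exit 1, the cleanup line is never reached on failure, and the caller can see the nonzero status.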