I am trying to find the biggest files in the /export/home directory and add up (sum) their sizes.
Script:
#!/bin/bash
filename=hostnames
> export_home.log                         # truncate the log before each run
while read -r line
do
    hostname=${line//\"}                  # strip any surrounding quotes
    echo "$hostname:" >> export_home.log
    ssh -n -t -t "$hostname" "sudo find /export/home -type f -mtime +180 -size +250000k -exec du -hsk {} \;" >> export_home.log
done < "$filename"
Example output:
server-34:
210M /export/home/john/142933-02/failsafe_archive
178M /export/home/john/137137-09/failsafe_archive
server-35:
server-36:
142M /export/home/marc/bdb/db-5.2.36.tar
446M /export/home/marc/sunfreeware/git/git-1.7.6-sol10-x86-local
1.4G /export/home/marc/mysql/mysql-5.5.16-solaris10-x86_64.tar
1.1G /export/home/marc/mysql/mysql-5.5.16-solaris10-i386.tar
server-37:
The script does exactly what it should, but how do I also get the total size of ALL files found, based on the results collected in export_home.log?
I am planning to adjust this script to also find the total size of log directories and local backup directories, to get better insight into combined disk usage across multiple servers. I am just not sure how to compute such a total.
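One possible approach (a sketch, with an assumed change to the remote command): if the remote `du -hsk` is switched to plain `du -sk`, every result line starts with a numeric kilobyte count while the `server-NN:` header lines do not, so awk can sum the numbers straight from the log:

```shell
# Sketch: sum the per-file sizes recorded in export_home.log.
# Assumes the remote find uses `du -sk` (plain KB), so matching lines
# look like "215040  /export/home/john/...". Header lines ("server-34:")
# do not start with a digit and are skipped by the pattern.
awk '/^[0-9]/ { kb += $1 }
     END { printf "Total: %d KB (%.1f GB)\n", kb, kb/1024/1024 }' export_home.log
```

The same pattern also works per host if run on each server's section separately; the key point is summing in one consistent unit (KB) instead of trying to parse human-readable sizes like "1.4G".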