I currently have something like this. The function below is part of a Bash script, and inside it I call many custom functions, none of them complicated; for example, `lenght` just checks the file name against string rules. Every function I add makes the script much slower. Tested on 300 files: a simple `find` that just echoes each file name takes less than a second, but with all the functions enabled it takes 0h:0m:11s. I know this is not much information, but still: how can I make it faster?
On the live system I have to loop over 20 million files.
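To see where those 11 seconds go, it helps to re-measure the loop with the per-file work stubbed out and then re-enable one function at a time. A minimal sketch of that baseline measurement (the test directory and file names here are made up):

```shell
#!/usr/bin/env bash
# Hypothetical benchmark: time the bare find/read loop with no per-file work,
# then compare against runs where each validation function is re-enabled.
dir=$(mktemp -d)   # isolated scratch directory
for i in $(seq 1 300); do
    touch "$dir/img$(printf '%03d' "$i")_a_b.tif"
done

# Baseline: the loop itself, with a no-op where the functions would run.
time find "$dir" -type f -print0 | while IFS= read -r -d '' FILE
do
    : "$FILE"    # placeholder for lenght, multi, removeZeros, ...
done
```

Whichever function makes the timing jump the most is the one worth optimizing first.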
function initDatabase {
    dir="$*"
    # check dir is not empty
    if [ -n "$dir" ]
    then
        find "$dir" -type f -print0 | while IFS= read -r -d '' FILE
        do
            error=0
            out=''
            # FUNCTION validates the file name
            out=$(lenght)
            if [ -n "$out" ]
            then
                echo -e "${NC}${BLUE}File name is invalid"
                echo -e "${RED}$out${NC}"
                echo "error" >> "$LOG_FILE_NAME"
                echo "$out" >> "$LOG_FILE_NAME"
                error=1
            fi
            if [ "$error" -eq 0 ]
            then
                # get file name and directory
                f=${FILE##*/}
                f_dir="${FILE%/*}"
                changed=$(stat -c%Y "$FILE")
                ## checks if file is a pyramid TIFF
                pyramid="false"
                out="$(multi "$FILE")"
                if [ "$out" = "1" ]; then pyramid="true"; fi
                # FUNCTION removes zeros from the beginning
                prop2=$(removeZeros "$(echo "$f" | cut -d'_' -f1 | cut -c4-)")
                # get part count
                part_count=$(grep -o "_" <<<"$f" | wc -l)
            fi
        done
    else
        echo "ERROR: no directory given"
    fi
}
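One likely cost in the loop above is that every `$( ... )` command substitution, and every external tool like `cut`, `grep`, and `wc`, forks at least one process per file; at 20 million files that dominates everything. The string work can be done in-process with Bash parameter expansion instead. A sketch, assuming file names of the form `imgNNN_..._...`:

```shell
#!/usr/bin/env bash
f="img007_scan_page.tif"   # hypothetical file name

# First '_'-separated field, characters from position 4 onward
# (in-process equivalent of: echo "$f" | cut -d'_' -f1 | cut -c4-)
first=${f%%_*}           # "img007"
prop_raw=${first:3}      # "007"

# Strip leading zeros without a subshell
# (assumes the field is numeric, as removeZeros presumably does)
prop2=$((10#$prop_raw))

# Count '_' occurrences without grep | wc
u=${f//[!_]/}            # delete every character that is not '_'
part_count=${#u}

echo "$prop2 $part_count"
```

Each of these replaces two or three forked processes per file with pure shell expansion, which is the usual first step when a `find | while read` loop is too slow.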