I've built an SQL migration script that replaces text in a MySQL dump file. The replacement would be easy with sed alone, but the dump contains PHP-serialized data, so the stored string lengths have to be corrected as well. The code below works, but the awk command is very expensive. A quick Google search shows many developers hitting the same runtime performance issue, and one article I found breaks down how simple changes can make a large difference. If anyone knows how I can speed this command up, or can suggest faster alternatives, that would be amazing :)
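For anyone unfamiliar with the serialization issue, here is a minimal illustration of why a plain search-and-replace is not enough (the domains below are made-up placeholders): PHP-serialized strings store their byte length, so the s:N: prefix must be corrected whenever the replacement changes the string's length.

# A serialized value as it might appear in the dump (22 bytes)
row='s:22:"http://www.example.com";'

# A naive replacement rewrites the string but not the length prefix,
# so PHP's unserialize() would reject the value afterwards:
echo "$row" | sed 's/www\.example\.com/localhost.test/'
# prints: s:22:"http://localhost.test";   (the real length is 21, not 22)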
#!/usr/bin/env bash
set -e

SQL_FILE="$1"       # path to the MySQL dump
maindomain="$2"     # domain currently stored in the dump
localdomain="$3"    # replacement domain

# 1) put each serialized string on its own line, 2) let awk swap the domain and
#    fix the s:N: length prefix, 3) undo the split, 4) replace the remaining
#    (non-serialized) occurrences of the domain
sed 's/;s:/;\ns:/g' "$SQL_FILE" | \
awk -F'"' -v md="$maindomain" -v ld="$localdomain" '
    $0 ~ ("s:.+" md) {
        sub(md, ld)                       # swap the domain inside the string
        n = length($2) - 1                # recompute the serialized byte length
        sub(/:[[:digit:]]+:/, ":" n ":")  # patch the s:N: prefix to match
    }
    1' | \
sed -e ':a' -e 'N' -e '$!ba' -e 's/;\ns:/;s:/g' | \
sed "s/$maindomain/$localdomain/g" > "$SQL_FILE.txt"
mv "$SQL_FILE.txt" "$SQL_FILE"