Running Mac OS X 10.6.2, I am seeing some extremely odd behavior from a script that repeatedly calls curl -o (file) and then greps the downloaded file for a certain string. Occasionally grep returns 1 (not found) when I'd expected 0 (found). Here's the script:
# do this 1000 times
for ii in `cat count.txt`; do
    rm -f a.txt e.txt
    curl --fail --stderr e.txt -j -o a.txt -s "$MYURL"
    if [ -e a.txt ] ; then
        # Occasionally a.txt doesn't finish writing on time
        grep "login-url" a.txt >/dev/null
        LASTERR=$?
        echo "$LASTERR is lasterr grep 1"
        if [ "$LASTERR" -ne 0 ] ; then
            cp a.txt anomaly.txt
            sleep 1
            echo "Sleeping..."
        fi
        grep -q "login-url" a.txt
        LASTERR=$?
        echo "$LASTERR is lasterr grep 2"
        if [ "$LASTERR" -ne 0 ] ; then
            echo "Dying..."
            exit 1
        fi
        # This is what I actually want to do
        grep "login-url" a.txt >> out.txt
    fi
done
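To narrow down whether a.txt itself is changing between the two greps, one option is to record the file's size and checksum alongside each grep. This is a minimal diagnostic sketch, not part of the original script; the helper name check_file is made up for illustration.

```shell
# Hypothetical helper: grep a file for a pattern while logging its size and
# checksum, so two calls in a row reveal whether the file content changed.
check_file() {
    file=$1
    pattern=$2
    size=$(wc -c < "$file" | tr -d ' ')          # byte count of the file
    sum=$(cksum "$file" | awk '{print $1}')      # POSIX CRC checksum
    if grep -q "$pattern" "$file"; then
        echo "found size=$size cksum=$sum"
        return 0
    else
        echo "missing size=$size cksum=$sum"
        return 1
    fi
}
```

Calling check_file a.txt login-url in place of each bare grep would show, on an anomalous iteration, whether the "missing" result comes with a smaller size or different checksum (i.e., a partially written file) or the same bytes both times.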
What I see is something like this (note that "grep 2" occasionally reports a different result from "grep 1" for the same file):
0 is lasterr grep 1
0 is lasterr grep 2
...
0 is lasterr grep 1
0 is lasterr grep 2
0 is lasterr grep 1
1 is lasterr grep 2
In other words, a.txt is changing (as far as grep can tell) between the two greps!
Has anyone else seen the like?
I've noticed that if I add a "sleep 1" after the curl call, the issue goes away. So is there a problem with reusing the same file name over and over, or is curl returning before it has finished writing the file, or...?
This is not a critical issue because of the "sleep 1" workaround, but it makes me nervous that I don't understand the behavior.
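One way to sidestep the filename-reuse question entirely is to write each response to a fresh temp file from mktemp and to check curl's exit status before trusting the file at all. This is only a sketch of that idea, not the original script; fetch_and_grep is a hypothetical name, and it assumes the same $MYURL-style argument.

```shell
# Hypothetical workaround: fetch into a unique temp file, only grep it if
# curl itself reported success, then clean up. Returns grep's status (0 if
# "login-url" was found and appended to out.txt), or 1 if curl failed.
fetch_and_grep() {
    url=$1
    tmp=$(mktemp /tmp/fetch.XXXXXX) || return 1
    if curl --fail -s -o "$tmp" "$url"; then
        grep "login-url" "$tmp" >> out.txt
        status=$?
    else
        status=1    # curl failed; don't trust whatever is in the file
    fi
    rm -f "$tmp"
    return $status
}
```

Because every iteration gets its own file and the file is only read after curl exits successfully, this would distinguish a filename-reuse problem from curl genuinely returning before the data is complete.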