I have a few text data files whose lines each contain an index, a value, and the value's error, like these:
...
6 90.3785794742981 0.0952997386139722
40 1028.46336161948 4.41798447319325
...
The third column is the error on the value in the second column. I would like to write a script that prints them in a more human-readable format, that is, to print each value with the right number of significant digits and its error on the last two digits in parentheses, like this:
...
6 90.379(95)
40 1028.5(4.4)
...
Using regular expressions to extract the digits wouldn't work well, both because of the difficulty of handling the decimal point and because it would truncate the numbers rather than round them, so I thought I'd rather retrieve their magnitudes with printf and do the arithmetic with bc.
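The core of the idea is that printf's %e format exposes the decimal exponent, which a parameter expansion can then strip out. A minimal sketch (assuming the C locale, so the decimal separator is a dot):

```shell
v=1028.46336161948
ov=$(printf '%e' "$v")   # "1.028463e+03"
# ${ov/*e/} deletes everything up to and including the "e",
# leaving only the exponent part
echo "${ov/*e/}"         # "+03"
```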
The code I wrote for this is as follows:

#!/bin/bash
while read -r a v verr
do
    # exponents of the value and of the error, extracted from printf %e
    ov=$(printf '%e' "$v")
    ov=${ov/*e/}
    overr=$(printf '%e' "$verr")
    overr=${overr/*e/}
    # shift so that the error keeps two significant digits
    dov=$((1-overr))
    # round the value at that position
    v=$(echo "scale=0;$v*10^($dov)" | bc -l)
    v=$(printf '%.0f' "$v")
    # shift back and print
    printf "$a %f(%.0f)\n" $(echo "$v*10^(-($dov))" | bc -l) $(echo "$verr*10^($dov)" | bc -l)
done < myfile.txt
What I get is:
6 90.379000(95)
40 1028.500000(44)
My code almost works, except for the appearance of those trailing zeroes.
How do I get rid of them? Just cutting them would not be good, because their number is not fixed, and blindly stripping all trailing zeroes would give wrong results whenever the last significant digit actually is a zero.