I want to understand how numbers (doubles) are represented in bash, and what happens when I print numbers in hexadecimal format with bash's printf.
According to the IEEE 754 standard, a double is represented by 64 bits: 52 bits (13 hex digits) for the significand, 11 bits for the exponent, and 1 bit for the sign.
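Concretely, these three fields can be picked apart in C. A minimal sketch (assuming a platform where double is the IEEE 754 binary64 type, as on any common x86 system):

#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void)
{
    double d = 1.5;
    uint64_t bits;
    memcpy(&bits, &d, sizeof bits);  /* reinterpret the raw 64 bits of the double */

    printf("sign        = %llu\n",      (unsigned long long)(bits >> 63));                /*  1 bit                  */
    printf("exponent    = 0x%llx\n",    (unsigned long long)((bits >> 52) & 0x7FF));      /* 11 bits, biased by 1023 */
    printf("significand = 0x%013llx\n", (unsigned long long)(bits & 0xFFFFFFFFFFFFFULL)); /* 52 bits = 13 hex digits */
    return 0;
}

For d = 1.5 this prints sign 0, exponent 0x3ff and significand 0x8000000000000 (the leading 1 bit of a normal number is implicit and not stored).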
To check this, I wrote a simple C program that prints a hex floating-point constant back with printf.
#include <stdio.h>

int main(int argc, char **argv)
{
    printf("hex read = %40.24a\n", 0x1.000010C6F7A0B5E1Fp+0);
    return 0;
}
Compiling with gcc 4.2.1, I get:
hex read = 0x1.000010c6f7a0b00000000000p+0
From this result I conclude that, as I expected, the significand is defined by 13 hex digits: 000010c6f7a0b.
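A further check (a sketch, assuming the C library prints only the digits needed for an exact representation when %a is given no precision, as glibc and the macOS libc do):

#include <stdio.h>

int main(void)
{
    /* The literal has more hex digits than fit in a double, so the
       compiler rounds it to the nearest binary64 value. */
    double d = 0x1.000010C6F7A0B5E1Fp+0;

    /* Without an explicit precision, %a prints just enough digits to
       represent the stored value exactly -- 13 of them here. */
    printf("%a\n", d);   /* expected: 0x1.000010c6f7a0bp+0 */
    return 0;
}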
Now I turn to bash and use the following script:
#!/bin/bash
# print the argument both as a hex float (%a) and as a decimal (%f)
echo "hex read = 0x$1"
printf "hex =%80.70a\n" "0x$1"
printf "hex -> dec=%80.70f\n" "0x$1"
With GNU bash 3.2.48:
$ bash hex2dec 1.000010C6F7A0B5E1F
hex read = 0x1.000010C6F7A0B5E1F
hex = 0x1.000010c6f7a0b000000000000000000000000000000000000000000000000000000000p+0
hex -> dec= 1.0000009999999999177333620536956004798412322998046875000000000000000000
So everything worked as I expected: 13 hex digits define the significand of the number.
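The decimal line can be cross-checked from C as well; a sketch (assuming bash's printf builtin formats doubles the same way the C library does, this should reproduce the hex -> dec line above, since a double's exact value always has a finite decimal expansion):

#include <stdio.h>

int main(void)
{
    /* Same constant, printed with the same width and precision as the
       bash script uses. */
    printf("hex -> dec=%80.70f\n", 0x1.000010C6F7A0B5E1Fp+0);
    return 0;
}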
With GNU bash 4.1.5:
$ bash hex2dec 1.000010C6F7A0B5E1F
hex read = 0x1.000010C6F7A0B5E1F
hex = 0x8.00008637bd05af10000000000000000000000000000000000000000000000000000000p-3
hex -> dec= 1.0000009999999999993737856418540843606024282053112983703613281250000000
This is not what I expected!
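Purely for comparison (an illustrative sketch, not a statement about what bash does internally): on x86 with glibc, a long double carries a 64-bit significand in the x87 extended format, and %La prints it in an 0xh.hh...hp+d form:

#include <stdio.h>

int main(void)
{
    /* Same literal, but kept as a long double (64 significand bits,
       i.e. 16 hex digits, on x86). */
    printf("%La\n", 0x1.000010C6F7A0B5E1Fp+0L);
    /* On x86-64 with glibc this should print 0x8.00008637bd05af1p-3,
       the same form as the bash 4.1.5 output above. */
    return 0;
}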
Question 1: Why, in GNU bash 4.1.5, is the double's significand represented by 16 hex digits (instead of the 13 that IEEE 754 would suggest)?
Question 2: Why does printf "%a" represent the hex number in a different format in different bash versions (bash 3.2.48: 0x1.hh...hp+d, bash 4.1.5: 0xh.hh...hp+d)? Shouldn't printf follow the same standard in both bash versions, namely the one given at http://pubs.opengroup.org/onlinepubs/009695399/functions/fprintf.html?