I have a table MACRecord in MySQL. It has three columns declared as TINYINT(4) (RSSI1, RSSI2, RSSI3), each with a default value of 0. I use them to store signed negative values, e.g. -90, -84, etc. I am trying to understand how much space each number actually takes up in its column.
I have C++ code that prints the length of each column in bytes, as follows:
fields = mysql_fetch_fields(result);   /* column metadata (for the names) */
num_fields = mysql_num_fields(result);
row = mysql_fetch_row(result);         /* lengths describe the current row */
lengths = mysql_fetch_lengths(result);
num_rows = mysql_num_rows(result);

for (i = 3; i < num_fields; i++)
{
    if (strstr(fields[i].name, "RSSI") != NULL)
    {
        printf("Column %u is %lu bytes in length.\n",
               i, lengths[i]);
    }
}
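For context, result comes from a plain query over the whole table. Trimmed down (connection setup and most error handling removed, and conn is just my connection handle), it looks roughly like this:
/* Sketch: the real code checks the connection and all return values. */
if (mysql_query(conn, "SELECT * FROM MACRecord"))
{
    fprintf(stderr, "SELECT failed: %s\n", mysql_error(conn));
}
result = mysql_store_result(conn);   /* buffered, so mysql_num_rows() works */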
So now, for example, if columns 3, 4 and 5 (my RSSI columns) contain values such as -90, 0 (the default) and -83, I get the following output:
Column 3 is 3 bytes in length.
Column 4 is 1 bytes in length.
Column 5 is 3 bytes in length.
I don't understand why a value such as -90 in one of my RSSI columns, which were declared as TINYINT(4) and can therefore only hold 1-byte values, is reported as taking 3 bytes of data. The 1-byte length for the default value of 0 makes sense. If I DESCRIBE my table I can see that the data type is TINYINT, yet the values are still reported as 3 bytes in length:
| Field | Type       | Null | Key | Default | Extra |
| RSSI1 | tinyint(4) | YES  |     | 0       |       |
| RSSI2 | tinyint(4) | YES  |     | 0       |       |
| RSSI3 | tinyint(4) | YES  |     | 0       |       |
I am a bit confused. Any help will be appreciated.
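In case it matters, the rows are written with a plain INSERT. Simplified (the real statement is built at runtime from the measured values, and RSSI2 is left to its default of 0 in this example), it is along these lines:
/* Simplified example using the values from above; the real statement
   is assembled from live RSSI measurements. */
if (mysql_query(conn,
        "INSERT INTO MACRecord (RSSI1, RSSI3) VALUES (-90, -83)"))
{
    fprintf(stderr, "INSERT failed: %s\n", mysql_error(conn));
}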