I have never worked with currency in a database before. I have done some research, which tells me that I should use the decimal data type. Please see the DDL below:
create table dbCurrency (
    id int not null identity,
    -- decimal(4,2): up to 4 digits in total, 2 of them after the decimal point
    CurrencyValue decimal(4,2),
    primary key (id)
);
insert into dbCurrency (CurrencyValue) values (50);
insert into dbCurrency (CurrencyValue) values (20);
insert into dbCurrency (CurrencyValue) values (10);
insert into dbCurrency (CurrencyValue) values (5);
insert into dbCurrency (CurrencyValue) values (2);
insert into dbCurrency (CurrencyValue) values (1);
insert into dbCurrency (CurrencyValue) values (0.5);
insert into dbCurrency (CurrencyValue) values (0.2);
insert into dbCurrency (CurrencyValue) values (0.1);
insert into dbCurrency (CurrencyValue) values (0.05);
insert into dbCurrency (CurrencyValue) values (0.02);
insert into dbCurrency (CurrencyValue) values (0.01);
This stores all of the UK note and coin denominations. Is this the correct way to do it? The reason I ask is that someone I spoke to suggested the data type should be decimal(5,2) rather than decimal(4,2); he knew I was storing values up to £50.
Also, say I have three C# decimal variables:
decimal first;
decimal second;
decimal third;
Say I wanted to divide the first decimal by the second to produce the third. Would the result also be a C# decimal data type? The first decimal is always larger than the second, and the result could be a fraction. The first and second decimals are monetary values.
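For example, this is roughly the kind of calculation I mean (the £50 and £20 values here are just made up for illustration):

decimal first = 50.00m;         // e.g. £50, always the larger of the two
decimal second = 20.00m;        // e.g. £20
decimal third = first / second; // here the result would be 2.5, i.e. a fraction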