
I am doing the following in C#:

obj.value = decimal_value / 100;

where obj.value is a decimal property on the model, and

decimal_value is a variable holding a decimal value.

C# code

if (member["LOADINGS"] != "")
{
    decimal loading_temp = Convert.ToDecimal(member["LOADINGS"]);
    prem.loadings = loading_temp / 100m;
}

When debugging, prem.loadings gets the correct value 0.0952, but when it is saved to SQL Server it shows as 0.09000.

The loadings variable in the model:

public decimal? loadings { get; set; }

For example:

9.52 / 100 gives 0.0952, but when it is stored in SQL Server in a column of datatype decimal(18,5), the result is 0.0900.

Any idea why this happens?
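For what it's worth, a minimal console check (a sketch using the literal values from the question; the class and variable names are only illustrative) suggests the division itself keeps the full scale in C#, so the value appears to be lost on the way to the database rather than in the division:

using System;

class ScaleCheck
{
    static void Main()
    {
        // Same arithmetic as in the question: 9.52 / 100
        decimal loading_temp = 9.52m;
        decimal loadings = loading_temp / 100m;

        Console.WriteLine(loadings);            // prints 0.0952
        Console.WriteLine(loadings == 0.0952m); // prints True
    }
}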

Edit

Saving to the database:

premium prem = new premium();

if (member["LOADINGS"] != "")
{
    decimal loading_temp = Convert.ToDecimal(member["LOADINGS"]);
    prem.loadings = loading_temp / 100m;
}

db.premium.Add(prem);
db.SaveChanges();
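If db is an Entity Framework 6 DbContext (an assumption; the question does not say which data access layer is used), one way to see exactly what is sent to SQL Server is to log the generated commands before saving. A minimal sketch, reusing the names from the question:

// Sketch only: assumes EF6, where DbContext.Database.Log takes an Action<string>.
// Every generated command and its parameter values (including the precision and
// scale of the loadings parameter) are written to the console.
db.Database.Log = Console.WriteLine;

db.premium.Add(prem);
db.SaveChanges();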
Sachu
  • Can you show us how you store this in SQL Server? Any code maybe? – GuidoG Jun 05 '18 at 09:31
  • @GuidoG edited the question – Sachu Jun 06 '18 at 03:15
  • We need much more information. What type is prem.loadings? What is db? What does db.premium.Add do? Remember we cannot see your code. What you should do is set a breakpoint on `prem.loadings = loading_temp / 100m;` and keep stepping in until you either see the value changing or see it sent to your database, and capture the exact command it sends. – GuidoG Jun 06 '18 at 06:44

1 Answer


In SQL Server:

declare @v decimal(18,5)

select @v = 9.52/100

select @v

returns 0.09520
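That suggests the truncation happens before the value reaches SQL Server. If the question's db is an EF6 Code First DbContext (an assumption), decimal properties map to decimal(18,2) by default, so 0.0952 would be cut down to 0.09 when the parameter is sent. A sketch of overriding that mapping inside the DbContext (the premium class and loadings property are taken from the question):

// Sketch only: assumes EF6 Code First; this method goes inside the DbContext class.
// It maps premium.loadings to decimal(18,5) so the full scale survives the save.
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Entity<premium>()
                .Property(p => p.loadings)
                .HasPrecision(18, 5);

    base.OnModelCreating(modelBuilder);
}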

MJH