
I am fetching the following details of columns from SQL Server:

Size
Precision
Scale

But I have noticed that in the case of int I am getting 10 as the precision. When I did some research, I couldn't find anything saying that the int datatype has a precision of 10, either in the documentation or in SQL Server Management Studio.

So I don't know where the 10 is coming from as the precision for the int data type.

Table:

(screenshot of the table)

Screenshot:

(screenshot of the output)

Code:

// The "Columns" schema collection takes four restrictions:
// catalog, schema, table and column. The fourth is left null
// so that every column of the table is returned.
string[] columnRestrictions = new string[4];
columnRestrictions[0] = "MyDb";
columnRestrictions[1] = "dbo";
columnRestrictions[2] = "Employee";

using (SqlConnection con = new SqlConnection("MyConnectionString"))
{
    con.Open();
    var columns = con.GetSchema("Columns", columnRestrictions).AsEnumerable()
        .Select(t => new
        {
            Name = t.Field<string>("COLUMN_NAME"),
            Datatype = t.Field<string>("DATA_TYPE"),
            IsNullable = t.Field<string>("IS_NULLABLE"),
            Size = t.Field<int?>("CHARACTER_MAXIMUM_LENGTH"),
            NumericPrecision = t.Field<int?>("NUMERIC_PRECISION"),
            NumericScale = t.Field<int?>("NUMERIC_SCALE")
        }).ToList();
}
I Love Stackoverflow
  • If you look at the [int datatype](https://msdn.microsoft.com/en-GB/library/ms187745.aspx) you'll see that the largest values it can store require (up to) 10 decimal digits to represent. So if something's insisting on giving a precision for an `int`, it's not an unreasonable answer to provide. – Damien_The_Unbeliever Dec 12 '16 at 14:14
  • [*Precision is the number of digits in a number*](https://msdn.microsoft.com/en-GB/library/ms190476.aspx); the maximum number of digits in a 32-bit int is 10 (and 19 for a bigint). – Alex K. Dec 12 '16 at 14:15
  • http://stackoverflow.com/a/28836543/284240 – Tim Schmelter Dec 12 '16 at 14:16
  • @Damien_The_Unbeliever: So my output of 10 is correct in the case of the int datatype? – I Love Stackoverflow Dec 12 '16 at 14:18
  • @TimSchmelter: Sir, I would like to ask why you deleted your answer on this question of mine (http://stackoverflow.com/questions/41062636/get-columns-size-along-with-column-name-and-datatype) although it was correct. I was about to mark it as the accepted answer, but you deleted it. – I Love Stackoverflow Dec 12 '16 at 14:22
  • @Learning: [this](http://stackoverflow.com/a/41100028/284240)? Well, I had no time to investigate further and thought it wouldn't be helpful. – Tim Schmelter Dec 12 '16 at 14:25
  • @TimSchmelter: Yes, it was helpful, because you have already got an upvote and now you have my vote too. Previously you also deleted an answer on my question which was correct; I was about to upvote and accept it, but you deleted it before that, so you lost my valuable vote. Hehe :) – I Love Stackoverflow Dec 12 '16 at 14:27
  • @Damien_The_Unbeliever: You guys are awesome in answers and in comments too. Thank you so much, sir, for providing an answer in a comment. Thanks once again. – I Love Stackoverflow Dec 12 '16 at 17:21
  • @AlexK: Thank you so much, Alex sir. You guys are really awesome, because you can help anyone in your comments too. Thanks once again :) – I Love Stackoverflow Dec 12 '16 at 17:22

1 Answer


Precision refers to the number of significant decimal digits a number can represent.

The int datatype, like the int datatype in C#, ranges from -2147483648 to 2147483647, so its values need up to 10 significant decimal digits.

The precision of int is therefore always 10.
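This is easy to verify in C#: counting the decimal digits of `int.MaxValue` (and of `int.MinValue`, ignoring the sign) gives 10 in both cases. A minimal sketch:

```csharp
using System;

class IntPrecisionDemo
{
    static void Main()
    {
        // 2147483647 -> "2147483647" -> 10 digits
        int maxDigits = int.MaxValue.ToString().Length;

        // -2147483648 -> "2147483648" (sign stripped) -> 10 digits
        int minDigits = int.MinValue.ToString().TrimStart('-').Length;

        Console.WriteLine(maxDigits); // 10
        Console.WriteLine(minDigits); // 10
    }
}
```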

Jon Hanna