Is there any difference between DECIMAL and NUMERIC data types in SQL Server?

When should I use DECIMAL and when NUMERIC?

Dhanapal

6 Answers


They are the same. Numeric is functionally equivalent to decimal.

MSDN: decimal and numeric

Marjan Venema
Guffa
  • Functional equivalence is not the same as equality. Actually, in the instructor's slides of the MS6232A course there's a comment added that they are ALMOST the same. At the same time, no recommendation is made by Microsoft regarding one or the other (however, DECIMAL makes more sense, as it's a standard data type instead of a legacy data type from Sybase). Still wondering what the real difference (behind the scenes) is :-). – vstrien Jul 13 '11 at 12:41
  • @vstrien: The only difference that I can find is that in the SQL-92 standard `decimal` is *exactly* as precise as declared, while `numeric` is *at least* as precise as declared. In SQL Server both are exactly as precise as declared, i.e. it doesn't use the flexibility for `numeric` that the standard allows. – Guffa Jul 13 '11 at 12:57
  • NOTE, however, that SQL Server does not treat them as interchangeable: if, for example, you have a "parent" column in "DECIMAL(18,0)" format, and you try to add a **foreign key** for it that references a column in "NUMERIC(18,0)" format, you will get the error `Column '' is not the same data type as referencing column '.' in foreign key ''`. They have to both be NUMERIC(x,y), or both be DECIMAL(x,y). – Doug_Ivison Jan 02 '14 at 00:30
  • @Guffa: your answer contradicts the SQL2003 standard as quoted at http://stackoverflow.com/a/759606/14731. Specifically, `decimal` is *at least* as precise as declared, whereas `numeric` is *exactly* as precise as declared. – Gili Feb 15 '14 at 16:17
  • @Gili: See my answer to vstrien above. The SQL standard has that distinction, but SQL Server implements them the same. – Guffa Feb 15 '14 at 19:23
  • @Guffa: I get that. My point was that in your explanation you reversed which type is *at least* as precise according to the specification. – Gili Feb 16 '14 at 22:50
  • @Gili: Yes, on review of the original material it seems that you are right that I did swap them around. – Guffa Feb 16 '14 at 23:34
  • @Doug_Ivison True. Additionally: "For the decimal and numeric data types, SQL Server considers each specific combination of precision and scale as a different data type. For example, decimal(5,5) and decimal(5,0) are considered different data types." Source: https://msdn.microsoft.com/en-us/library/ms187746.aspx – Muhamed Huseinbašić Dec 20 '15 at 15:04
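
The foreign-key restriction mentioned in the comments above can be reproduced with a minimal sketch (the table, column, and constraint names here are hypothetical):

```sql
-- Hypothetical parent/child tables with mismatched exact-numeric types
CREATE TABLE Parent (Id DECIMAL(18,0) PRIMARY KEY);
CREATE TABLE Child  (ParentId NUMERIC(18,0));

-- This fails: SQL Server requires the referencing column to have
-- the same data type as the referenced column
ALTER TABLE Child
    ADD CONSTRAINT FK_Child_Parent
    FOREIGN KEY (ParentId) REFERENCES Parent (Id);

-- Changing the child column to DECIMAL(18,0) (or the parent column
-- to NUMERIC(18,0)) allows the constraint to be created
ALTER TABLE Child ALTER COLUMN ParentId DECIMAL(18,0);
ALTER TABLE Child
    ADD CONSTRAINT FK_Child_Parent
    FOREIGN KEY (ParentId) REFERENCES Parent (Id);
```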

This is what the SQL:2003 standard (§6.1 Data Types) says about the two:

 <exact numeric type> ::=
    NUMERIC [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
  | DECIMAL [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
  | DEC [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
  | SMALLINT
  | INTEGER
  | INT
  | BIGINT

 ...

21) NUMERIC specifies the data type
    exact numeric, with the decimal
    precision and scale specified by the
    <precision> and <scale>.

22) DECIMAL specifies the data type
    exact numeric, with the decimal scale
    specified by the <scale> and the
    implementation-defined decimal
    precision equal to or greater than the
    value of the specified <precision>.
Joakim Backman

To my knowledge there is no difference between the NUMERIC and DECIMAL data types. They are synonymous, and either one can be used. Both are exact numeric data types with fixed precision and scale.
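
One way to see that SQL Server nonetheless tracks the two as separate entries in its metadata is to query `sys.types` (a quick sketch, assuming any modern SQL Server instance):

```sql
-- decimal and numeric each have their own system_type_id,
-- even though their behavior is identical
SELECT name, system_type_id, max_length, precision, scale
FROM sys.types
WHERE name IN ('decimal', 'numeric');
```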

Edit:

After speaking to a few colleagues, it may have something to do with DECIMAL being the ANSI SQL standard and NUMERIC being the one Microsoft prefers, as it's more commonly found in programming languages. ...Maybe ;)

kevchadders

Joakim Backman's answer is specific, but this may bring additional clarity to it.

There is a minor difference. As per SQL For Dummies, 8th Edition (2013):

The DECIMAL data type is similar to NUMERIC. ... The difference is that your implementation may specify a precision greater than what you specify — if so, the implementation uses the greater precision. If you do not specify precision or scale, the implementation uses default values, as it does with the NUMERIC type.

It seems that the difference in some implementations of SQL lies in data integrity: DECIMAL allows the implementation to use a precision greater than declared, based on system defaults, whereas NUMERIC does not.
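
On SQL Server specifically, both types use exactly the declared precision; this can be checked with `SQL_VARIANT_PROPERTY` (a small sketch):

```sql
-- Both columns report the declared precision of 10
SELECT
    SQL_VARIANT_PROPERTY(CAST(1.5 AS decimal(10,2)), 'Precision') AS decimal_precision,
    SQL_VARIANT_PROPERTY(CAST(1.5 AS numeric(10,2)), 'Precision') AS numeric_precision;
```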

Alex Firsov

They are synonyms, with no difference at all. Decimal and numeric are exact numeric data types with fixed precision and scale.

-- Declare a variable; decimal(18,8) and numeric(18,8) are stored identically (9 bytes each at this precision)

declare @myvar as decimal(18,8) -- numeric(18,8) works the same way

-- Assign a value

set @myvar = 123456.7

-- Retrieve that value

select @myvar as myVariable
John Smith
Riya

They are exactly the same. When you use them, be consistent: pick one and use it throughout your database.

Faruk