Aside from wrapping my literal in a CONVERT function, is there a way to specify that I want e.g. 12345 represented as a BIGINT and not an INT? In C#, I could specify 12345L, but I'm unaware of equivalent functionality in T-SQL.
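For reference, a minimal sketch of the workaround the question refers to (wrapping the literal in CONVERT, or equivalently CAST):
select convert(bigint, 12345);
select cast(12345 as bigint);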
16
-
You can CAST it. Other than the N'value' wrapping you can use for unicode, I'm pretty sure there are no wrappers like that for numerical values. – dfundako Aug 19 '16 at 13:51
-
@dfundako see wraps for other types such as float and money [here](https://msdn.microsoft.com/en-us/library/ms179899.aspx). – Tim Lehner Aug 19 '16 at 15:46
-
Out of curiosity, in what context are you concerned about a literal being interpreted as an int instead of a bigint? From a sargability standpoint, if you try to do something like `where [bigIntColumn] = 123`, it will implicitly convert the literal 123 into a bigint and then do the comparison. – Ben Thul Aug 19 '16 at 17:31
-
I just ran into integer division: `select 21859000000 / 1000000000` "should" yield `21` but it's `21.859` unless you cast both as bigint or declare them as bigint variables – gknicker Sep 01 '22 at 23:37
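A short sketch of the behavior described in that last comment, assuming SQL Server's default typing of literals (a literal too large for int is typed as numeric, so the division is no longer integer division):
select 21859000000 / 1000000000;                                   -- 21.859000..., because 21859000000 is typed as numeric
select cast(21859000000 as bigint) / cast(1000000000 as bigint);   -- 21, bigint integer division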
4 Answers
13
You have to explicitly declare or cast to a bigint.
While there are prefixes and symbols for some other datatypes (binary, float, money, etc.), I don't think there is a way to do this in T-SQL for bigint that doesn't involve either explicitly declaring the bigint or casting/converting to it.
In fact, at least for a select...into operation, SQL Server will use a numeric (decimal) datatype once your integer literals go beyond what can be stored in an int.
select 2000000000 as col into test;
select * from information_schema.columns where table_name = 'test';
-- DATA_TYPE: int
drop table test;
select 3000000000 as col into test;
select * from information_schema.columns where table_name = 'test';
-- DATA_TYPE: numeric
drop table test;
select cast(3000000000 as bigint) as col into test;
select * from information_schema.columns where table_name = 'test';
-- DATA_TYPE: bigint
drop table test;
declare @col bigint = 3000000000;
select @col as col into test;
select * from information_schema.columns where table_name = 'test';
-- DATA_TYPE: bigint
drop table test;
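A related sketch (my addition, not part of the answer): the same typing rules bite in arithmetic, where int-typed literals can overflow unless at least one operand is explicitly made a bigint:
select 2000000000 + 2000000000;                   -- arithmetic overflow: both operands are typed as int
select cast(2000000000 as bigint) + 2000000000;   -- 4000000000, the int operand is promoted to bigint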

Tim Lehner
7
select cast(1 as bigint)
In other words, you simply cast your value. What would be the purpose?
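If you want to confirm what type an expression ends up as, one option (my suggestion, using the built-in SQL_VARIANT_PROPERTY function) is:
select sql_variant_property(1, 'BaseType');                  -- int
select sql_variant_property(cast(1 as bigint), 'BaseType');  -- bigint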

Cetin Basoz
-
I can think of a dozen use cases. But here's a simple one: SELECT POWER(256,5) will fail unless I explicitly CONVERT (or CAST, per your example) 256 to a bigint, as in SELECT POWER(CONVERT(bigint, 256), 5) (sketched below, after these comments). – Jason Aug 19 '16 at 16:16
-
I see, the T-SQL parser is not smart enough. One way would be: select power(256.0,5); – Cetin Basoz Aug 19 '16 at 19:51
-
Casting will make the column nullable, which might be bad in some cases. – Preza8 May 16 '19 at 11:17
-
@CetinBasoz Like when it is supposed to become a primary key. As far as I am aware, there is no satisfactory solution to this. The workaround I went with was altering the column later. But that is also only really possible if there is no data in the table, so... – Preza8 May 16 '19 at 13:04
-
@Preza8, then you need to check for uniqueness in the first place. Being null doesn't prevent being a primary key. If your primary key column wouldn't accept nulls then you could simply add a criterion such as "... IS NOT NULL". – Cetin Basoz May 16 '19 at 13:10
-
That was not my point. The data do not contain any null values, but the type system and the available functions do not make it possible in that case to set the type of such a column to NOT NULL. I got error 8111. Maybe it is because of CLUSTERED? I don't know, and I already chose a different way. – Preza8 May 16 '19 at 15:00
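A sketch of the POWER example from the comments above; the overflow happens because POWER returns the type of its first argument. The power(256.0, 5) variant suggested in the comments works on the same principle, with a numeric first argument:
select power(256, 5);                    -- fails with an arithmetic overflow: 256 is typed as int
select power(convert(bigint, 256), 5);   -- 1099511627776, returned as bigint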
0
You can use a decimal or hex literal, like:
declare @variable bigint
set @variable = 0x7FFFFFFFFFFFFFFF
select @variable
set @variable = 9223372036854775807
select @variable
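A hedged note on why this works: neither literal above is itself typed as bigint (the hex constant is a binary string and the oversized decimal literal is numeric); both are implicitly converted when assigned to the bigint variable. One way to check, again using SQL_VARIANT_PROPERTY:
select sql_variant_property(0x7FFFFFFFFFFFFFFF, 'BaseType');   -- varbinary (a binary string constant)
select sql_variant_property(9223372036854775807, 'BaseType');  -- numeric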

Andrey Klochkov