
Is there a way to create a column which has a "universal" datatype? What I want is a column which can have a different datatype in every row.

For each insert, the idea would be to set the datatype (nvarchar, int, bigint, float, decimal), and then insert the value.

askolotl
  • That sounds like you want to do something wrong. Why do you need it? – juergen d Mar 26 '20 at 10:41
    The short answer is "no". The longer answer is `sql_variant` does exist, but *don't* use it; it comes with a huge bunch of caveats and at the end of the day you still need to be casting the results stored in that column to the correct data type. Considering the vagueness of this, and what you say you're after, this sounds like an [XY problem](http://xyproblem.info). – Thom A Mar 26 '20 at 10:41
  • I want to create a stored procedure which inserts a new row into ANY table by creating dynamic sql. Therefore, the client needs to pass the fields to be inserted somehow. So far, it seems that a table-valued parameter would be good. – askolotl Mar 26 '20 at 10:55

2 Answers


I have seen the JSON data type used to achieve a variable data type for a column in SQL, which then gives you the flexibility that NoSQL usually provides. You can have a JSON column and then put in `{"value": myBigInt}` or `{"value": "myVarChar"}`.
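A minimal sketch of this approach on SQL Server, assuming SQL Server 2016 or later (JSON is stored in an `NVARCHAR` column and validated with `ISJSON`; the table and column names here are illustrative, not from the question):

```sql
-- Store mixed-type values as JSON documents in an NVARCHAR column.
CREATE TABLE tableB (
    colA NVARCHAR(MAX) CHECK (ISJSON(colA) = 1),  -- hold the JSON payload
    colB INT
);

INSERT INTO tableB VALUES (N'{"value": 46279.10}', 1689);
INSERT INTO tableB VALUES (N'{"value": "myVarChar"}', 1690);

-- JSON_VALUE always returns NVARCHAR(4000), so cast the result
-- back to the type you expect when reading.
SELECT CAST(JSON_VALUE(colA, '$.value') AS DECIMAL(8,2)) AS decoded
FROM tableB
WHERE colB = 1689;
```

Note that, like `sql_variant`, this still leaves the casting burden on the reader of the data; the JSON text itself carries no type metadata beyond string vs. number.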


You can use the `sql_variant` data type to store values of various SQL Server-supported data types in a single column.

```sql
CREATE TABLE tableA (colA sql_variant, colB int)
INSERT INTO tableA VALUES (CAST(46279.1 AS decimal(8,2)), 1689)
SELECT SQL_VARIANT_PROPERTY(colA, 'BaseType')  AS 'Base Type',
       SQL_VARIANT_PROPERTY(colA, 'Precision') AS 'Precision',
       SQL_VARIANT_PROPERTY(colA, 'Scale')     AS 'Scale'
FROM tableA
WHERE colB = 1689
```
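As the comments on the question point out, reading a `sql_variant` back still requires casting to the expected type. A sketch against the same `tableA`, using `SQL_VARIANT_PROPERTY` to guard the cast:

```sql
-- Cast the variant back to decimal, but only for rows whose
-- stored base type actually is decimal (the cast would fail
-- or lose information for rows holding other types).
SELECT CAST(colA AS DECIMAL(8,2)) AS colA_decimal
FROM tableA
WHERE colB = 1689
  AND SQL_VARIANT_PROPERTY(colA, 'BaseType') = 'decimal';
```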
Alberto Morillo