
Should I define a column's length as the actual length, or round it up to the nth power of 2?

In the first case, I have a table column that stores no more than 7 characters. Should I declare it as NVARCHAR(8)? I heard somewhere that SQL Server may do an implicit conversion internally, allocating 8 characters of space and truncating automatically.

If not, and the length is always exactly 7 characters, which should it be: NCHAR(7) or NCHAR(8)?

Is there any performance difference between these two cases?

Lei Chi

1 Answer


You should use the actual length of the string. Now, if you know that the value will always be exactly 7 characters, then use CHAR(7) rather than VARCHAR(7).
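For illustration, a minimal T-SQL sketch of that advice (the table and column names are hypothetical):

```sql
CREATE TABLE dbo.Example
(
    FixedCode  NCHAR(7)    NOT NULL, -- value is always exactly 7 characters
    ShortName  NVARCHAR(7) NOT NULL  -- value is at most 7 characters; no need to round up to 8
);
```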

The reason you see powers-of-2 is for columns that have an indeterminate length -- a name or description that may not be fixed. In most databases, you need to put in some maximum length for the varchar(). For historical reasons, powers-of-2 get used for such things, because of the binary nature of the underlying CPUs.

Although I almost always use powers-of-2 in these situations, I can think of no real performance difference, with one exception: in some databases the length of a varchar(255) value is stored using 1 byte, whereas a varchar(256) uses 2 bytes. That is a pretty minor difference -- even when multiplied over millions of rows.
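MySQL is one database that behaves this way: a VARCHAR value's length prefix takes 1 byte when the column's maximum size is 255 bytes or less, and 2 bytes above that (the boundary is in bytes, so multi-byte character sets shift it). A hypothetical sketch:

```sql
-- MySQL sketch: latin1 is a single-byte charset, so characters == bytes here.
CREATE TABLE length_prefix_demo
(
    short_text VARCHAR(255), -- stored with a 1-byte length prefix
    long_text  VARCHAR(256)  -- stored with a 2-byte length prefix
) CHARACTER SET latin1;
```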

Gordon Linoff