I know that the N prefix means Unicode: a regular (non-Unicode) character takes one byte to store, while a Unicode character takes two bytes (sometimes four bytes when a surrogate pair is needed).
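For example, this is roughly how I understand the storage difference (a minimal check I put together, assuming SQL Server and that the script is saved with a Unicode encoding so the surrogate-pair literal survives):

SELECT DATALENGTH('a')   AS regular_bytes,    -- 1 byte  (non-Unicode / varchar literal)
       DATALENGTH(N'a')  AS unicode_bytes,    -- 2 bytes (nvarchar literal)
       DATALENGTH(N'𠜎') AS surrogate_bytes;  -- 4 bytes (surrogate pair)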
I have seen some SQL books use N even for plain ASCII values, such as:
SELECT empid, firstname, lastname
FROM HR.Employees
WHERE lastname = N'davis';
But what if lastname is an ASCII (varchar) column? Does using the N prefix cause the value from lastname to be converted to Unicode automatically so that it can be compared with N'davis'? It would make more sense to me if we had to use the N prefix on both sides, something like:
SELECT empid, firstname, lastname
FROM HR.Employees
WHERE N lastname = N'davis'; -- something like this, or maybe a system function like WHERE N(lastname) = N'davis';
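In other words, what I am really asking is whether the earlier comparison triggers an implicit conversion of the column. Here is a minimal repro of the situation I have in mind (the temp table and its data are made up):

CREATE TABLE #Employees (empid INT, lastname VARCHAR(25));
INSERT INTO #Employees VALUES (1, 'davis');

-- Is lastname implicitly converted to NVARCHAR before this comparison happens?
SELECT empid, lastname
FROM #Employees
WHERE lastname = N'davis';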
And what happens if a user (who does not know whether the column is an ASCII or a Unicode column) wants to update the table:
UPDATE HR.Employees
SET lastname = N'davis'
WHERE empid = 2022;
and the lastname column only supports ASCII?
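To make that question concrete, this is the kind of experiment I am imagining (the temp table, its values, and the non-ASCII character are made up; assuming the varchar column uses a default Latin collation):

CREATE TABLE #Emp (empid INT, lastname VARCHAR(25));
INSERT INTO #Emp VALUES (2022, 'smith');

-- The literal contains a character the varchar column's code page cannot represent.
UPDATE #Emp SET lastname = N'дavis' WHERE empid = 2022;

SELECT lastname FROM #Emp; -- does this fail, silently store '?avis', or keep the Unicode value?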