Oracle DB: Reliable way to calculate the length of a field in CHAR from all_tab_columns.data_length?

I've written a very simple database access layer that maps C# objects onto database tables and back. It uses the information in ALL_TAB_COLUMNS to do some input validation (mainly length checking). It works fine on ISO-encoded databases, but on UTF-8 databases it produces wrong results. I tried (data_length / LENGTHB('ä')), which sometimes seems to work and sometimes doesn't. I'm aware it's a dirty hack, but I haven't found a pattern yet. Is there a reliable way to calculate the CHAR length of a VARCHAR2 field from data_length?
- `DECODE( (SELECT COUNT(*) FROM nls_database_parameters WHERE UPPER(parameter) = 'NLS_CHARACTERSET' AND UPPER(value) LIKE '%UTF8%'), 1, (data_length / 2), data_length)` does what I need, but it isn't generic. It wouldn't work on a UTF-16 database. – Christopher Thonfeld-Guckes Oct 19 '18 at 08:26
- Don't UTF-8 characters vary in size from 1 to 4 bytes? That seems impossible to infer without looking at the actual data in the column, unless you just want an estimated range of character lengths. – kfinity Oct 19 '18 at 14:10
- They do. Actually I'm trying to get the size that was provided when the table was created. E.g. if the column is defined as `col1 VARCHAR2(64 CHAR)`, I'd like to get the 64. ALL_TAB_COLUMNS gives me 128 though. – Christopher Thonfeld-Guckes Oct 19 '18 at 14:29
1 Answer
I found the answer on my own. ALL_TAB_COLUMNS provides a column CHAR_LENGTH that contains the maximum number of characters allowed in the column, regardless of the database character set. Example:
SELECT column_name, char_length FROM all_tab_columns WHERE table_name = 'SOME_TABLE';
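If you also need to know whether a column was declared with BYTE or CHAR length semantics, ALL_TAB_COLUMNS additionally exposes a CHAR_USED column ('B' for BYTE, 'C' for CHAR). A sketch combining the relevant columns (the table name is illustrative):

```sql
-- CHAR_LENGTH: declared maximum length in characters
-- DATA_LENGTH: maximum length in bytes
-- CHAR_USED:   'C' if declared with CHAR semantics, 'B' for BYTE semantics
SELECT column_name,
       char_length,
       data_length,
       char_used
  FROM all_tab_columns
 WHERE table_name = 'SOME_TABLE';
```

For a column defined as `col1 VARCHAR2(64 CHAR)` on a UTF-8 database, CHAR_LENGTH returns 64 while DATA_LENGTH reflects the larger byte limit, which is why validation should use CHAR_LENGTH.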

Christopher Thonfeld-Guckes