We are doing an Oracle Database Unicode conversion from the WE8ISO8859P15 single-byte character set to the AL32UTF8 multibyte character set. It seems our software vendor mixed BYTE and CHAR length semantics in the VARCHAR2 columns of the data model. After the data model was moved to a Unicode Oracle database (Data Pump, metadata only), the migration tool uses the following query to decide whether a column should be converted from VARCHAR2 to CLOB, assuming VARCHAR2(n BYTE) semantics:
select 'ALTER TABLE '||table_name||' MODIFY '||column_name||' LONG;'||chr(13)||chr(10)||
       'ALTER TABLE '||table_name||' MODIFY '||column_name||' CLOB;'
from   user_tab_columns
where  (data_type = 'CHAR' or data_type = 'VARCHAR2')
and    data_length > 2000
and    table_name in (select table_name from user_tables);
The problem is that for VARCHAR2(n CHAR) columns, the Oracle Database automatically multiplies DATA_LENGTH by 4, so some columns hit the 4000-byte VARCHAR2 limit. The DATA_LENGTH cutoff of 2000 also seems odd. Is a DATA_LENGTH of 2000 a good limit for switching to CLOB? The database parameter MAX_STRING_SIZE could be set to EXTENDED to raise the 4000-byte VARCHAR2 limit to 32767 bytes, but that could have other side effects.
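For reference, USER_TAB_COLUMNS exposes the length semantics of each column: CHAR_USED is 'B' for BYTE and 'C' for CHAR semantics, and CHAR_LENGTH holds the declared length in characters. A sketch of a query (my own attempt, not the vendor's) that computes the worst-case byte size of each column under AL32UTF8, where one character can occupy up to 4 bytes, and flags only those that could exceed the 4000-byte limit:

```sql
-- Flag columns whose declared length could exceed 4000 bytes in AL32UTF8.
-- For BYTE semantics, DATA_LENGTH is already the byte limit;
-- for CHAR semantics, the worst case is CHAR_LENGTH * 4 bytes.
select table_name,
       column_name,
       char_used,                        -- 'B' = BYTE, 'C' = CHAR semantics
       char_length,
       data_length,
       case char_used
         when 'C' then char_length * 4   -- worst case: 4 bytes per character
         else data_length
       end as worst_case_bytes
from   user_tab_columns
where  data_type in ('CHAR', 'VARCHAR2')
and    case char_used
         when 'C' then char_length * 4
         else data_length
       end > 4000;
```

The 4000-byte cutoff here mirrors the standard VARCHAR2 limit; whether that, or a lower threshold like the vendor's 2000, is the right trigger for CLOB conversion is exactly what I am unsure about.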
How should VARCHAR2(n CHAR) columns be handled when extending the query above?