My project is going through a tech upgrade so we are upgrading Oracle DB from 11g to 12c. SAP Data Services is upgraded to version 14.2.7.1156.

The tables in Oracle 12c default to VARCHAR2 with BYTE length semantics when they should use CHAR semantics. I understand this is the default behavior. So, I altered the session for each datastore by running

`ALTER session SET nls_length_semantics=CHAR;`

When I create a new table with a VARCHAR2(1) column, I am able to load Unicode characters such as Chinese characters (e.g. 东) into the new table directly in Oracle.
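For context on why BYTE vs. CHAR semantics matters here: a single Chinese character is one character but occupies three bytes in UTF-8, so it fits a VARCHAR2(1 CHAR) column yet overflows a VARCHAR2(1 BYTE) column. A minimal sketch in plain Python (no Oracle needed) that shows the discrepancy:

```python
# One Chinese character: 1 character long, but 3 bytes in UTF-8.
# This is exactly the gap between VARCHAR2(n CHAR) and VARCHAR2(n BYTE).
s = "东"
char_len = len(s)                   # character count
byte_len = len(s.encode("utf-8"))  # byte count under UTF-8
print(char_len, byte_len)  # prints: 1 3
```

So any layer that still measures the column in bytes will reject the value with ORA-12899, even though the character count is within the declared length.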

However, when I try to load the same Unicode character into the same table via SAP DS, it throws the error "ORA-12899: value too large for column". My datastore settings are:

Locale
- Language: default
- Code Page: utf-8
- Server code page: utf-8

Additional session parameters:
`ALTER session SET nls_length_semantics=CHAR`

I would really appreciate knowing what settings I need to change in SAP BODS, since Oracle itself seems to be working fine.

Sandra Rossi
Shanaaz

1 Answer


I think you should consider modifying the table column from VARCHAR2(x BYTE) to VARCHAR2(x CHAR) to allow Unicode (UTF-8) data and avoid ORA-12899.

```sql
-- With BYTE semantics in effect, varchar2(100) means varchar2(100 BYTE)
create table test1 (name varchar2(100));
insert into test1 values ('east');
insert into test1 values ('东');

-- Switch the column to character-length semantics
alter table test1 modify name varchar2(100 char);

-- Check CHAR_USED ('B' = byte semantics, 'C' = char semantics) per column
select column_name, data_type, char_used
  from user_tab_columns
 where table_name = 'TEST1';
```
pahariayogi