
I am trying to read a file that is sent by an external system to our server and load it into an Oracle database table. I am using the UTL_FILE package to read the data from the file. When I create a file with similar contents in the Unix environment, my Oracle code reads it correctly, but the same code does not work on the file we receive from the external system. On researching, I found that the file sent by the external system is of type ISO-8859 text with CRLF line terminators, whereas the one I created in the Unix environment is of type UTF-8 Unicode text.

Below is the output of:

select * from sys.v_$parameter
where name like 'nls_lang%'

(screenshot of the query output showing the NLS parameter values)


1 Answer


Because Unix-like systems use only one character for end-of-line (LINE FEED, ASCII code 10), while your file uses two characters for end-of-line (a CARRIAGE RETURN, ASCII code 13, followed by a LINE FEED, ASCII code 10), every line read by UTL_FILE.GET_LINE ends with a trailing CARRIAGE RETURN character (ASCII 13).
You need to RTRIM this last character.
For example, if each line is read into a variable sbLine, remove the trailing character with:

RTRIM(sbLine, CHR(13))
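
Putting it together, here is a minimal sketch of a read loop that applies the RTRIM; the directory object DATA_DIR and the file name incoming.txt are hypothetical placeholders, not names from your system:

DECLARE
  fhFile UTL_FILE.FILE_TYPE;
  sbLine VARCHAR2(32767);
BEGIN
  -- DATA_DIR and incoming.txt are placeholder names for illustration
  fhFile := UTL_FILE.FOPEN('DATA_DIR', 'incoming.txt', 'R', 32767);
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(fhFile, sbLine);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        EXIT;  -- end of file reached
    END;
    -- strip the trailing CARRIAGE RETURN left over from the CRLF terminator
    sbLine := RTRIM(sbLine, CHR(13));
    -- ... process sbLine or insert it into the target table here ...
  END LOOP;
  UTL_FILE.FCLOSE(fhFile);
END;
/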