When I execute the following to load data into my Oracle Cloud Autonomous Database from an object in an Oracle Cloud Object Storage bucket (a CSV file):
BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'test_landing_zone',
    credential_name => 'test2',
    file_uri_list   => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/idqt8fo79xkv/b/test-upload/o/2025.csv',
    format          => json_object('type' value 'csv', 'ignoremissingcolumns' value 'true', 'trimspaces' value 'lrtrim', 'ignoreblanklines' value 'true', 'delimiter' value ';')
  );
END;
/
For a small file (hundreds of rows), everything works fine. Once I try a file with thousands of rows, I get:
ORA-20000: ORA-30094: failed to find the time zone data file for version 39 in $ORACLE_HOME/oracore/zoneinfo
ORA-30094: failed to find the time zone data file for version 39 in $ORACLE_HOME/oracore/zoneinfo
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 1563
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 7896
ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 7919
ORA-06512: at line 2
Error at Line: 7 Column: 0
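For reference, the time zone file version the database itself reports can be checked with the query below (a diagnostic sketch; I am assuming V$TIMEZONE_FILE is queryable on Autonomous Database):

-- Shows the FILENAME and VERSION of the time zone data file the instance is using
SELECT * FROM v$timezone_file;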
The larger file is literally the content of the smaller one repeated many times, to rule out the data itself as the cause.
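To double-check that, my understanding is that COPY_DATA records each load in USER_LOAD_OPERATIONS together with the names of its log and bad-file tables, which I intend to inspect next (a sketch; the column list is my assumption based on the documentation):

-- List recent DBMS_CLOUD load operations and their log/bad-file tables
SELECT id, type, status, table_name, rows_loaded, logfile_table, badfile_table
  FROM user_load_operations
 ORDER BY id DESC;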
Search results suggest patching some libraries, but since this is DBaaS (Oracle Cloud Autonomous Database), that is not an option for me.
I will be grateful for any help. Thank you!