
I am using the File Layout method with the PeopleSoft-suggested code. It works fine with a small CSV file, but I need to import a file that is 921,000 KB (roughly 900 MB). The process has been running for over a day and is still not finished.

I cannot use an external tool because this is supposed to run as a batch process on a daily schedule. Is there another way I can code this inside an Application Engine program? Please advise. Thanks for your help.

– SForum

1 Answer


Well, I have never worked with a file that large, but here is some code that may help you. It is a class that uses a Java BufferedReader to read a CSV file line by line and parse each line into an object (I do not know much about Java, but I believe a BufferedReader should be more efficient when reading large files). You can then access each field of a line by its index or by its Excel-style column name.

This is an example:

import PPL_UTILITIES:CSV:Reader;
import PPL_UTILITIES:CSV:Line;

Local string &myCSVPath = "/an/absolute/filepath/myCSV.csv";
Local PPL_UTILITIES:CSV:Line &oLine; /* Reusable Line object */

Local PPL_UTILITIES:CSV:Reader &CSVReader = create PPL_UTILITIES:CSV:Reader(&myCSVPath, "UTF-8");
Local Record &oRec = CreateRecord(Record.MYRECORD);

StartWork();
try
   While &CSVReader.ReadLine(&oLine)
      /* Map CSV columns (referenced by Excel-style letters) to record fields */
      &oRec.MYFIELD1.Value = &oLine.GetField("A");
      &oRec.MYFIELD2.Value = &oLine.GetField("B");
      &oRec.MYFIELD3.Value = &oLine.GetField("C");
      &oRec.MYFIELD4.Value = &oLine.GetField("AB");
      &oRec.Insert();
   End-While;
   CommitWork(); /* Commit only after the whole file has loaded cleanly */
catch Exception &ex
   WriteToLog(%ApplicationLogFence_Error, &ex.ToString() | Char(10) | &ex.Context);
end-try;
&CSVReader.Close();

The downside is that all fields are still returned as strings, so you would have to do the type conversions yourself.
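For instance, assuming MYFIELD3 holds a number and MYFIELD4 a date (both are just the placeholder names from the example above), the built-in Value and DateValue functions would handle the conversion:

&oRec.MYFIELD3.Value = Value(&oLine.GetField("C"));      /* string -> number */
&oRec.MYFIELD4.Value = DateValue(&oLine.GetField("AB")); /* string -> date, in the session's date format */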

  • I also think this could be improved by using an SQL object with bulk insert enabled, but I have not tested it, so I cannot say for sure. – Guillermo-Santos Jul 01 '23 at 01:16
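A minimal, untested sketch of that bulk-insert idea, reusing the same reader and record objects from the snippet above (CreateSQL, the %Insert meta-SQL, and the SQL class's BulkMode property are standard PeopleCode; the rest mirrors the earlier example):

Local SQL &SQL = CreateSQL("%Insert(:1)");
&SQL.BulkMode = True; /* buffer rows and send them to the database in batches */

StartWork();
try
   While &CSVReader.ReadLine(&oLine)
      &oRec.MYFIELD1.Value = &oLine.GetField("A");
      &oRec.MYFIELD2.Value = &oLine.GetField("B");
      &SQL.Execute(&oRec); /* queues the row; flushed when the buffer fills */
   End-While;
   &SQL.Close(); /* flushes any rows still buffered in bulk mode */
   CommitWork();
catch Exception &ex
   WriteToLog(%ApplicationLogFence_Error, &ex.ToString() | Char(10) | &ex.Context);
end-try;
&CSVReader.Close();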