
So, I am trying to modernize the code on my IBM i, and I am thinking about subfiles and printfiles.

Excluding native I/O operations, I can think of three ways to populate my data using embedded SQL.

  1. Cursor Fetch method
  2. Multi-occurrence data structure sized for the number of records on one page
  3. A much larger multi-occurrence data structure holding multiple pages of data

What is the best practice method? Any opinions?
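
For concreteness, here is roughly what I picture for option 2; the file, column names, and page size are just placeholders, and I've sketched it with a data structure array (a multi-occurrence DS would be declared a bit differently):

dcl-s pageSize int(10) inz(14);        // rows on one subfile page

dcl-ds page qualified dim(14);         // one page worth of rows
  custId     packed(7: 0);
  custName   char(30);
end-ds;

dcl-s rowsFetched int(10);

exec sql declare pageCur cursor for
  select cust_id, cust_name
    from custmast
   order by cust_id;

exec sql open pageCur;

// block fetch: one page of rows in a single call
exec sql fetch next from pageCur for :pageSize rows into :page;
rowsFetched = sqlerrd(3);              // rows actually returned

exec sql close pageCur;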

– Jeff

2 Answers


I have never backed a subfile with an array, though I suppose it would be a useful technique if you were going to have more than 9999 records. Instead, I have always just read into it from a cursor. In fact, mixing SQL and procedures makes populating a subfile extremely easy. You can even use a multi-record fetch if it makes sense to you. Here is a simple example (single-record fetch):

dcl-proc LoadSubfile;
  dcl-pi *n;
    keyFields       Type;
  end-pi;

  dcl-ds sflin        LikeRec(sfl: *Input) Inz;

  ClearSubfile();
  OpenCursor(keyFields);
  dow FetchCursor(record);          // *On while a row was fetched
    eval-corr sflin = record;       // copy matching fields into the subfile DS
    PopulateHidden(sflin);
    rrn += 1;
    write sfl sflin;
  enddo;
  CloseCursor();
  rrnMax = rrn;                     // remember how many records were loaded
end-proc;

There are some things here that are not defined. For instance, FetchCursor() returns an indicator that is *On if a record was returned, and PopulateHidden() populates hidden fields in the subfile record. I use hidden fields in an editable subfile to hold the original values of fields that can be changed. I define the subfile fields the same as the record fields so that I can do an eval-corr to get them into the I/O data structure. I will also check the subfile rrn for overflow if I believe it is possible for there to be more than 9999 records in the database; then I throw a subfile-full message with directions to filter the record set.
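
For completeness, here is a rough sketch of what those helpers might look like. Assume the cursor is named mainCur and that record mirrors the cursor's select list; the file, column, and key names below are only placeholders:

dcl-ds record_t qualified template;    // mirrors the cursor's select list
  custId     packed(7: 0);
  custName   char(30);
end-ds;

dcl-ds record likeds(record_t) inz;

dcl-proc OpenCursor;
  dcl-pi *n;
    custKey    packed(7: 0) const;     // stands in for the real keyFields
  end-pi;

  exec sql declare mainCur cursor for
    select cust_id, cust_name
      from custmast
     where cust_id >= :custKey
     order by cust_id;

  exec sql open mainCur;
end-proc;

dcl-proc FetchCursor;
  dcl-pi *n ind;
    outRec     likeds(record_t);
  end-pi;

  exec sql fetch next from mainCur into :outRec;
  return (sqlcode = 0);                // *On while a row was returned
end-proc;

dcl-proc CloseCursor;
  exec sql close mainCur;
end-proc;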

Some other things you didn't ask about, but I will mention since you asked about best practices: unless there is a reason to avoid it, I use SFLCLR to clear the subfile, and I generally load the entire subfile in one shot unless I suspect that there will be thousands of records. Many of the old optimizations like SFLNXTCHG and loading a single page at a time were put in place because communications were slow. Not so much the twinax communications, but the ASCII workstation controller, or remote workstations which were often on the other side of a communications line much slower than twinax. This is no longer true. I tend to avoid those old hacks meant to preserve bandwidth because they only complicate the code. Write the whole thing at once, and when processing, read through the whole subfile with a for loop.
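
For what it's worth, a minimal sketch of that processing loop might look like the following; the field names are placeholders, and hidName stands in for whatever hidden field holds the original value:

dcl-proc ProcessSubfile;
  dcl-ds sflout likerec(sfl: *Input);
  dcl-s  i      uns(5);

  for i = 1 to rrnMax;
    chain i sfl sflout;                     // read the subfile record by RRN
    if %found;
      if sflout.custName <> sflout.hidName; // compare against the hidden original
        // the row was changed - apply the update here
      endif;
    endif;
  endfor;
end-proc;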

Also, in the future, these sorts of questions belong on Code Review if you have working code that you want best-practice feedback on, or perhaps Software Engineering if you are more interested in theoretical answers. Stack Overflow is more for specific, objective questions.

– jmarkmurphy

Possibly off-topic as "primarily opinion based"...

But a DS array... the larger, the better it performs...

– Charles
  • So you would make a DS(1000) maybe... fill it up and then handle paging and such inside the calling function? I guess that makes sense if you know there are going to be a whole lot of records, but what if sometimes you will return 999 records but most of the time you will only get around 100. Does the system reserve the memory for a huge DS every time it is executed? – Jeff Jul 02 '18 at 18:42
  • Yes, the entire space is reserved when the program starts... I'd go with the "most of the time" value of 100... or bump it to 150. – Charles Jul 02 '18 at 18:47
  • So you are actually saying option 2... where I read a select number of records, and then reread for a second set, third set, and so on as needed. I appreciate the opinion! I didn't want to go changing a bunch of programs only to find out that there was a best-practices issue! :) – Jeff Jul 02 '18 at 18:55
  • Except skip the multi-occurrence data structure, and use an array instead (see the sketch below). MODS is very old school, and much more limited. – jmarkmurphy Jul 03 '18 at 11:03
  • @jmarkmurphy is correct, I mentioned DS array, but should have called out the difference between a DS array and a MODS. – Charles Jul 03 '18 at 13:09
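
Pulling the comment thread together, a hedged sketch of that approach might look like this: a data structure array sized for the typical case, with the cursor left open so the next block is fetched only when the user pages past what has already been loaded (file and column names are placeholders):

dcl-ds block qualified dim(150);       // sized for the "most of the time" case
  custId     packed(7: 0);
  custName   char(30);
end-ds;

dcl-s rowsLoaded int(10);
dcl-s endOfData  ind inz(*off);

exec sql declare blockCur cursor for
  select cust_id, cust_name
    from custmast
   order by cust_id;

// open blockCur once, then call this each time the user pages
// past the rows already loaded
dcl-proc FetchNextBlock;
  dcl-pi *n int(10);
  end-pi;

  exec sql fetch next from blockCur for 150 rows into :block;
  rowsLoaded = sqlerrd(3);                             // rows actually returned
  endOfData = (sqlcode = 100 or rowsLoaded < %elem(block));
  return rowsLoaded;
end-proc;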