
I have written code that reads all records into an internal table with a SELECT query, performs a concatenation on them, and updates the DB table from this internal table.

To improve performance, how can I select a specific number of records, e.g. 1000, process them, update the DB, then select the next 1000 records, and so on?

Here is my code:

DATA: lv_index_table TYPE /ngv/b0tabnm,
      lref_tname     TYPE REF TO data.

FIELD-SYMBOLS: <lt_indx_table>  TYPE ANY TABLE,
               <lwa_indx_table> TYPE any.

CALL METHOD /ngv/cl_bmtr_framework=>generated_table_get
  EXPORTING
    i_dtarea        = /ngv/if_bcon=>mc_data_area-index
  IMPORTING
    e_table         = lv_index_table      "here I get the DB table name
  EXCEPTIONS
    table_not_found = 1
    OTHERS          = 2.

CHECK lv_index_table IS NOT INITIAL.

CREATE DATA lref_tname TYPE TABLE OF (lv_index_table).
ASSIGN lref_tname->* TO <lt_indx_table>.

CHECK <lt_indx_table> IS ASSIGNED.
SELECT * FROM (lv_index_table) INTO TABLE <lt_indx_table> UP TO 1000 ROWS.

LOOP AT <lt_indx_table> ASSIGNING <lwa_indx_table>.
  ASSIGN COMPONENT 'EXTID' OF STRUCTURE <lwa_indx_table> TO FIELD-SYMBOL(<lv_extid>).
  ASSIGN COMPONENT 'MTRCT' OF STRUCTURE <lwa_indx_table> TO FIELD-SYMBOL(<lv_mtrct>).
  ASSIGN COMPONENT 'MSKVL' OF STRUCTURE <lwa_indx_table> TO FIELD-SYMBOL(<lv_mskvl>).

  CHECK <lv_extid> IS ASSIGNED AND <lv_mtrct> IS ASSIGNED AND <lv_mskvl> IS ASSIGNED.
  CONCATENATE <lv_extid> '_' <lv_mtrct> INTO <lv_mskvl>.
ENDLOOP.

MODIFY (lv_index_table) FROM TABLE <lt_indx_table>.
Suncatcher
sumedh patil
  • Isn't `SELECT UP TO. ENDSELECT.` what you're looking for? Alternatively you can use `OPEN CURSOR ... UP TO`. You have to be then also careful when making `COMMIT` after each package, the cursor is then invalidated. – Jagger May 26 '22 at 13:12
  • Actually, I am writing code in ABAP, and the DB table `lv_index_table` holds a lot of data. Using a SELECT query I want to get 1000 records from the table into the internal table `lt_indx_table` first; after performing the concatenation and updating the DB table, I want to get the next 1000 records and perform the same operations, and so on. – sumedh patil May 26 '22 at 13:24
  • `SELECT ... PACKAGE SIZE 1000 ... ENDSELECT.` or `OPEN CURSOR ...` and `FETCH ... PACKAGE SIZE 1000 ...` – Sandra Rossi May 27 '22 at 06:39
  • It's unlikely you'll ever gain any performance by opening a DB connection every 1000 records; it will be the opposite. Also look at this https://stackoverflow.com/questions/49898103/shortest-notation-to-split-abap-internal-table-into-smaller-pieces – Suncatcher May 31 '22 at 07:12
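The `PACKAGE SIZE` variant suggested in the comments can be combined with the dynamic table from the question. A minimal sketch (reusing the question's names; note that a `COMMIT WORK` inside the `SELECT ... ENDSELECT` loop would invalidate the implicit database cursor, as Jagger points out):

```abap
SELECT * FROM (lv_index_table)
  INTO TABLE <lt_indx_table>
  PACKAGE SIZE 1000.

  LOOP AT <lt_indx_table> ASSIGNING <lwa_indx_table>.
    " ... concatenation logic from the question ...
  ENDLOOP.

  " Update the current package; do NOT issue an explicit COMMIT WORK here,
  " it would close the cursor of the surrounding SELECT loop.
  MODIFY (lv_index_table) FROM TABLE <lt_indx_table>.
ENDSELECT.
```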

1 Answer


There are different solutions for your requirement. It depends on the number of records, your system, your database, etc.

If you have really huge data, try cursors.

You can process the data in packages. That does not increase performance much by itself, but it helps prevent performance deterioration.

For detailed information check these links (1, 2, 3).

DATA s_cursor TYPE cursor.
DATA it_mara TYPE TABLE OF mara.

OPEN CURSOR WITH HOLD s_cursor FOR
  SELECT * FROM mara.

DO.
  " INTO TABLE replaces the table contents with the next package;
  " APPENDING TABLE would keep accumulating all previously fetched rows.
  FETCH NEXT CURSOR s_cursor INTO TABLE it_mara PACKAGE SIZE 10.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.

  " Your logic should be here...

  " If you want to commit the database update per package,
  " use CALL FUNCTION 'DB_COMMIT'.
ENDDO.

CLOSE CURSOR s_cursor.
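Applied to the question's scenario, the same pattern looks roughly like the sketch below (names reused from the question's code). Because the cursor is opened `WITH HOLD`, it survives database commits on the same connection, so each package can be updated and committed before the next fetch:

```abap
DATA lv_cursor TYPE cursor.

OPEN CURSOR WITH HOLD lv_cursor FOR
  SELECT * FROM (lv_index_table).

DO.
  " Each fetch replaces the table contents with the next 1000 rows.
  FETCH NEXT CURSOR lv_cursor INTO TABLE <lt_indx_table> PACKAGE SIZE 1000.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.

  " ... concatenation logic per package ...

  MODIFY (lv_index_table) FROM TABLE <lt_indx_table>.
  COMMIT WORK.  " safe here only because the cursor was opened WITH HOLD
ENDDO.

CLOSE CURSOR lv_cursor.
```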
Eray