I have a stored procedure where I use a cursor to loop through items in a temporary table:

    OPEN CURSOR_SCORE_ITEMS FOR
        SELECT ID_X, ID_Y
        FROM SCORE_ITEMS
        GROUP BY ID_X, ID_Y
        HAVING SUM(SCORE) > 10;

    LOOP
        FETCH CURSOR_SCORE_ITEMS BULK COLLECT INTO COMPARE_ITEMS LIMIT 100;
        EXIT WHEN COMPARE_ITEMS.COUNT = 0;  -- stop when the fetch returns no more rows

        -- loop over items and do stuff --

    END LOOP;
    CLOSE CURSOR_SCORE_ITEMS;
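
For reference, the snippet assumes declarations along these lines (the types here are sketched from the column list, not copied from the actual procedure):

    -- Record matching the two columns fetched by the cursor
    TYPE T_ITEM IS RECORD (
        ID_X SCORE_ITEMS.ID_X%TYPE,
        ID_Y SCORE_ITEMS.ID_Y%TYPE
    );
    TYPE T_ITEM_TAB IS TABLE OF T_ITEM;

    CURSOR_SCORE_ITEMS SYS_REFCURSOR;  -- ref cursor opened with OPEN ... FOR
    COMPARE_ITEMS      T_ITEM_TAB;     -- target collection for the BULK COLLECT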

The procedure works fine when the SCORE_ITEMS table is small, but for large tables (several million rows) I receive the error

"ORA-01652: Temp-Segment kann nicht um 12800 in Tablespace TEMP_ALL erweitert werden"

(sorry, its in German).

Note that SCORE_ITEMS is a temporary table which is generated earlier in the procedure. It seems that the cursor query is exceeding the size of the temp tablespace.

I have already read about solutions that involve increasing the size of the tablespace, but I do not have any privileges on this database, so I do not think that is possible. Is there an alternative approach, or some kind of preprocessing I might consider, that would reduce the overhead in the temp tablespace?

Dirk. K.
  • Your temp tablespace is too small for the transaction; you should either commit after every x rows or increase the size of your temp tablespace. – hotfix Apr 08 '19 at 08:46
  • What is the stuff you're doing inside the loop? Is there any chance you could encapsulate the logic of the cursor and loop into a single DML statement (see the sketch after these comments)? Or, at the very least, move most of the calculation into the opening cursor? Also, why are you using a ref cursor, rather than a "normal" (i.e. explicit or implicit) cursor? – Boneist Apr 08 '19 at 09:08
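
To illustrate Boneist's single-DML suggestion: if the per-row work amounts to updating some target table, the whole open/fetch/loop block can collapse into one statement. The target table and column below are invented for the sketch, since the loop body isn't shown in the question:

    -- Hypothetical set-based replacement for the cursor loop
    MERGE INTO SOME_TARGET_TABLE t       -- invented name
    USING (
        SELECT ID_X, ID_Y
        FROM   SCORE_ITEMS
        GROUP  BY ID_X, ID_Y
        HAVING SUM(SCORE) > 10
    ) s
    ON (t.ID_X = s.ID_X AND t.ID_Y = s.ID_Y)
    WHEN MATCHED THEN
        UPDATE SET t.FLAGGED = 'Y';      -- stand-in for the real per-row work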

1 Answer

Global Temporary Tables are written to a TEMPORARY tablespace (that is, not the usual tablespace for heap tables). Do you have a separate temporary tablespace for GTTs? I suspect not; most places don't.

So (assuming No), when SCORE_ITEMS has millions of rows you've already eaten a big chunk of TEMP. Then your query kicks off with an aggregation that is big enough to spill into TEMP - because GROUP BY needs sorting.
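
If you have access to the V$ views (with your limited privileges you may not), you can check what is actually consuming TEMP while the procedure runs:

    -- TEMP consumption by tablespace and segment type.
    -- Assumes an 8K block size; adjust the multiplier if yours differs.
    SELECT tablespace,
           segtype,
           ROUND(SUM(blocks) * 8 / 1024) AS mb_used
    FROM   v$tempseg_usage
    GROUP  BY tablespace, segtype;

A SEGTYPE of SORT points at the GROUP BY, while DATA points at the GTT rows, so this tells you which of the two is eating the space.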

You have already ruled out the obvious solution:

"increasing the size of the tablespace, but I do not have any privileges on this database, so I do not think that is possible."

I don't know whether this also rules out the radical idea of talking to your DBA and seeing whether they will increase the space allocated to TEMP, or - better - create a new tablespace for Global Temporary Tables.

The other thing to consider is whether you actually need SCORE_ITEMS at all. It's not unusual for people to populate a GTT when they could just write a more efficient SELECT instead. There's a lot of overhead in a GTT - all that I/O to disk, not to mention contention for the shared TEMP tablespace. It's definitely an option to consider.
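
For example, instead of INSERT-ing into the GTT and then opening the cursor against it, you could open the cursor directly on whatever query populates SCORE_ITEMS. That part of your procedure isn't shown, so the inner query here is just a placeholder:

    OPEN CURSOR_SCORE_ITEMS FOR
        SELECT ID_X, ID_Y
        FROM (
            -- placeholder for the SELECT that currently populates SCORE_ITEMS
            SELECT ID_X, ID_Y, SCORE
            FROM   SOME_SOURCE_TABLE
        )
        GROUP BY ID_X, ID_Y
        HAVING SUM(SCORE) > 10;

Done this way, the intermediate rows are never written to disk; the only TEMP usage left is the sort for the GROUP BY itself.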

APC
  • Thanks, this is very helpful. So if the SCORE_ITEMS table were to be a permanent table rather than a temporary table, this would mean it would not be allocated to TEMP which might free up enough space for the GROUP BY aggregation? That would be another possibility. – Dirk. K. Apr 08 '19 at 09:13
  • You would still have the overhead of writing to and reading from disk, which is obviously greater when handling *"several million rows"*. Plus you would need to handle the housekeeping (clear-down) explicitly. Finally, you would need to handle session isolation if more than one user may run this process at any given time. – APC Apr 08 '19 at 09:17
  • For any future reference to this - let's state it clearly: it is never the right answer to be scared of the DBA! If using a temp table makes sense for this problem and avoids housekeeping headaches, don't mess up the whole application "because the DBA is such a holy person that we cannot talk to them"! Never mess up code because of politics! – Saad Ahmad Apr 08 '19 at 20:44