I've got a rather simple query to a linked DB2 table.

SELECT GC_TBSELC.*
FROM GC_TBSELC
WHERE SELC_EFF_DATE > #1/1/2017#;

Works fine, returns results. However, when I add the "DISTINCT" keyword, I get an error:

ODBC -- CALL FAILED

[IBM][CLI Driver][DB2] SQL0904N Unsuccessful execution caused by an unavailable resource. Reason code: "00C90305", type of resource: "00000100", and resource name: "DSNDB07". SQLSTATE=57011

Any idea on why the "DISTINCT" keyword would cause this, and if there's a way around it to get distinct records from the table?

    Show the query that generates the error. – Gordon Linoff Aug 15 '19 at 16:16
  • All I did was add "DISTINCT" to the above query. SELECT DISTINCT GC_TBSELC.*... – Johnny Bones Aug 15 '19 at 16:17
  • 2
    Did you look up the [error code](https://www.ibm.com/support/knowledgecenter/en/SSEPEK_11.0.0/codes/src/tpc/n904.html)? Possibly that table is very large (or you have large fields like blobs) and `DISTINCT` is expensive for your resource limits. Plus, one should not be running `DISTINCT` with `*`. Isn't their a unique identifier in table? Try explicitly referencing columns. – Parfait Aug 15 '19 at 16:33

2 Answers

SQL0904N with reason code 00C90305 indicates the following:

The limit on the space usage of the work file storage by an agent was exceeded. The space usage limit is determined by the zparm keyword MAXTEMPS.

By adding DISTINCT to a SELECT * (all columns), you likely exceeded the available work file space.

Let me ask a better question: why would you want to DISTINCT all columns from a table? Is this really the result set you are looking for? Would it be more appropriate to DISTINCT a subset of the columns in this table?
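If only a few columns actually matter, narrowing the select list keeps the sort work much smaller. A sketch (the column `SELC_ID` is a placeholder; substitute the columns you actually need):

```sql
-- Hypothetical narrowing: sorting two columns for duplicate removal
-- uses far less work file space than sorting every column of every row.
SELECT DISTINCT SELC_ID, SELC_EFF_DATE
FROM GC_TBSELC
WHERE SELC_EFF_DATE > #1/1/2017#;
```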

---

The query without the DISTINCT did not require duplicate removal; rows could simply be streamed back to the caller.

DISTINCT tells Db2 to remove duplicates before returning rows. In this case, Db2 likely materialized the rows into sort work files and sorted them to remove duplicates, and during that process the sort work limits were exceeded.
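The sort-based duplicate removal described above is essentially the same work Db2 does for a GROUP BY over the same columns; both force the result set to be materialized and sorted in work file storage (DSNDB07), which is what hit the limit here. A sketch of the equivalence (grouping on the one column from the original query):

```sql
-- Equivalent to SELECT DISTINCT SELC_EFF_DATE: both require Db2 to
-- materialize and sort the qualifying rows in work file storage.
SELECT SELC_EFF_DATE
FROM GC_TBSELC
WHERE SELC_EFF_DATE > #1/1/2017#
GROUP BY SELC_EFF_DATE;
```

Either form will still fail if the sort exceeds the MAXTEMPS limit, so reducing the number and width of the columns being deduplicated is the practical workaround.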