
Based on this (https://stackoverflow.com/a/13954487/1543618) answer to a question regarding automated VACUUM FULL of bloated tables, a DBA (who doesn't work here anymore) developed a script to determine which tables to ANALYZE.

Note: this is PostgreSQL 8.4 (yes, I know it's ancient, but there's nothing I can do about it) with a minimal maintenance window, so VACUUM FULL does not seem to be an option.

Is this a valid method for determining which tables require prophylactic analysis? I have my doubts, since the same tables show up in the query day after day.
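
For context, a minimal sketch of what the stats collector exposes in 8.4, which is what feeds my doubts (the LIMIT is arbitrary):

    -- Which tables have accumulated the most dead rows,
    -- and when were they last analyzed?
    SELECT relname,
           n_live_tup,
           n_dead_tup,
           last_analyze,
           last_autoanalyze
    FROM   pg_stat_user_tables
    ORDER  BY n_dead_tup DESC
    LIMIT  20;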

Thanks

RonJohn

1 Answer


Assuming the script is the same check-for-bloat script that you reference, then no: dead ("garbage") tuples are not taken into account in query planning, so you can re-analyze as much as you like and it will do you no good.

ANALYZE deals with estimating the number of records various SQL operators will yield from a given table, and how much space those records will take up in RAM. That information is then used to compare the estimated cost (in time and memory) of the various query plans that could solve the same logical query.
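
A minimal illustration (the table and column names are hypothetical): the rows= estimates in an EXPLAIN plan come from the statistics ANALYZE gathers (pg_class.reltuples plus the per-column data in pg_stats).

    -- Refresh the statistics the planner uses for its estimates.
    ANALYZE accounts;

    -- The rows= figures in the plan are derived from those statistics.
    EXPLAIN SELECT * FROM accounts WHERE status = 'open';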

The presence of dead tuples in a table affects neither the number of tuples a query returns nor the size of the in-use tuples. Hence, re-analyzing tables because they are filled with garbage makes no sense.
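
You can see the distinction directly (table name hypothetical): the planner's row count tracks live tuples only, while the dead-tuple count is reported separately and is reduced only by a plain VACUUM, never by ANALYZE.

    -- Planner-visible row count: live tuples only.
    SELECT reltuples FROM pg_class WHERE relname = 'accounts';

    -- The bloat shows up here instead; ANALYZE does not shrink it.
    SELECT n_live_tup, n_dead_tup
    FROM   pg_stat_user_tables
    WHERE  relname = 'accounts';

    -- A plain VACUUM (which fits a small maintenance window far better
    -- than VACUUM FULL, since it takes no exclusive lock) is what marks
    -- dead tuples as reusable.
    VACUUM accounts;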

Details on ANALYZE are here: https://wiki.postgresql.org/wiki/Introduction_to_VACUUM,_ANALYZE,_EXPLAIN,_and_COUNT

Jacob Davis-Hansson