Keep PostgreSQL from sometimes choosing a bad query plan

Asked by 梦毁少年i on 2020-11-22 11:51

I have a strange performance problem with a query on PostgreSQL 8.4.9. The query selects a set of points within a 3D volume, using a LEFT OUT…

5 Answers
  •  抹茶落季
    2020-11-22 12:22

    If the query planner makes bad decisions it's mostly one of two things:

    1. The statistics are inaccurate.

    Do you run ANALYZE often enough? It is also popular in its combined form VACUUM ANALYZE. If autovacuum is on (which is the default in modern-day Postgres), ANALYZE is run automatically. But consider:

    • Are regular VACUUM ANALYZE still recommended under 9.1?

    (Top two answers still apply for Postgres 12.)
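    Before raising statistics targets, it can help to confirm that ANALYZE is actually running. A minimal sketch, querying the standard pg_stat_user_tables view (table names in the output are whatever exists in your database):

    ```sql
    -- Check when each user table was last vacuumed and analyzed,
    -- either manually or by the autovacuum daemon.
    SELECT relname, last_vacuum, last_autovacuum,
           last_analyze, last_autoanalyze
    FROM   pg_stat_user_tables
    ORDER  BY last_autoanalyze NULLS FIRST;
    ```

    Tables with NULL in all four columns have never been vacuumed or analyzed since statistics collection began, which is a strong hint the planner is working with stale or missing statistics.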

    If your table is big and data distribution is irregular, raising the default_statistics_target may help. Or rather, just set the statistics target for relevant columns (those in WHERE or JOIN clauses of your queries, basically):

    ALTER TABLE ... ALTER COLUMN ... SET STATISTICS 400;  -- calibrate number
    

    The target can be set in the range 0 to 10000 (the default is 100).

    Run ANALYZE again after that (on relevant tables).
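    To verify the effect, you can inspect what the planner now knows about the column via the pg_stats view. A sketch, where `points` and `z` are hypothetical table and column names standing in for your own:

    ```sql
    -- Refresh statistics for the table after changing the target,
    -- then look at the per-column statistics the planner uses.
    ANALYZE points;

    SELECT attname, n_distinct, null_frac, correlation
    FROM   pg_stats
    WHERE  tablename = 'points' AND attname = 'z';
    ```

    A larger statistics target mainly increases the size of the most-common-values list and histogram behind these numbers, giving the planner a finer-grained picture of irregular data distributions.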

    2. The cost settings for planner estimates are off.

    Read the chapter Planner Cost Constants in the manual.

    See the sections on default_statistics_target and random_page_cost on this generally helpful PostgreSQL Wiki page.
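    Cost constants can be changed per session, so you can test their effect on a single query before touching the server configuration. A sketch (the values shown are common choices for mostly-cached data, not universal recommendations, and the final query is a placeholder for your own):

    ```sql
    -- Try lower random-I/O cost for this session only; the default
    -- random_page_cost is 4, which assumes slow rotating disks.
    SET random_page_cost = 1.1;

    -- Tell the planner roughly how much memory is available for caching.
    SET effective_cache_size = '4GB';

    -- Re-check the plan for the problem query with the new settings.
    EXPLAIN ANALYZE
    SELECT 1;  -- substitute your slow query here
    ```

    If the plan improves consistently, the setting can then be made permanent in postgresql.conf or with ALTER SYSTEM on newer versions.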

    There are many other possible reasons, but these are the most common ones by far.
