Node calling postgres function with temp tables causing “memory leak”

Front-end · Unresolved · 2 · 1454
别跟我提以往 2021-01-14 17:57

I have a node.js program that calls a Postgres (Amazon RDS micro instance) function, get_jobs, inside a transaction, 18 times a second, using the node-postgres package.
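For context, each call boils down to running the function inside a short transaction. A minimal sketch of what every one of those ~18 calls per second sends to the server (the function name is from the question; the transaction shape is an assumption):

```sql
-- Illustrative only: the per-call statement sequence issued via node-postgres.
BEGIN;
SELECT * FROM get_jobs();
COMMIT;
```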

2 Answers
  •  清歌不尽
    2021-01-14 18:53

    Use CTEs to create partial result sets instead of temp tables.

    CREATE OR REPLACE FUNCTION get_jobs (
    ) RETURNS TABLE (
      ...
    ) AS 
    $BODY$
    DECLARE 
      _nowstamp bigint; 
    BEGIN
    
      -- take the current unix server time in ms
      _nowstamp := (select extract(epoch from now()) * 1000)::bigint;  
    
  RETURN QUERY (
    
        --  1. get the jobs that are due
        WITH jobs AS (
    
      select ...
      from really_big_table_1 
      where job_time < _nowstamp
    
        --  2. get other stuff attached to those jobs
        ), jobs_extra AS (
    
          select ...
          from really_big_table_2 r
            inner join jobs j on r.id = j.some_id
    
        ) 
    
        -- 3. return the final result with a join to a third big table
        select je.id, ...
        from jobs_extra je
          left join really_big_table_3 r on je.id = r.id
        group by je.id
    
      );
    
    END
    $BODY$ LANGUAGE plpgsql VOLATILE;
    

The planner evaluates each CTE block in sequence, which is exactly what I was trying to achieve with temp tables.
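One caveat worth noting (an addition, not part of the original answer): since PostgreSQL 12, CTEs are no longer unconditionally materialized as optimization fences, and the planner may inline them into the outer query. On newer versions, the evaluate-each-block-in-sequence behavior described above can be requested explicitly with the `MATERIALIZED` keyword:

```sql
-- On PostgreSQL 12+, MATERIALIZED forces each CTE to be computed once
-- up front (the pre-12 default), preserving the fence behavior the
-- answer relies on. Table and column names are the question's placeholders.
WITH jobs AS MATERIALIZED (
  select ...
  from really_big_table_1
  where job_time < _nowstamp
), jobs_extra AS MATERIALIZED (
  select ...
  from really_big_table_2 r
    inner join jobs j on r.id = j.some_id
)
select je.id, ...
from jobs_extra je
  left join really_big_table_3 r on je.id = r.id
group by je.id;
```

Conversely, `NOT MATERIALIZED` can be used to allow inlining when the fence is not wanted.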

I know this doesn't directly solve the memory leak issue (I'm pretty sure there's something wrong with Postgres' implementation of temp tables, at least in the way it manifests on the RDS configuration).

However, the query works, it is planned the way I intended, and after 3 days of running the job the memory usage is stable and my server doesn't crash.

    I didn't change the node code at all.
