How Do I Deep Copy a Set of Data, and Change FK References to Point to All the Copies?

抹茶落季 2020-12-17 17:07

Suppose I have Table A and Table B. Table B references Table A. I want to deep copy a set of rows in Table A and Table B. I want all of the new Table B rows to reference the new copies of the Table A rows rather than the originals.

2 Answers
  •  被撕碎了的回忆
    2020-12-17 18:09

    I recently found myself needing to solve a similar problem: I needed to copy a set of rows in a table (Table A) as well as all of the rows in related tables which have foreign keys pointing to Table A's primary key. I was using Postgres, so the exact queries may differ, but the overall approach is the same. The biggest benefit of this approach is that it can be applied recursively to go infinitely deep.

    TL;DR: the approach looks like this:

    1) Find all the related tables/columns of Table A
    2) Copy the necessary data into temporary tables
    3) Create a trigger and function to propagate primary key column
       updates to related foreign key columns in the temporary tables
    4) Update the primary key column in the temporary tables to the next
       value in the auto-increment sequence
    5) Re-insert the data back into the source tables, and drop the
       temporary tables/triggers/function
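
    For concreteness, the code below uses placeholder names: a source table table_a with a serial primary key id, and a referencing table table_b whose foreign key column table_a_id points at table_a.id. Substitute your own table and column names. A minimal schema along these lines would be:

    CREATE TABLE table_a (
        id SERIAL PRIMARY KEY,
        name TEXT
    );

    CREATE TABLE table_b (
        id SERIAL PRIMARY KEY,
        table_a_id INTEGER REFERENCES table_a (id),
        note TEXT
    );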
    

    1) The first step is to query the information schema to find all of the tables and columns which are referencing Table A. In Postgres this might look like the following:

    -- 'table_a' is the source table and 'id' its primary key column;
    -- both are placeholder names
    SELECT tc.table_name, kcu.column_name
    FROM information_schema.table_constraints tc
    JOIN information_schema.key_column_usage kcu
    ON tc.constraint_name = kcu.constraint_name
    JOIN information_schema.constraint_column_usage ccu
    ON ccu.constraint_name = tc.constraint_name
    WHERE tc.constraint_type = 'FOREIGN KEY'
    AND ccu.table_name = 'table_a'
    AND ccu.column_name = 'id';

    2) Next we need to copy the data from Table A and from any other tables which reference Table A - let's say there is one called Table B. To start this process, let's create a temporary table for each of these tables and populate it with the data that we need to copy. This might look like the following:

    -- 'table_a_id' is assumed to be the foreign key column in table_b
    -- that points at table_a.id
    CREATE TEMP TABLE temp_table_a AS (
        SELECT * FROM table_a
        WHERE ...  -- the condition selecting the rows to copy
    );

    CREATE TEMP TABLE temp_table_b AS (
        SELECT * FROM table_b
        WHERE table_a_id IN ( SELECT id FROM temp_table_a )
    );

    3) We can now define a function that will cascade primary key column updates out to the related foreign key columns, and a trigger which will execute whenever the primary key column changes. For example:

    CREATE OR REPLACE FUNCTION cascade_temp_table_a_pk()
    RETURNS trigger AS
    $$
    BEGIN
       -- repoint rows in temp_table_b that referenced the old key value
       UPDATE temp_table_b SET table_a_id = NEW.id
       WHERE table_a_id = OLD.id;

       RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER trigger_temp_table_a
    AFTER UPDATE
    ON temp_table_a
    FOR EACH ROW
    WHEN (OLD.id != NEW.id)
    EXECUTE PROCEDURE cascade_temp_table_a_pk();
    

    4) Now we just update the primary key column in temp_table_a to the next value of the source table's sequence. This will fire the trigger, and the updates will be cascaded out to the foreign key columns in temp_table_b. In Postgres you can do the following:

    -- nextval() is evaluated once per updated row, so every copied row
    -- gets a fresh primary key from table_a's sequence
    UPDATE temp_table_a
    SET id = nextval(pg_get_serial_sequence('table_a', 'id'));

    5) Insert the data from the temporary tables back into the source tables, and then drop the temporary tables, trigger, and function:

    INSERT INTO table_a
    (SELECT * FROM temp_table_a);

    INSERT INTO table_b
    (SELECT * FROM temp_table_b);

    DROP TRIGGER trigger_temp_table_a ON temp_table_a;
    DROP FUNCTION cascade_temp_table_a_pk();
    DROP TABLE temp_table_b;
    DROP TABLE temp_table_a;

    It is possible to take this general approach and turn it into a script which can be called recursively in order to go infinitely deep. I ended up doing just that using Python (our application uses Django, so I was able to use the Django ORM to make some of this easier).
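
    The script itself isn't shown here, but as a sketch of the recursive step, a recursive CTE over the information schema can enumerate every table that references table_a directly or transitively (again, 'table_a' is a placeholder; cycles in the foreign-key graph terminate because UNION discards duplicate rows):

    WITH RECURSIVE referencing AS (
        -- base case: tables whose foreign keys point directly at table_a
        SELECT tc.table_name, kcu.column_name,
               ccu.table_name AS referenced_table
        FROM information_schema.table_constraints tc
        JOIN information_schema.key_column_usage kcu
        ON tc.constraint_name = kcu.constraint_name
        JOIN information_schema.constraint_column_usage ccu
        ON ccu.constraint_name = tc.constraint_name
        WHERE tc.constraint_type = 'FOREIGN KEY'
        AND ccu.table_name = 'table_a'

        UNION

        -- recursive case: tables referencing tables we have already found
        SELECT tc.table_name, kcu.column_name,
               ccu.table_name AS referenced_table
        FROM information_schema.table_constraints tc
        JOIN information_schema.key_column_usage kcu
        ON tc.constraint_name = kcu.constraint_name
        JOIN information_schema.constraint_column_usage ccu
        ON ccu.constraint_name = tc.constraint_name
        JOIN referencing r ON ccu.table_name = r.table_name
        WHERE tc.constraint_type = 'FOREIGN KEY'
    )
    SELECT table_name, column_name, referenced_table FROM referencing;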
