SSIS best practice to load N tables from Source to Target Server

梦谈多话 2020-12-03 16:19

I need to load N (around 50) tables from a Source DB to a Target one. Each table is different from the others (so different metadata); I thought I could use a parent pkg to

3 Answers
  •  遥遥无期
    2020-12-03 17:01

    There are many factors that have an impact on which scenario to choose.

    But in general:

    For small tables with relatively few rows, you can put multiple sources/destinations in a single data flow.

    If you have complex ETL for the source/destination, then it is better to put them into separate data flow tasks for clarity.

    If you need to define the sequence of execution, you have to use multiple data flow tasks, as you cannot control the order of execution for multiple sources/destinations in a single data flow task.

    Whenever you need a different transaction isolation level or behavior, you have to put them into separate data flows.

    Whenever you are unsure about the impact of the ETL on the source system, put the loads in separate data flows, as this will make it easier to optimize the execution order in the future.

    If you have large tables, put them into separate data flow tasks, as this will allow you to optimize buffer sizes per table and tune the ETL process for each one as needed.

    So, from the above: if you have relatively small tables and a straight source-to-destination mapping, then there is no problem having multiple sources/destinations in a single data flow.

    In other cases it is better, or even necessary, to put them into separate data flows (a minimal per-table sketch follows this answer), as this allows you to optimize the ETL process from all three points of view:

    Load impact on the Source systems

    Load impact on the destination systems

    Utilization of the machine on which the ETL process is running (CPU consumption, memory consumption and overall throughput).
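
    The advice above is about SSIS package layout rather than code, but the "one isolated load unit per table" idea can be sketched outside SSIS. Below is a minimal Python/pyodbc analogue (not SSIS itself): each table gets its own load function, analogous to a separate data flow task, so ordering, batch size, and failure handling can be tuned per table. The connection strings, table list, and batch size are placeholder assumptions, not values from the question.

    ```python
    # Hypothetical sketch: per-table copy from a source DB to a target DB.
    # Replace SOURCE_CONN, TARGET_CONN and TABLES with your own servers and
    # the ~50 tables from the question; schemas are assumed to match.
    import pyodbc

    SOURCE_CONN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=SourceSrv;DATABASE=SourceDb;Trusted_Connection=yes"
    TARGET_CONN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=TargetSrv;DATABASE=TargetDb;Trusted_Connection=yes"
    TABLES = ["dbo.Customers", "dbo.Orders", "dbo.Products"]  # hypothetical subset

    BATCH_SIZE = 10_000  # per-table batch size, tuned like SSIS buffer sizes

    def copy_table(table: str) -> None:
        """Copy one table in batches; each table is isolated, so a failure or
        a tuning change in one load does not affect the others."""
        src = pyodbc.connect(SOURCE_CONN)
        dst = pyodbc.connect(TARGET_CONN)
        try:
            src_cur = src.cursor()
            dst_cur = dst.cursor()
            dst_cur.fast_executemany = True  # speeds up parameterized inserts

            src_cur.execute(f"SELECT * FROM {table}")
            columns = [c[0] for c in src_cur.description]
            placeholders = ", ".join("?" for _ in columns)
            insert_sql = f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"

            while True:
                rows = src_cur.fetchmany(BATCH_SIZE)
                if not rows:
                    break
                dst_cur.executemany(insert_sql, [tuple(r) for r in rows])
            dst.commit()
        finally:
            src.close()
            dst.close()

    if __name__ == "__main__":
        # Sequential here; because each table is its own unit, the same loads
        # could be reordered or parallelized, which is the point of keeping
        # them separate (as with separate data flow tasks in SSIS).
        for t in TABLES:
            copy_table(t)
    ```

    In SSIS terms, each copy_table call corresponds to one data flow task (or one child package called from the parent), which is what lets you control execution order and impact on source, target and the ETL machine independently per table.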
