Query pulling 12-15 GB of data from more than 120 tables

Submitted by 半腔热情 on 2020-01-06 05:40:20

Question


I have a query that pulls data from almost 125 different tables. I have created some 13 nested stored procedures, each calling other stored procedures, to pull all the required data. Surprise, surprise: the query takes ages to execute, and sometimes I have to kill the connection and rerun it.

I have been advised to use a staging table: move the required data there with SSIS packages and pull the data from there. But I am reluctant to use SSIS, as I'm not very comfortable with it, this report is only requested once in a while, and moving around 10-15 GB of data for one report seems like a lot of hassle.

Any suggestions or ideas to make this task simpler, quicker, and less error prone?


Answer 1:


Create a reporting database. On some frequency (hourly, daily, or whatever meets the needs of the report's users), ETL the data from the transactional database into the reporting database.
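As a rough illustration, a stored-procedure-based reload of one reporting table might look like the sketch below. The database, table, and column names (TransactionalDb, SalesSummary, Orders, and so on) are hypothetical placeholders for your own schema, and a full truncate-and-reload is assumed only because the report is occasional.

```sql
-- Minimal sketch: reload one flattened reporting table from the transactional
-- database. All object names here are hypothetical.
CREATE PROCEDURE dbo.usp_Reload_SalesSummary
AS
BEGIN
    SET NOCOUNT ON;

    -- Full reload is fine for an occasional report; switch to an incremental
    -- load (filter on a modified-date column) if volumes grow.
    TRUNCATE TABLE dbo.SalesSummary;

    INSERT INTO dbo.SalesSummary (OrderId, CustomerName, OrderDate, TotalAmount)
    SELECT
        o.OrderId,
        c.Name,
        o.OrderDate,
        SUM(ol.Quantity * ol.UnitPrice)
    FROM TransactionalDb.dbo.Orders      AS o
    JOIN TransactionalDb.dbo.Customers   AS c  ON c.CustomerId = o.CustomerId
    JOIN TransactionalDb.dbo.OrderLines  AS ol ON ol.OrderId   = o.OrderId
    GROUP BY o.OrderId, c.Name, o.OrderDate;
END;
```

One such procedure per reporting table keeps each load small and testable, instead of one giant query over 125 tables at report time.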

You can use SSIS, or you can execute stored procedures for the ETL. Either way, you will probably schedule it with a SQL Agent job.
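If you stay with plain stored procedures, the scheduling piece is a standard SQL Agent job built with the msdb system procedures. A minimal sketch, assuming the reload procedure above lives in a database called Reporting and a nightly 02:00 run is acceptable:

```sql
USE msdb;
GO
-- Create the job and a single T-SQL step that runs the reload procedure.
EXEC dbo.sp_add_job
    @job_name = N'Reload reporting database';

EXEC dbo.sp_add_jobstep
    @job_name      = N'Reload reporting database',
    @step_name     = N'Run usp_Reload_SalesSummary',
    @subsystem     = N'TSQL',
    @database_name = N'Reporting',                      -- hypothetical DB name
    @command       = N'EXEC dbo.usp_Reload_SalesSummary;';

-- Daily schedule at 02:00.
EXEC dbo.sp_add_schedule
    @schedule_name     = N'Nightly 2am',
    @freq_type         = 4,        -- daily
    @freq_interval     = 1,        -- every day
    @active_start_time = 020000;

EXEC dbo.sp_attach_schedule
    @job_name      = N'Reload reporting database',
    @schedule_name = N'Nightly 2am';

EXEC dbo.sp_add_jobserver
    @job_name    = N'Reload reporting database',
    @server_name = N'(LOCAL)';
```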

Finally, when designing your reporting database, consider transforming the data in a way that helps report performance. Many people "flatten" or de-normalize data for reporting. We ETL transactional data into a data warehouse that uses the star schema pattern, and we also have an Analysis Services database and MDX reports. You most likely don't need to go that far for one report, but that is further down the same path of data structures optimized for reporting and BI.
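For a sense of what the flattened end of that path looks like, here is a bare-bones star-schema shape: a couple of dimension tables and one fact table. The names and columns are invented for illustration only, not taken from the question.

```sql
-- Illustrative star schema for the reporting database (hypothetical names).
CREATE TABLE dbo.DimCustomer (
    CustomerKey  INT IDENTITY(1,1) PRIMARY KEY,
    CustomerId   INT           NOT NULL,   -- business key from the source system
    CustomerName NVARCHAR(200) NOT NULL,
    Region       NVARCHAR(100) NULL
);

CREATE TABLE dbo.DimDate (
    DateKey      INT PRIMARY KEY,          -- e.g. 20200106
    CalendarDate DATE NOT NULL,
    [Year]       INT  NOT NULL,
    [Month]      INT  NOT NULL
);

CREATE TABLE dbo.FactSales (
    CustomerKey INT NOT NULL REFERENCES dbo.DimCustomer (CustomerKey),
    DateKey     INT NOT NULL REFERENCES dbo.DimDate (DateKey),
    Quantity    INT           NOT NULL,
    Amount      DECIMAL(18,2) NOT NULL
);
```

The report then joins one fact table to a handful of dimensions instead of 125 transactional tables, which is where most of the performance gain comes from.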



Source: https://stackoverflow.com/questions/19071630/query-pulling-12-15-gb-data-from-more-than-120-tables
