SSIS - Export multiple SQL Server tables to multiple text files

Submitted by 自作多情 on 2019-12-11 23:03:58

Question


I have to move data between two SQL Server DBs. My task is to export the data as text (.dat) files, move the files and import into the destination. I have to migrate over 200 tables.

This is what I tried

1) Used an Execute SQL Task to fetch my table names.

2) Used a Foreach Loop to iterate over the table names in the collection.

3) Used a Script Task inside the loop to build the text file destination path.

4) Called a DFT with the table name in a variable for the OLE DB source and the path in a variable for the flat file destination.

The first table extracts fine, but the second table bombs with a synchronization error. I have seen this in numerous posts but could not find one that matches my scenario, hence posting here.

Even if I get the package to work with multiple DFTs, the second table from the second DFT does not export its columns because the flat file connection manager still remembers the first table's columns. Is there a way to get it to forget them?

Any thoughts on how I can export multiple tables to multiple text files using one DFT with dynamic source and destination variables?

Thanks and appreciate your help.


Answer 1:


Unfortunately, the Bulk Insert Task only lets us use format files to map the columns between source and destination. The Bulk Insert Task uses the BULK INSERT T-SQL command to import the data, and to execute it the user needs the BULKADMIN server privilege.

Most companies will not grant the BULKADMIN server privilege for security reasons. Hence, using a Script Task to construct BCP statements is a good and simple option for the export. You do not need to build a .bat file, as the script itself can execute DOS commands, which run under the .NET security account.
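The command-building logic described here can be sketched as follows. A real SSIS Script Task would be written in C# or VB.NET; Python is used here purely to illustrate the idea, and the server, table, and path names are placeholders, not values from the post.

```python
import subprocess  # only needed if you actually run the command

# Hypothetical helper: build a bcp export command as an argument list.
# All names (table, paths, server) are illustrative placeholders.
def build_bcp_export(table, dat_path, server, fmt_path):
    return [
        "bcp", table, "out", dat_path,
        "-S", server,      # target server name
        "-T",              # trusted (Windows) authentication
        "-t|",             # pipe as the field terminator
        "-r\\n",           # newline as the row terminator
        "-f", fmt_path,    # format file mapping the columns
    ]

cmd = build_bcp_export("DBNAME.DBO.TABLENAME1", r"C:\export\TABLENAME1.dat",
                       "SERVERNAME", r"C:\export\TABLENAME1.fmt")
print(" ".join(cmd))
# To run it directly from the script (no .bat file needed), something like:
# subprocess.run(cmd, check=True)
```

Passing the command as an argument list (rather than one shell string) avoids quoting problems when paths contain spaces.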




Answer 2:


I figured out a way to do this and thought I would share it in case anybody is stuck in the same situation.

So, in summary, I needed to export and import data via files. I also wanted to use a format file if at all possible for various reasons.

What I did was

1) Construct a DFT which gets me the list of table names I need to export from the DB. I used an OLE DB source and a Recordset Destination as the target, and stored the table names inside an object variable.

A DFT is not really necessary. You can do it any other way. Also, in our application, we store the table names in a table.

2) Add a Foreach Loop Container with a Foreach ADO Enumerator, which takes my object variable from the previous step as the collection.

3) Parse the variable one row at a time and construct BCP statements like the ones below inside a Script Task. Create variables as necessary. Each BCP statement is stored in a variable.

I loop through the tables and construct multiple BCP statements like this.

BCP "DBNAME.DBO.TABLENAME1" out "PATH\FILENAME1.dat" -S SERVERNAME -T -t"|" -r$\n -f "PATH\FILENAME1.fmt"

BCP "DBNAME.DBO.TABLENAME2" out "PATH\FILENAME2.dat" -S SERVERNAME -T -t"|" -r$\n -f "PATH\FILENAME2.fmt"

The statements are put inside a .bat file. This is also done inside the script task.
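The loop-and-write step above can be sketched like this. Again, a real SSIS Script Task would be C# or VB.NET; this Python version only demonstrates the string-building and file-writing logic, with placeholder table, server, and path names.

```python
import os
import tempfile

# Sketch: one BCP "out" statement per table, collected into a .bat file.
# Table names, server, and paths are placeholders (assumptions).
tables = ["TABLENAME1", "TABLENAME2"]
server = "SERVERNAME"

lines = []
for t in tables:
    lines.append(
        f'BCP "DBNAME.DBO.{t}" out "PATH\\{t}.dat" '
        f'-S {server} -T -t"|" -r\\n -f "PATH\\{t}.fmt"'
    )

# Write the batch file that the Execute Process Task will later run.
bat_path = os.path.join(tempfile.mkdtemp(), "export_tables.bat")
with open(bat_path, "w") as f:
    f.write("\n".join(lines))
```

In the SSIS version, the .bat path would itself come from a package variable so the configuration file can redirect it.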

4) An Execute Process Task then executes the .bat file. I had to do this because I do not have the option to use the 'master..xp_cmdshell' command or the 'BULK INSERT' command at my company. If I could execute cmdshell, I could have run the commands directly from the package.

5) Again, add a Foreach Loop Container with a Foreach ADO Enumerator, which takes my object variable as the collection.

6) Parse the variable one row at a time and construct BCP import statements like the ones below inside a Script Task. Create variables as necessary. Each BCP statement is stored in a variable.

I loop through the tables and construct multiple BCP statements like this.

BCP "DBNAME.DBO.TABLENAME1" in "PATH\FILENAME1.dat" -S SERVERNAME -T -t"|" -r$\n -b10000 -f "PATH\FILENAME1.fmt"

BCP "DBNAME.DBO.TABLENAME2" in "PATH\FILENAME2.dat" -S SERVERNAME -T -t"|" -r$\n -b10000 -f "PATH\FILENAME2.fmt"

The statements are put inside a .bat file. This is also done inside the script task.

The -b10000 flag lets me import in batches of 10,000 rows. Without it, many of my large tables could not be copied due to insufficient space in tempdb.
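The import-side command differs from the export only in the direction keyword and the batch flag; a minimal sketch of building it (Python instead of the C#/VB.NET a Script Task would use, with placeholder names):

```python
# Hypothetical helper: build a BCP "in" statement with batched commits.
# -b sets the batch size, so each 10,000 rows is committed separately,
# keeping tempdb and log usage bounded on large tables.
def build_bcp_import(table, dat_path, server, fmt_path, batch_rows=10000):
    return (
        f'BCP "{table}" in "{dat_path}" '
        f'-S {server} -T -t"|" -r\\n -b{batch_rows} -f "{fmt_path}"'
    )

cmd = build_bcp_import("DBNAME.DBO.TABLENAME1", "PATH\\TABLENAME1.dat",
                       "SERVERNAME", "PATH\\TABLENAME1.fmt")
print(cmd)
```

If a batch fails, only that batch is rolled back, so a re-run resumes with far less wasted work than one giant transaction.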

7) Run the .bat file to import the files.

I am not sure if this is the best solution, but I thought I would share what satisfied my requirement. If my answer is not clear, I would be happy to explain if you have any questions. This solution can also be optimized. The same could be done purely via VB scripts, but you would have to write some code to do that.

I also created a package configuration file where I can change the DB name, server name, the data and format file locations dynamically.

Thanks.



Source: https://stackoverflow.com/questions/26302633/ssis-export-multiple-sql-server-tables-to-multiple-text-files
