SQL Server - Running large script files

Asked 2020-12-07 16:31 by 礼貌的吻别

I have a database table on a development server that is now fully populated after I set it running with an import routine for a CSV file containing 1.4 million rows.
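
(For context: the title suggests the follow-up problem is executing a generated .sql script that is too large to open in SSMS. Under that assumption, one common approach is to run the script from the command line with sqlcmd; the server, database, and file names below are placeholders:)

    sqlcmd -S "Computer Name" -d Destination_DataBase_Name -E -i "C:\Temp\LargeScript.sql" -o "C:\Temp\LargeScript_Output.txt"

(-E uses Windows authentication; swap in -U and -P for a SQL Server login.)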

6 Answers
  •  不知归路
    2020-12-07 16:51

    Yes, this can be done. I used the BCP (Bulk Copy Program) approach to avoid the OutOfMemory issue.

    Note: tested on SQL Server 2014.

    With BCP, you first export the data from the source database to a .bcp file (in a local folder) and then import that .bcp file into the destination database.

    The steps are straightforward:

    Note:

    a) Make sure an empty table with the same structure exists in the destination database.

    b) Make sure a Temp folder exists on the C: drive.

    1) Create a batch file named Export_Data.bat with the command below:

    bcp.exe [Source_DataBase_Name].[dbo].[TableName] OUT "C:\Temp\TableName.bcp" -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -n -q
    pause
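
    (A note on the switches, per the standard bcp options rather than anything specific to this answer: -S is the server/instance name, -U and -P are the SQL login and password, -n uses native data format, and -q runs with QUOTED_IDENTIFIERS on. With Windows authentication, -T can replace -U and -P, for example:)

    bcp.exe [Source_DataBase_Name].[dbo].[TableName] OUT "C:\Temp\TableName.bcp" -S "Computer Name" -T -n -q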

    2) Run that batch file; a .bcp file will be generated in the Temp folder.

    3) Then create another batch file named Import_Data.bat with the command below:

    bcp.exe [Destination_DataBase_Name].[dbo].[TableName] IN "C:\Temp\TableName.bcp" -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -n -q
    pause
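
    (Since the table holds about 1.4 million rows, the import can also be committed in batches using the standard -b switch; the batch size of 100000 below is an arbitrary example:)

    bcp.exe [Destination_DataBase_Name].[dbo].[TableName] IN "C:\Temp\TableName.bcp" -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -n -q -b 100000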

    And here we go!!
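
    (As a final sanity check, not part of the original steps, the row count on the destination can be compared against the source from the same prompt, e.g. with sqlcmd:)

    sqlcmd -S "Computer Name" -U "SQL Server UserName" -P "SQL Server Password" -d Destination_DataBase_Name -Q "SELECT COUNT(*) FROM dbo.TableName"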
