I have a CSV file that I import using BULK INSERT:
BULK INSERT [Dashboard].[dbo].[3G_Volume]
FROM 'C:\3G_Volume.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '","',
ROWTERMINATOR = '\n'
)
GO
Usually I use this script with no problems, but on rare occasions I encounter this error:
"The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error."
Usually, this happens when the last rows have blank (null) values.
You need to link your CSV file into an MS Access db to check the data (if your CSV is within Excel's limit of roughly a million rows, you can open it in Excel instead).
Since my data is around 3 million rows, I needed to use an Access db.
Then check the number of the last row that has blanks, and subtract the number of blank rows from the total row count of the CSV.
If you have 2 blank rows at the end and the total number of rows is 30000005, the script becomes:
BULK INSERT [Dashboard].[dbo].[3G_Volume]
FROM 'C:\3G_Volume.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '","',
ROWTERMINATOR = '\n',
LASTROW = 30000003
)
GO
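If you'd rather not link the file into Access just to count the rows, one rough alternative is to let SQL Server count the lines for you. A sketch: it assumes the data contains no tab characters, so the default field terminator ('\t') leaves each whole line in the single column:
CREATE TABLE #line_count (line NVARCHAR(MAX));

BULK INSERT #line_count
FROM 'C:\3G_Volume.csv'
WITH (ROWTERMINATOR = '\n');  -- no FIELDTERMINATOR given, so each line loads as one value

SELECT COUNT(*) AS total_rows FROM #line_count;  -- includes the blank trailing rows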
Cheers... Mhelboy
The rows generating this error either don't have the CHAR(10) terminator or have unnecessary spaces.
If CHAR(10) is the row terminator, I don't think you can put it in quotes like you are trying to in BULK INSERT. There is an undocumented way to indicate it, though:
ROWTERMINATOR = '0x0A'
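Applied to the script from the question, only the one line changes (the hex value is still passed as a string):
BULK INSERT [Dashboard].[dbo].[3G_Volume]
FROM 'C:\3G_Volume.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '","',
ROWTERMINATOR = '0x0A'  -- hex notation for LF, i.e. CHAR(10)
)
GO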
You need to create a table in which all the columns are nullable, remove the spaces in the last row, and add only the columns that are available in the file. And do not create a primary key column: BULK INSERT does not increment an identity column automatically, and that is what creates the error.
I have done a bulk insert like this:
CREATE TABLE [dbo].[Department](
[Deptid] [bigint] IDENTITY(1,1) NOT NULL,
[deptname] [nvarchar](max) NULL,
[test] [nvarchar](max) NULL,
CONSTRAINT [PK_Department] PRIMARY KEY CLUSTERED
(
[Deptid] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
CREATE TABLE [dbo].[Table_Column](
[column1] [nvarchar](max) NULL,
[column2] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
BULK INSERT Table_Column
FROM 'C:\Temp Data\bulkinsert1.csv'
WITH (
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
BATCHSIZE = 300000
);
INSERT INTO [dbo].[Department]
SELECT column1, column2 FROM Table_Column
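If the file ends with blank lines or carries stray spaces (the root cause discussed above), you can clean the staging table between the two steps. A minimal sketch against the Table_Column staging table:
-- remove rows that are completely blank, e.g. empty trailing lines in the file
DELETE FROM Table_Column
WHERE (column1 IS NULL OR LTRIM(RTRIM(column1)) = '')
AND (column2 IS NULL OR LTRIM(RTRIM(column2)) = '');

-- strip leading/trailing spaces from what remains
UPDATE Table_Column
SET column1 = LTRIM(RTRIM(column1)),
    column2 = LTRIM(RTRIM(column2));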
I ran into the same issue. I had written a shell script to create a .csv on Linux, took that .csv to Windows, and tried to bulk load the data. It did not "like" the commas. Don't ask me why, but I changed the delimiter to a * in the bulk import (after doing a find-and-replace of commas with * in my .csv) and that worked. A ~ as delimiter worked, and tab also worked; it just didn't like the comma. Hope this helps someone.
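In other words, replace the commas with the new delimiter in the file first, then point FIELDTERMINATOR at it. A sketch with hypothetical table and file names:
BULK INSERT [dbo].[MyImport]       -- hypothetical staging table
FROM 'C:\Data\myfile_star.csv'     -- the .csv after find-and-replace of , with *
WITH (
FIELDTERMINATOR = '*',
ROWTERMINATOR = '\n'
);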
I got around the problem by converting all fields to strings and then using a common FIELDTERMINATOR. This worked:
BULK INSERT [dbo].[workingBulkInsert]
FROM 'C:\Data\myfile.txt' WITH (
ROWTERMINATOR = '\n',
FIELDTERMINATOR = ','
)
My data file looks like this now:
"01502","1470"
"01504","686"
"02167","882"
"106354","882"
"106355","784"
"106872","784"
The second field had been a decimal type with no double-quote delimiter (like ,1470.00). Formatting both fields as strings eliminated the error.
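Note that with FIELDTERMINATOR = ',' the double quotes load as part of each value, so if you need the numeric column back downstream you can strip and cast after the import. A sketch, assuming the two workingBulkInsert columns are strings with the hypothetical names col1 and col2:
SELECT REPLACE(col1, '"', '') AS code,                        -- strip the loaded quotes
       CAST(REPLACE(col2, '"', '') AS DECIMAL(18, 2)) AS amount  -- back to a decimal
FROM [dbo].[workingBulkInsert];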