
For some reason I keep receiving the following error when trying to bulk insert a CSV file into SQL Express:

Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 2, column 75 (Delta_SM_RR).
Msg 4864, Level 16, State 1, Line 89
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 3, column 75 (Delta_SM_RR).
Msg 4864, Level 16, State 1, Line 89
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 4, column 75 (Delta_SM_RR).
... etc.

I have been attempting to insert this column as both decimal and numeric, and keep receiving this same error (if I take out this column, the same error appears for the subsequent column).

Please see below for an example of the data; all data points within this column contain decimals and are all rounded after the third decimal place:

Delta_SM_RR
168.64
146.17
95.07
79.85
60.52
61.03
-4.11
-59.57
1563.09
354.36
114.78
253.46
451.5

Any sort of help or advice would be greatly appreciated, as it seems that a number of people on SO have come across this issue. Also, if anyone knows of another automated way to load a CSV into SQL Server, that would be a great help as well.

Edits:

Create Table Example_Table
(
  [Col_1] varchar(255),
  [Col_2] numeric(10,5),
  [Col_3] numeric(10,5),
  [Col_4] numeric(10,5),
  [Col_5] date,
  [Delta_SM_RR] numeric(10,5)
)

GO

BULK INSERT Example_Table
FROM 'C:\pathway\file.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2
);

Table Schema - This is a standalone table (further calculations and additional tables are built off this single table; however, at the time of the bulk insert it is the only table).

ls101
  • What does the table schema look like? What is the data type for the "destination" column? – digital.aaron Mar 17 '17 at 16:34
  • Include your table schema, the bulk insert commands you are using, the encoding of your source file, and your format file (if you are using one). Also, which version of SQL Server are you using? – SqlZim Mar 17 '17 at 16:41
  • Thank you both for your answers. I have made edits above to answer both of your queries. Also, I am using SQL Express 2012 – ls101 Mar 17 '17 at 17:28
  • Have you tried using a format file (FMT or XML)? Or try adding ROWTERMINATOR = '0x0a' – Fabiano Carvalho Mar 17 '17 at 17:57
  • Fabiano, yes I have tried using ROWTERMINATOR = '0x0a', however the same problem persists. – ls101 Mar 17 '17 at 18:11
  • If you copy some sample data into SSMS and try an insert, does it work? If so, you may have some hidden characters in the file. If not, then it is likely the data you are uploading. – Andrew O'Brien Mar 17 '17 at 18:32
  • Yes, I tried that as well. If I manually run a task to insert the data, all fields import correctly. Another example is that the bulk insert does not accept my Date field as a date or datetime, even though I checked to ensure that all data within the Date column are in the MM-DD-YYYY format. – ls101 Mar 17 '17 at 21:14
  • I would check for scientific notation in your data as well – S3S Mar 19 '17 at 14:50
  • Scimon - It was scientific notation that was the source of the error. Thank you all for your input. – ls101 Mar 20 '17 at 16:49
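
For reference, scientific notation is exactly the kind of value that causes this: T-SQL will not convert text such as '1.56309E+3' directly to NUMERIC or DECIMAL, though it will via FLOAT. A minimal illustration (the literal is hypothetical, standing in for a value like 1563.09 stored in scientific notation by the source system):

-- Fails with a conversion error: scientific notation can't go straight to numeric.
-- SELECT CONVERT(numeric(10,5), '1.56309E+3');

-- Returns NULL instead of erroring (SQL Server 2012+), useful for spotting bad rows.
SELECT TRY_CONVERT(numeric(10,5), '1.56309E+3');

-- Works: route the text through float first.
SELECT CONVERT(numeric(10,5), CONVERT(float, '1.56309E+3'));  -- 1563.09000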

3 Answers


It's likely that your data has an error in it; that is, there is a character or value that can't be converted to NUMERIC or DECIMAL. One way to check for this and fix it is to:

  1. Change [Delta_SM_RR] numeric(10,5) to [Delta_SM_RR] nvarchar(256)
  2. Run the bulk insert
  3. Find your error rows: select * from Example_Table where [Delta_SM_RR] like '%[^-.0-9]%'
  4. Fix the data at the source, or delete from Example_Table where [Delta_SM_RR] like '%[^-.0-9]%'

The last two statements return/delete rows where there is something other than a digit, period, or hyphen.

For your date column you can follow the same logic: change the column to VARCHAR, then use ISDATE() to find the values that can't be converted.
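
Putting those steps together, a minimal sketch against the table from the question (assuming [Col_5] is the date column being checked, and that reloading the file into the staged columns is acceptable):

-- 1. Stage the suspect columns as text so every row loads.
ALTER TABLE Example_Table ALTER COLUMN [Delta_SM_RR] nvarchar(256);
ALTER TABLE Example_Table ALTER COLUMN [Col_5] varchar(50);

-- 2. Run the bulk insert again (clearing any partially loaded rows first).
TRUNCATE TABLE Example_Table;
BULK INSERT Example_Table
FROM 'C:\pathway\file.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- 3. Rows containing anything other than a digit, period, or hyphen.
SELECT * FROM Example_Table WHERE [Delta_SM_RR] LIKE '%[^-.0-9]%';

-- 4. Same idea for the date column: flag values ISDATE() can't parse.
SELECT * FROM Example_Table WHERE ISDATE([Col_5]) = 0;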

S3S

I'll bet anything there is some weird character in your data set. Open the data set in Notepad++ and view the data; any aberration should become apparent very quickly. The problem is coming from column 75, it's affecting the first several rows, and everything that comes after the first failure also fails to load.

ASH

Make sure that the .csv is not using text qualifiers and that none of your fields in the .csv have a comma inside the desired value.

I am struggling with this issue right now: I have a 68-column report I am trying to import, and column 17 is a "Description" column that has a double-quote text qualifier on top of the comma delimiting.

A bulk insert with a comma field terminator won't recognize the double-quote text qualifier, and it munges all of the data to the right of the offending column.

It looks like, to overcome this, you need to create a .fmt file that instructs BULK INSERT which columns to treat as simply delimited and which to treat as delimited and qualified (see this answer).
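
As a rough sketch of what that looks like (untested, with hypothetical table name, column names, and lengths): for a row shaped like id,"description",value, where only the middle field is quoted, the trick is to make the opening quote part of the first field's terminator and the closing quote part of the quoted field's terminator. The version number on the first line should match your SQL Server version (e.g. 11.0 for 2012):

11.0
3
1   SQLCHAR   0   50    ",\""     1   Id            ""
2   SQLCHAR   0   255   "\","     2   Description   SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   50    "\r\n"    3   Value         ""

Then point the load at it:

BULK INSERT Target_Table
FROM 'C:\pathway\file.csv'
WITH (FORMATFILE = 'C:\pathway\file.fmt', FIRSTROW = 2);

Note that this assumes the qualified field is quoted on every row; if the qualifier appears only on some rows, BULK INSERT alone can't handle it cleanly.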

halfer