
I am importing a CSV file into an existing database and am getting a few errors:

Executing (Error) Messages:

Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data conversion for column "supervisor" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.". (SQL Server Import and Export Wizard)

Error 0xc020902a: Data Flow Task 1: The "Source - loc800members_csv.Outputs[Flat File Source Output].Columns[supervisor]" failed because truncation occurred, and the truncation row disposition on "Source - loc800members_csv.Outputs[Flat File Source Output].Columns[supervisor]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component. (SQL Server Import and Export Wizard)

Error 0xc0202092: Data Flow Task 1: An error occurred while processing file "C:\Users\administrator.WDS\Desktop\loc800members.csv" on data row 83. (SQL Server Import and Export Wizard)

Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Source - loc800members_csv returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure. (SQL Server Import and Export Wizard)

Here is a sample of the table I am importing.

https://i.stack.imgur.com/WRFbG.jpg

Here are the properties of the supervisor field.

https://i.stack.imgur.com/E8Im5.jpg

Here are the columns in the table I am importing to.

i.imgur.com/mD5KlCC.jpg

eggwhites

2 Answers


From the looks of it, the fields in the database aren't long enough to hold the data you're trying to import. You'll need to either shorten the data or (probably more ideally) increase the size of the fields in the database. The error points to the "supervisor" column, though you might want to double-check your other columns to make sure it doesn't happen elsewhere as well.

In a nutshell, what's happening is that the wizard is attempting to import everything as is and eventually hits a field in your CSV file that is too long to be copied over. Instead of chopping off the extra data (truncating), it throws an error and gives up. I'm guessing the field in the database is a varchar or nvarchar type with a set size. You should be able to simply bump that size up in the database to pull the data in. You might need to modify relevant stored procedures as well (if there are any) so the data isn't truncated there.
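If widening the column does turn out to be the fix, a minimal sketch of the change is below. The table name, column name, and new length are assumptions taken from the screenshots, not confirmed by the post; substitute your own.

    -- Table/column names and the new length are assumptions -- adjust to your schema.
    -- Widen the column so the longest value in the CSV fits without truncation.
    ALTER TABLE dbo.loc800members
        ALTER COLUMN supervisor varchar(50) NULL;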

user2366842
  • Here is the column info I have on this table. The supervisor field is either a Y or N, so it is set to a length of 1: http://i.imgur.com/mD5KlCC.jpg – eggwhites Jan 07 '15 at 19:37
  • Unfortunately I'm at work, and imgur.com is actually blocked. I'll take a look at this once I'm at home. My guess is, though, that you might have some blank spaces after your Y or N in the CSV, which would put this over the limit. – user2366842 Jan 07 '15 at 19:39

You can change the field size in the wizard; the default is 50 characters, which is often too small.

On the "choose a data source" screen, after you have have given the file location and checked off any format things you want to change, then click on advanced. For each field you will see the datatype and output column width. Change it to a larger value. I usually use 500 when looking at a file for the first time until I can see what the actual sizes are. To change all the column sizes at once, highlight the name of the first column and then hold down the shift key and click on the last column. Then change the size.

HLGEM
  • The supervisor column is a Y or N – see my comment to user2366842 for the properties of this table's columns; the supervisor column is set to varchar(1). – eggwhites Jan 07 '15 at 19:53
  • Pull it into a staging table with a large column size, then you can look at the data and see what the problem is (a query for that is sketched after these comments). Another thing that can cause size mismatches is unprintable characters like tabs, or using the wrong row delimiter. – HLGEM Jan 07 '15 at 21:16
  • Thanks for the advice on staging - after I did this I was able to scroll through my 5k records and the mistake stood out like a sore thumb. Thank you kind internet warrior! – eggwhites Jan 07 '15 at 22:54
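For anyone following HLGEM's staging-table suggestion later: once the data sits in a staging table with a wide supervisor column, a query like the one below finds the offending rows directly instead of scrolling through all 5,000 records. The staging table name is an assumption; the Y/N rule comes from the comments above.

    -- Assumed staging table name; supervisor was loaded into a wide varchar column.
    SELECT *
    FROM dbo.loc800members_staging
    WHERE DATALENGTH(supervisor) > 1        -- catches trailing blanks (LEN would ignore them)
       OR supervisor NOT IN ('Y', 'N');     -- catches anything that isn't a plain Y or N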