
I have exported a list of accounts from Salesforce using their Data Loader tool. The output file is a CSV. I already have the table I want it imported into created. I was using nvarchar(255) for all fields, but after I kept getting truncation errors I changed them to nvarchar(max).

I am using the SQL Server Import and Export Wizard to import the flat file. I set it up with " as the text qualifier and comma as the delimiter. Everything looks good, but when I run the import I get truncation errors on nearly every field.

I went back and had it suggest types, having it scan the entire file.

I kept getting the same errors.
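One way to take the guesswork out of the column sizes is to measure the longest value in each column before importing. A minimal sketch using Python's standard csv module (the in-memory sample stands in for the real export file):

```python
import csv
from io import StringIO

def max_field_lengths(f):
    """Return {column: length of longest value} for a quoted, comma-separated CSV."""
    reader = csv.DictReader(f, quotechar='"')
    lengths = {name: 0 for name in reader.fieldnames}
    for row in reader:
        for name, value in row.items():
            if value is not None:
                lengths[name] = max(lengths[name], len(value))
    return lengths

# In-memory sample; point open('accounts.csv', newline='') at the real export.
sample = StringIO('Id,BillingStreet\n"001","123 Main St, Suite 400"\n"002","1 Short Rd"\n')
print(max_field_lengths(sample))
```

Any column whose maximum length comes back over 255 is a candidate for a wider type up front, rather than waiting for the wizard's truncation error.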

I went back and changed everything to DT_STR with length 255, and then instead of truncation errors, I get the following:

- Executing (Error)
Messages
Error 0xc02020c5: Data Flow Task 1: Data conversion failed while converting column "BILLINGSTREET" (86) to column "BILLINGSTREET" (636).  The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
 (SQL Server Import and Export Wizard)

Error 0xc0209029: Data Flow Task 1: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "Data Conversion 0 - 0.Outputs[Data Conversion Output].Columns[BILLINGSTREET]" failed because error code 0xC020907F occurred, and the error row disposition on "Data Conversion 0 - 0.Outputs[Data Conversion Output].Columns[BILLINGSTREET]" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.
 (SQL Server Import and Export Wizard)

Error 0xc0047022: Data Flow Task 1: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "Data Conversion 0 - 0" (552) failed with error code 0xC0209029 while processing input "Data Conversion Input" (553). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.
 (SQL Server Import and Export Wizard)
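For context, status value 2 ("potential loss of data") typically means a character in the Unicode (DT_WSTR) source has no representation in the target single-byte code page that DT_STR uses. A rough Python analogy, narrowing Unicode text to a single-byte encoding:

```python
# A DT_WSTR (Unicode) value fails to convert to DT_STR when the target
# code page cannot represent a character. Python shows the same idea
# when encoding a Unicode string to a single-byte encoding:
"Straße".encode("latin-1")   # works: every character exists in latin-1

try:
    "日本".encode("latin-1")  # fails: no latin-1 representation
except UnicodeEncodeError as exc:
    print("potential loss of data:", exc.reason)
```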

I went back AGAIN and changed everything to Stream Text. It's now working, but it's running slow; what took less than a minute before will now take probably 2 hours.

FYI, I tried to import the CSV into Excel, but it either cuts off leading zeros or completely screws up the parsing.
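For what it's worth, a plain CSV reader that keeps every field as text avoids Excel's type guessing entirely, so leading zeros survive. A small Python sketch (the sample data is made up):

```python
import csv
from io import StringIO

# csv leaves every field as a string, so leading zeros are preserved.
sample = StringIO('AccountNumber,Name\n"00042","Acme"\n')
rows = list(csv.DictReader(sample, quotechar='"'))
print(rows[0]['AccountNumber'])  # prints '00042', zeros intact
```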

Thom A
Dizzy49
  • If your column is an `nvarchar`, why are you using `DT_STR` and not `DT_WSTR`? If `DT_STR` is correct, then why are your columns not a `varchar`? – Thom A May 18 '20 at 11:06
  • @Lamu Two reasons... first, when I originally imported it via Excel, the Import Tool created the table with nvarchar. I kept it nvarchar because I had several fields that needed more than 400 characters, and there are a bunch of fields with foreign characters, as we work internationally. It was easier to keep everything nvarchar than to figure out which fields MIGHT have foreign characters. – Dizzy49 May 18 '20 at 22:19
  • @Lamu I had float, double, boolean, etc. as well, but it kept giving me conversion errors. I went with the easiest method that actually WORKED. Not what SHOULD work, but what actually did work. – Dizzy49 May 18 '20 at 22:20
  • What do you mean Excel chose them? You said before that you're working with a CSV file, which means you define the data types, not Excel (or, more specifically, the ACE drivers). CSV files and xlsx files are completely different. On the other hand, the import process assumes all data in a text file are `DT_STR` and would create `varchar` columns. This tells me that you *should* be using `DT_WSTR`, because you do have characters that would otherwise be lost, and that's why you're getting the error. – Thom A May 19 '20 at 08:05

1 Answer


What I ended up doing is importing the .csv as a Flat File, not the .xls file. In the Advanced area I highlighted all of the columns on the right side and selected DT_STR(255). The few fields I had that were longer than 255 characters I changed to DT_TEXT.

This is a workaround; it is not the "proper" way to do it, but the "proper" way just wasn't working due to bad data in the Salesforce export. Once I got the data into the database, I was able to review it much more easily and identify the bad data.
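The review step can also be done before the import: a short script that flags which rows and columns exceed the DT_STR(255) limit, so the bad data is identified up front. A sketch using Python's csv module (the inline sample is illustrative):

```python
import csv
from io import StringIO

def rows_exceeding(f, limit=255):
    """Yield (line_number, column) pairs where a field is longer than `limit`."""
    reader = csv.DictReader(f, quotechar='"')
    for lineno, row in enumerate(reader, start=2):  # header is line 1
        for name, value in row.items():
            if value is not None and len(value) > limit:
                yield lineno, name

# Inline sample: one Description field of 300 characters trips the limit.
sample = StringIO('Id,Description\n"001","' + 'x' * 300 + '"\n"002","ok"\n')
print(list(rows_exceeding(sample)))  # → [(2, 'Description')]
```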

Dizzy49