
I have a pipe-delimited file that is too large to open in Excel. I'm trying to import this file into MSSQL using the import wizard in SSMS.

Normally when I do this, I open the file in Excel and use an array formula, =MAX(LEN(An:Annnn)), to get the max length of each column. Then I use that to specify the size of each field in my table.

This file is too large to open in Excel, and the SSMS import wizard doesn't check all of the data to give an accurate suggestion (I think it samples a crazy small number of rows, like 200).

Does anyone have a solution to this? (I'm not opposed to doing something in Linux, especially if it's free.)

Thanks in advance for any help.

user1038638
  • Create a temporary table, with the columns set to the max possible size for each column and its datatype. Import into that, and then you can do the actual import from that temporary table. – Ken White Jul 06 '13 at 01:20
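
Once the data is sitting in a wide staging table like the one that comment (and the answer below) describes, the original goal, finding the maximum length of each column, becomes a simple query. A minimal sketch, using hypothetical table and column names (StagingTable, column1..column3) that the later snippets also assume; it could just as well be a temporary table (#StagingTable), as the comment suggests:

-- every field is just a generously sized character column (names are hypothetical)
create table StagingTable (
    column1 varchar(8000),
    column2 varchar(8000),
    column3 varchar(8000)
    -- . . . one wide varchar column per field in the file
);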

1 Answer


When I import text data into a database, I typically first read the data into a staging table where all the columns are long-enough character fields (say varchar(8000)).
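
One way to get the pipe-delimited file into such a staging table is bulk insert; the SSMS import wizard (with every column mapped to a wide varchar) or bcp would work just as well. The table name, file path, and options below are assumptions for illustration, not something from the original post:

-- load the pipe-delimited file into the wide staging table sketched above
bulk insert StagingTable
from 'C:\data\bigfile.txt'   -- hypothetical path
with (
    fieldterminator = '|',
    rowterminator = '\n',
    firstrow = 2             -- skip the header row, if the file has one
);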

Then, I load from the staging table into the final table:

create table RealTable (
    RealTableId int identity(1, 1) primary key,
    Column1 int,
    Column2 datetime,
    Column3 varchar(12),
    . . .
);

insert into RealTable(<all columns but id>)
    select (case when column1 not like '%[^0-9]%' then cast(column1 as int) end),
           (case when isdate(column2) = 1 then cast(column2 as datetime) end),
           . . .
    from StagingTable;  -- the wide-varchar staging table described above

I find it much easier to debug type issues inside the database than while inserting into the database.
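
For instance, with everything in the staging table, the maximum length of each column (what the =MAX(LEN(...)) array formula was computing in Excel) and the rows that would fail the type conversions above can both be found with ordinary queries; the table and column names here are the same hypothetical ones as above:

-- maximum observed length of each column, for sizing the real table
select max(len(column1)) as column1_max_len,
       max(len(column2)) as column2_max_len,
       max(len(column3)) as column3_max_len
from StagingTable;

-- rows that the case expressions above would turn into NULLs
select * from StagingTable where column1 like '%[^0-9]%';   -- not purely numeric
select * from StagingTable where isdate(column2) = 0;       -- not a recognizable date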

Gordon Linoff
  • Sorry for the delay in responding. I just realized I hadn't responded. I ended up adding a tool called FileQuery to my toolbox. It allows me to open really large text files and actually run some basic analysis queries against the text file. Thanks for the ideas! – user1038638 Jan 06 '17 at 23:17