
I have a problem with an import from a TXT file into a table in SQL Server 2016: the import fails to load all the rows. I'm trying to read the data using an OPENROWSET statement, with an XML format file that defines the correct format of the columns in the file.

So I use this script to read the data:

SELECT *
FROM OPENROWSET (BULK 'C:\SQL_DATA\VAR_REG_20190831.TXT',
    FORMATFILE = 'C:\SQL_DATA\MR_VAR_REG.xml') AS t1;

It starts, but it finishes successfully at 18,954 rows, even though the file contains 30,000+ rows. The problem isn't in the data: to be sure, I copied a record that was imported into position 18,954, as well as the previous and next positions, and the problem persists.

It's like there is a limit on the number of rows to import. Can someone explain why SQL Server does this?

Thom A

1 Answer


You can try adding two additional parameters that will help troubleshoot the issue:

  • ERRORFILE
  • MAXERRORS

-- UPDATE --

You can try to make two calls: first SQL #1, and after that SQL #2. The goal is to bypass a supposedly bad line #18955 in the file.

SQL #1

SELECT *
FROM OPENROWSET (BULK 'C:\SQL_DATA\VAR_REG_20190831.TXT'
    , FORMATFILE = 'C:\SQL_DATA\MR_VAR_REG.xml'  
    , ERRORFILE = 'C:\SQL_DATA\MR_VAR_REG.err'
    , MAXERRORS = 100
    ) AS t1;

SQL #2

Here is a trick to bypass the supposedly 'bad' line in the file.

SELECT *
FROM OPENROWSET (BULK 'C:\SQL_DATA\VAR_REG_20190831.TXT'
    , FORMATFILE = 'C:\SQL_DATA\MR_VAR_REG.xml'  
    , ERRORFILE = 'C:\SQL_DATA\MR_VAR_REG.err'
    , MAXERRORS = 100
    , FIRSTROW = 18956 -- real data starts on that row
    ) AS t1;
Yitzhak Khabinsky
  • Thanks for your answer. I tried your script, but SQL doesn't create the ERRORFILE, because the OPENROWSET select doesn't generate an error; it returns "Query finished successfully". – Antonio Calo Sep 03 '21 at 07:52
  • @AntonioCalo, I updated the answer. Please give it a shot. – Yitzhak Khabinsky Sep 03 '21 at 12:34
  • I tested it, only bad news... it returns "0 rows affected"; it's like the file ends for SQL at line 18955. So, as a test, I created a hand-made file to import with 90k identical rows, all equal to the first, to exclude data errors, but SQL imports only 46,464 rows... – Antonio Calo Sep 03 '21 at 13:00
  • It seems that line #18955 has some 'bad' chars. *.csv/*.txt files always have a host of problems where column delimiters, column separators, invisible characters like null terminators \0, special characters based on encoding, and line breaks are in the middle of the actual data. The most reliable format for data feeds is XML enforced by XSD. An XSD (as a data contract between sender and receiver) will guarantee data quality. – Yitzhak Khabinsky Sep 03 '21 at 13:18
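
A quick way to check for the invisible characters mentioned above is to scan the file outside SQL Server. The sketch below is not part of the original answer; the function name and the line range are illustrative examples, and the path should be replaced with the actual file:

```python
# Sketch: scan a text file for control bytes (other than tab, CR, LF)
# that can silently break a bulk import, e.g. null terminators \0.
def find_bad_bytes(path, start_line=1, end_line=None):
    """Return (line_number, [offending byte values]) for suspect lines."""
    bad = []
    with open(path, "rb") as f:  # binary mode: no decoding surprises
        for lineno, raw in enumerate(f, start=1):
            if lineno < start_line:
                continue
            if end_line is not None and lineno > end_line:
                break
            # Control bytes below 0x20, excluding \t (0x09), \n (0x0A), \r (0x0D)
            suspects = {b for b in raw if b < 0x20 and b not in (0x09, 0x0A, 0x0D)}
            if suspects:
                bad.append((lineno, sorted(suspects)))
    return bad
```

For this question one would inspect the area around the failing row, e.g. `find_bad_bytes(r'C:\SQL_DATA\VAR_REG_20190831.TXT', 18950, 18960)`, and look for a non-empty result.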