
All, I have developed an export mechanism that allows selected tables from a given database to be exported to .csv files; I can do this using bcp or sqlcmd. For large tables (> 1-2 GB), I want to split each table into several .csv files of a predetermined, user-specified size (say, 200 MB).

How can I determine the number of rows that make up the user-specified size? This is so I can use a sqlcmd command like the following

DECLARE @sql VARCHAR(8000);

-- Build the sqlcmd command line and shell out to export the table to a single .csv file.
SELECT @sql = 'sqlcmd -S' + @@SERVERNAME + ' -E -Q ' +
              '"SELECT * FROM Ia.dbo.[Episode];" -o "F:\aaData\Test2.csv" -s"," -W';
EXEC master..xp_cmdshell @sql;

with a TOP N clause to produce .csv files of the required split size. I may have to count the rows using a batch read, or is there a better way to do this?
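
For example, I was thinking of estimating N from the table's average row size, something like the sketch below (page counts measure stored size, not the CSV text size, so N would only be a rough estimate):

USE Ia;

-- Rough estimate: average stored bytes per row for the base table
-- (heap or clustered index only).
DECLARE @targetBytes BIGINT = 200 * 1024 * 1024;  -- user-specified file size
DECLARE @avgRowBytes BIGINT;

SELECT @avgRowBytes = SUM(used_page_count) * 8192 / NULLIF(SUM(row_count), 0)
FROM sys.dm_db_partition_stats
WHERE object_id = OBJECT_ID('dbo.Episode')
  AND index_id IN (0, 1);

-- Rows per split file, for use as N in a TOP (N) export loop.
SELECT @targetBytes / NULLIF(@avgRowBytes, 0) AS RowsPerFile;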

MoonKnight

1 Answer


If you want your exported files to be as close to the user's specified size as possible, you're going to need to resort to a more programmatic approach. Consider:

public void Export( SqlConnection connection, string query, string pathPattern, int fileSize )
{
    // Requires: using System.Data.SqlClient; using System.IO;
    using ( SqlCommand command = new SqlCommand( query, connection ) )
    using ( SqlDataReader reader = command.ExecuteReader() )
    {
        int fileIndex = 0;
        int bytesLeft = fileSize;

        // open first output file
        StreamWriter writer = new StreamWriter( string.Format( pathPattern, fileIndex ) );

        try
        {
            while ( reader.Read() )
            {
                // format the current row as one comma-separated line
                object[] values = new object[reader.FieldCount];
                reader.GetValues( values );
                string line = string.Join( ",", values );

                // write the row and deduct its length from the file budget
                // (character count approximates bytes for ANSI output)
                writer.WriteLine( line );
                bytesLeft -= line.Length + Environment.NewLine.Length;

                if ( bytesLeft <= 0 )
                {
                    // close this file, open the next
                    writer.Dispose();
                    writer = new StreamWriter( string.Format( pathPattern, ++fileIndex ) );
                    bytesLeft = fileSize;
                }
            }
        }
        finally
        {
            // close the last file
            writer.Dispose();
        }
    }
}

This answer is intended to emphasize the mechanics of splitting a file that you're exporting. Details such as CSV quoting and escaping are left out, but you can get more information on those by reading about SqlDataReader and writing text files.
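
A hypothetical call might look like this; the connection string, query, and file-name pattern below are placeholders, not values from the question:

using ( SqlConnection connection = new SqlConnection( "Server=.;Database=Ia;Integrated Security=true" ) )
{
    connection.Open();

    // split the export into files of roughly 200 MB each:
    // F:\aaData\Episode_0.csv, F:\aaData\Episode_1.csv, ...
    Export( connection, "SELECT * FROM dbo.Episode",
            @"F:\aaData\Episode_{0}.csv", 200 * 1024 * 1024 );
}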

Bob Kaufman
  • I wrote a very neat class that splits the files as a post processing step. It is based on this idea and is rapid! Thanks for your time... – MoonKnight Nov 02 '12 at 10:58