All, I have developed an export mechanism that writes selected tables from a given database to .csv files; I can do this using bcp or sqlcmd. For large tables (> 1-2 GB), I want to split each table into several .csv files of a predetermined, user-specified size (say 200 MB).

How can I determine the number of rows that make up the user-specified size? This is so I can use a sqlcmd command like the following
DECLARE @sql VARCHAR(8000);
SELECT @sql = 'sqlcmd -S' + @@SERVERNAME + ' -E -Q ' +
'"SELECT * FROM Ia.dbo.[Episode];" -o "F:\aaData\Test2.csv" -s"," -W';
EXEC master..xp_cmdshell @sql;
with a TOP N clause to produce split .csv files of the correct size. I could count the rows by reading the table in batches, but is there a better way to do this?
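One possible approach, sketched below under assumptions: estimate the average row size from sys.dm_db_partition_stats (used pages × 8 KB divided by row count), derive the rows that fit in the target chunk size, and export each chunk with OFFSET/FETCH (SQL Server 2012+) instead of TOP N. The column EpisodeId used in the ORDER BY is hypothetical; substitute your table's key, since OFFSET/FETCH needs a deterministic ORDER BY to keep chunks stable. Note the estimate reflects in-database storage, so the resulting .csv files will only be approximately 200 MB.

DECLARE @targetBytes BIGINT = 200 * 1024 * 1024;  -- user-specified chunk size
DECLARE @rowsPerFile BIGINT, @totalRows BIGINT;

-- Average bytes per row = used pages * 8192 / row count (heap or clustered index only)
SELECT @totalRows = SUM(p.row_count),
       @rowsPerFile = @targetBytes /
           NULLIF((SUM(p.used_page_count) * 8192) / NULLIF(SUM(p.row_count), 0), 0)
FROM sys.dm_db_partition_stats AS p
WHERE p.object_id = OBJECT_ID('dbo.Episode')
  AND p.index_id IN (0, 1);

DECLARE @offset BIGINT = 0, @part INT = 1, @sql VARCHAR(8000);
WHILE @offset < @totalRows
BEGIN
    -- EpisodeId is an assumed key column; OFFSET/FETCH requires an ORDER BY
    SELECT @sql = 'sqlcmd -S' + @@SERVERNAME + ' -E -Q ' +
        '"SELECT * FROM Ia.dbo.[Episode] ORDER BY EpisodeId ' +
        'OFFSET ' + CAST(@offset AS VARCHAR(20)) + ' ROWS ' +
        'FETCH NEXT ' + CAST(@rowsPerFile AS VARCHAR(20)) + ' ROWS ONLY;" ' +
        '-o "F:\aaData\Episode_' + CAST(@part AS VARCHAR(10)) + '.csv" -s"," -W';
    EXEC master..xp_cmdshell @sql;
    SELECT @offset += @rowsPerFile, @part += 1;
END

This avoids a separate batch-read row count: one pass over the partition stats gives both the total rows and the rows-per-file estimate, and the loop then emits one sqlcmd export per chunk.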