You can estimate the size required by a row by summing up the size of each column's type, then multiplying by the number of rows. This should be accurate as long as your query has no variable-length columns such as TEXT / VARCHAR:
int rowSize = 0;
foreach (DataColumn dc in Dataset1.Tables[0].Columns)
{
    // sizeof() only accepts a compile-time type name;
    // Marshal.SizeOf takes a Type object at runtime instead
    rowSize += System.Runtime.InteropServices.Marshal.SizeOf(dc.DataType);
}
int dataSize = rowSize * Dataset1.Tables[0].Rows.Count;
In case you need a more accurate figure, sum up the size of each individual value using Marshal.SizeOf:
int dataSize = 0;
foreach (DataRow dr in Dataset1.Tables[0].Rows)
{
    int rowSize = 0;
    for (int i = 0; i < Dataset1.Tables[0].Columns.Count; i++)
    {
        rowSize += System.Runtime.InteropServices.Marshal.SizeOf(dr[i]);
    }
    dataSize += rowSize;
}
Ideas for performance gain if high accuracy is not a concern:
- Compute the size of just a sample: instead of iterating through all rows, pick 1 in every 100, then multiply the result by 100 at the end.
- Use [Marshal.SizeOf](https://msdn.microsoft.com/en-us/library/y3ybkfb3.aspx) to compute the size of each DataRow `dr` instead of iterating through all its values. It will give you a higher number, since a DataRow object has additional properties, but that's something you can tweak by subtracting the size of an empty DataRow.
- Know the average size of a single row beforehand, from its column types, and just multiply by the number of rows.
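As a rough sketch of the sampling idea above (the `EstimateDataSize` helper and the 1-in-100 `step` are just placeholders to illustrate the approach, not a tested implementation; tune them to your data):

    using System;
    using System.Data;
    using System.Runtime.InteropServices;

    static int EstimateDataSize(DataTable table, int step = 100)
    {
        int sampledSize = 0;
        int sampledRows = 0;

        // Measure only every 'step'-th row...
        for (int r = 0; r < table.Rows.Count; r += step)
        {
            DataRow dr = table.Rows[r];
            for (int i = 0; i < table.Columns.Count; i++)
            {
                object value = dr[i];
                // Marshal.SizeOf throws on strings and DBNull,
                // so approximate those instead
                if (value is string s)
                    sampledSize += s.Length * sizeof(char);
                else if (value != DBNull.Value)
                    sampledSize += Marshal.SizeOf(value);
            }
            sampledRows++;
        }

        if (sampledRows == 0) return 0;

        // ...then scale the sampled total back up to the full row count
        return (int)((long)sampledSize * table.Rows.Count / sampledRows);
    }

The intermediate math is done in `long` so the multiplication doesn't overflow on large tables before the division scales it back down.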