I have a very large DataSet containing almost 700 columns, and I'm using GZipStream for compression and decompression. Compression works fine and the dataset compresses to almost 40 MB, but I get a System.OutOfMemoryException during decompression. I'm using the code below for compression and decompression:
Compression:
public static byte[] CompressDataSet(DataSet dataset)
{
    using (MemoryStream mem = new MemoryStream())
    {
        using (GZipStream zip = new GZipStream(mem, CompressionMode.Compress))
        {
            // Write the schema and data as XML into the compressed stream
            dataset.WriteXml(zip, XmlWriteMode.WriteSchema);
        }
        // The GZipStream must be disposed before reading the buffer,
        // otherwise the final compressed blocks are not flushed
        return mem.ToArray();
    }
}
Decompression:
public static DataSet DecompressDataSet(byte[] data)
{
    using (MemoryStream mem = new MemoryStream(data))
    using (GZipStream zip = new GZipStream(mem, CompressionMode.Decompress))
    {
        DataSet dataset = new DataSet();
        // Rebuild the DataSet (schema and rows) from the compressed XML
        dataset.ReadXml(zip, XmlReadMode.ReadSchema);
        return dataset;
    }
}
Please recommend another compression library if GZipStream is not optimal or suitable for very large DataSets. Thanks in advance.