There is a package for zipping on the client side, JSZip, though note you'd also need Downloadify to save the generated file onto the user's machine. Cross-browser support looks patchy, though, and the amount of data you'd be pushing around in client-side JS could cause problems.
Instead of sending a zip file, could you stream a different archive format, such as a TAR or ISO file? Each is essentially just metadata about the files followed by the file data itself, with no compression step required.
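To illustrate how simple streaming TAR is: each entry is a 512-byte header followed by the file data padded to a 512-byte boundary, so it can be generated on the fly. A minimal, untested sketch (the `TarWriter`/`WriteTarEntry` names are made up for the example; it ignores long names, real permissions, and timestamps):

```csharp
using System;
using System.IO;
using System.Text;

static class TarWriter
{
    // Sketch: write one file entry in the classic tar header layout.
    // Assumes the name fits in the 100-byte name field.
    public static void WriteTarEntry(Stream output, string name, byte[] data)
    {
        var header = new byte[512];
        Encoding.ASCII.GetBytes(name).CopyTo(header, 0);            // file name
        Encoding.ASCII.GetBytes("0000644\0").CopyTo(header, 100);   // mode
        Encoding.ASCII.GetBytes("0000000\0").CopyTo(header, 108);   // uid
        Encoding.ASCII.GetBytes("0000000\0").CopyTo(header, 116);   // gid
        Encoding.ASCII.GetBytes(                                    // size, octal
            Convert.ToString(data.Length, 8).PadLeft(11, '0') + "\0").CopyTo(header, 124);
        Encoding.ASCII.GetBytes("00000000000\0").CopyTo(header, 136); // mtime
        header[156] = (byte)'0';                                    // typeflag: regular file

        // Checksum: sum of all header bytes, with the checksum field
        // itself treated as eight spaces.
        for (int i = 148; i < 156; i++) header[i] = (byte)' ';
        int sum = 0;
        foreach (var b in header) sum += b;
        Encoding.ASCII.GetBytes(
            Convert.ToString(sum, 8).PadLeft(6, '0') + "\0 ").CopyTo(header, 148);

        output.Write(header, 0, 512);
        output.Write(data, 0, data.Length);

        // Pad the data out to the next 512-byte boundary.
        int pad = (512 - data.Length % 512) % 512;
        output.Write(new byte[pad], 0, pad);
    }
}
```

A complete archive would end with two 512-byte blocks of zeros after the last entry.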
Alternatively, you could borrow the solution used by the 7digital and Bleep record music stores: zip the files on the server into a temporary directory while immediately presenting a page to the user. A piece of client-side JS on that page polls the server until the whole archive is ready, then starts the download as normal.
Update
I noticed that if you download a directory from the DropBox website, the download starts immediately and the full file size is unknown - which indicates that DropBox starts the download before it has finished creating the archive. A further read of the zip file format and the DEFLATE algorithm suggests that you can start generating compressed data and streaming it to the client before you have the full file data from the service.
The code would look something like the following untested and simplified example (using DotNetZip class names):
// Get a stream to the client
using (var zipStream = new ZipOutputStream(Response.OutputStream)) {
    foreach (var filename in filenames) {
        // Write the local file header for this entry
        ZipEntry entry = new ZipEntry(filename);
        zipStream.PutNextEntry(entry);

        // Stream the file data through in chunks
        byte[] chunk;
        while ((chunk = service.GetChunk(filename)).Length > 0) {
            zipStream.Write(chunk, 0, chunk.Length);
        }
    }

    // Write the zip central directory to complete the file
    zipStream.Finish();
}
If you want the data to compress better (the compressor generally does better when given larger blocks), but you still want to start streaming as soon as possible, and you know that data arrives from the service faster than your application can send it to the client, you could implement some sort of exponential buffer inside the foreach loop.
int chunksPerWrite = 1; // Better if this is defined outside of the foreach loop
byte[] chunk;
var chunks = new List<byte[]>();
while ((chunk = service.GetChunk(filename)).Length > 0) {
    chunks.Add(chunk);
    if (chunks.Count >= chunksPerWrite) {
        // Combine the buffered chunks with some array copying logic not included
        byte[] megaChunk = CombineAllChunks(chunks);
        zipStream.Write(megaChunk, 0, megaChunk.Length);
        chunks.Clear(); // Don't re-send chunks that have already been written
        chunksPerWrite *= 2; // or chunksPerWrite++ for linear growth
    }
}
// Cut for brevity - combine and write any remaining buffered chunks to the zipStream.
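The `CombineAllChunks` helper referenced above (a made-up name for this sketch) could be as simple as a length count followed by one pass of block copies:

```csharp
using System;
using System.Collections.Generic;

static class ChunkHelpers
{
    // Sketch of the hypothetical CombineAllChunks helper: concatenate
    // the buffered chunks into one contiguous array.
    public static byte[] CombineAllChunks(List<byte[]> chunks)
    {
        int total = 0;
        foreach (var c in chunks) total += c.Length;

        var combined = new byte[total];
        int offset = 0;
        foreach (var c in chunks)
        {
            Buffer.BlockCopy(c, 0, combined, offset, c.Length);
            offset += c.Length;
        }
        return combined;
    }
}
```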
My reading of the ZIP specification suggests there would be a limit to how much data can be effectively compressed in a single go, but I can't work out what that limit is (it might depend on the data?). I would be very interested to hear from anyone who knows the spec better...
If you find you need to roll your own for some reason, zip files also have a plain "stored" mode with no compression engine at all, which makes the format much easier to produce if you're not concerned about bandwidth.
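You don't even need a third-party library for stored entries - a sketch using the framework's own `System.IO.Compression` (.NET 4.5+), where `CompressionLevel.NoCompression` makes the writer frame your bytes with zip headers without running DEFLATE (file and entry names here are just for the example):

```csharp
using System.IO;
using System.IO.Compression;
using System.Text;

static class StoredZipExample
{
    // Write a zip containing one uncompressed ("stored") entry.
    public static void CreateStoredZip(string path)
    {
        using (var output = File.Create(path))
        using (var archive = new ZipArchive(output, ZipArchiveMode.Create))
        {
            var entry = archive.CreateEntry("readme.txt", CompressionLevel.NoCompression);
            using (var entryStream = entry.Open())
            {
                var data = Encoding.UTF8.GetBytes("hello");
                entryStream.Write(data, 0, data.Length);
            }
        }
    }
}
```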