
Below is my code.

I set the Content-Encoding header, then write the file stream into a memory stream using GZip compression, and finally return the memory stream.

However, Android, iOS, and the web browser all receive corrupt copies of the stream. None of them is able to fully read the decompressed stream on the other side. Which vital part am I missing?

    public Stream GetFileStream(String path, String basePath)
    {
        FileInfo fi = new FileInfo(basePath + path);

        //WebOperationContext.Current.OutgoingResponse.ContentType = "application/x-gzip";
        WebOperationContext.Current.OutgoingResponse.Headers.Add("Content-Encoding", "gzip");

        MemoryStream ms = new MemoryStream();
        GZipStream CompressStream = new GZipStream(ms, CompressionMode.Compress);

        // Get the stream of the source file.
        FileStream inFile = fi.OpenRead();

        // Prevent compressing hidden and already compressed files.
        if ((File.GetAttributes(fi.FullName) & FileAttributes.Hidden) != FileAttributes.Hidden
            & fi.Extension != ".gz")
        {
            // Copy the source file into the compression stream.
            inFile.CopyTo(CompressStream);

            Log.d(String.Format("Compressed {0} from {1} to {2} bytes.",
                fi.Name, fi.Length.ToString(), ms.Length.ToString()));
        }
        ms.Position = 0;
        inFile.Close();
        return ms;
    }
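For reference, a minimal corrected sketch of the same flow (method and class names here are illustrative, not the original API): `GZipStream` only writes the gzip trailer (CRC-32 and length) when it is closed or disposed, so the compressor must be disposed before the `MemoryStream` is rewound and returned — otherwise clients receive a truncated, corrupt gzip stream.

```csharp
using System.IO;
using System.IO.Compression;

public static class GzipFixSketch
{
    public static MemoryStream CompressFile(string fullPath)
    {
        var ms = new MemoryStream();
        using (FileStream inFile = File.OpenRead(fullPath))
        // leaveOpen: true keeps ms usable after the GZipStream is disposed.
        using (var gz = new GZipStream(ms, CompressionMode.Compress, leaveOpen: true))
        {
            inFile.CopyTo(gz);
        } // disposing gz here flushes the remaining data and the gzip footer

        ms.Position = 0; // rewind before handing the stream to WCF
        return ms;
    }
}
```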
Mcloving

1 Answer


I'd strongly recommend sending a byte array instead. Then, on the client side, create a zip stream from the received byte array.
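A rough sketch of what this answer is suggesting (class and method names are illustrative): the server compresses the whole file into a `byte[]`, and the client wraps the received bytes in a `MemoryStream` to inflate them.

```csharp
using System.IO;
using System.IO.Compression;

public static class ByteArrayTransferSketch
{
    // Server side: compress the whole file into a byte array.
    public static byte[] Compress(string fullPath)
    {
        using (var ms = new MemoryStream())
        {
            using (var file = File.OpenRead(fullPath))
            using (var gz = new GZipStream(ms, CompressionMode.Compress, leaveOpen: true))
            {
                file.CopyTo(gz);
            } // disposing gz writes the gzip footer into ms
            return ms.ToArray();
        }
    }

    // Client side: wrap the received bytes and inflate them.
    public static byte[] Decompress(byte[] compressed)
    {
        using (var input = new MemoryStream(compressed))
        using (var gz = new GZipStream(input, CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            gz.CopyTo(output);
            return output.ToArray();
        }
    }
}
```

Note that this approach buffers the entire file in memory on both ends, which is the drawback raised in the comments below.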

toATwork
  • Sending a 3 GB byte array is not ideal; loading that much into memory is going to cause issues. Hence a stream is needed. – Mcloving Apr 22 '14 at 11:57
  • @Mcloving: Have you set the TransferMode? Have you increased the quotas [http://msdn.microsoft.com/en-us/library/ms731078.aspx]? Do you get any error messages? If not, enable tracing on WCF [http://msdn.microsoft.com/en-us/library/ms733025%28v=vs.110%29.aspx]. – toATwork Apr 22 '14 at 13:44
  • It's not about the quotas, it's about the physical memory available on the server. You can set the quotas as high as you want, but you are still bound by the physical capability. Files may range up to 16 GB, not to mention the other sites/databases on the server. – Mcloving Apr 22 '14 at 13:51
  • @Mcloving: I am talking about WCF quotas. If you are using default values, you will have problems transferring files larger than 65 kB. – toATwork Apr 22 '14 at 14:00
  • My quotas are set to a very high amount! They are not the issue. Loading the file into more memory than is available is my issue :( – Mcloving Apr 22 '14 at 15:51
  • @Mcloving: Have you tried skipping the GZipStream and operating directly on the FileStream? It seems you are zipping it all in memory – and most likely also unzipping it all in memory on the client side. That might be the issue. – toATwork Apr 22 '14 at 16:50
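One sketch of the direction this last comment points in, assuming the real constraint is avoiding multi-gigabyte buffers in memory (names are illustrative): compress to a temporary file on disk first, then return a `FileStream` opened with `FileOptions.DeleteOnClose`, so the temp file is cleaned up when WCF disposes the response stream.

```csharp
using System.IO;
using System.IO.Compression;

public static class StreamedGzipSketch
{
    public static Stream GetCompressedStream(string fullPath)
    {
        string tmp = Path.GetTempFileName();
        using (var outFile = File.Create(tmp))
        using (var gz = new GZipStream(outFile, CompressionMode.Compress))
        using (var inFile = File.OpenRead(fullPath))
        {
            inFile.CopyTo(gz);
        } // gzip footer is flushed to disk when gz is disposed

        // DeleteOnClose removes the temp file once the response stream is closed.
        return new FileStream(tmp, FileMode.Open, FileAccess.Read,
                              FileShare.Read, 4096, FileOptions.DeleteOnClose);
    }
}
```

Memory use then stays bounded regardless of file size, at the cost of temporary disk space roughly equal to the compressed file.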