
I read compressed data one block at a time and add the blocks to a queue. When I then try to decompress that data with GZipStream, an InvalidDataException is thrown saying that the magic number in the GZip header is not correct. What header should my blocks have?

Here's my decompression code:

    private DataBlock Decompress(DataBlock sourceBlock)
    {
        // The decompressed size is not known up front (and is usually larger
        // than the compressed size), so copy the whole stream instead of a
        // single Read into a buffer sized to the compressed data.
        using (var memory = new MemoryStream(sourceBlock.Data))
        using (var gzip = new GZipStream(memory, CompressionMode.Decompress))
        using (var result = new MemoryStream())
        {
            gzip.CopyTo(result);
            return new DataBlock(result.ToArray(), sourceBlock.Id);
        }
    }

Here's my compression code:

    private DataBlock Compress(DataBlock sourceBlock)
    {
        using (MemoryStream memory = new MemoryStream())
        {
            using (GZipStream gzip = new GZipStream(memory, CompressionMode.Compress))
            {
                gzip.Write(sourceBlock.Data, 0, sourceBlock.Data.Length);
            }

            return new DataBlock(memory.ToArray(), sourceBlock.Id);
        }
    }
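
As the comments suggest, GZipStream gives you no way to find block boundaries in concatenated output, so a common fix is to prefix each compressed block with its byte length when writing. A minimal sketch, assuming blocks are written in Id order; `WriteCompressedBlock` is an illustrative helper, not part of the question's code:

```csharp
    // Sketch: prefix each compressed block with its length so the reader
    // can later recover exact block boundaries. Takes the raw compressed
    // bytes (DataBlock.Data in the code above).
    private static void WriteCompressedBlock(Stream target, byte[] compressedData)
    {
        byte[] lengthPrefix = BitConverter.GetBytes(compressedData.Length);
        target.Write(lengthPrefix, 0, lengthPrefix.Length);     // 4-byte length
        target.Write(compressedData, 0, compressedData.Length); // payload
    }
```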

Read blocks from file:

    private void ReadDataFromFile()
    {
        if (File.Exists(this.sourceFilePath))
        {
            try
            {
                using (var fsSource = new FileStream(this.sourceFilePath, FileMode.Open, FileAccess.Read))
                {
                    while (true)
                    {
                        var buffer = new byte[this.bufferLength];

                        int n = fsSource.Read(buffer, 0, buffer.Length);

                        if (n == 0)
                        {
                            break;
                        }

                        // The last read is usually shorter than the buffer;
                        // trim it so the block does not carry trailing zero
                        // bytes into the pipeline.
                        if (n < buffer.Length)
                        {
                            Array.Resize(ref buffer, n);
                        }

                        var block = new DataBlock(buffer, this.currentReadBlockNumber++);

                        lock (this.innerDataBlockQueue)
                        {
                            innerDataBlockQueue.Enqueue(block);
                        }

                        this.newBlockDataLoaded?.Invoke(this, EventArgs.Empty);

                        SpinWait.SpinUntil(() => this.QueueIsFull(this.innerDataBlockQueue) == false);
                    }
                }
            }
            catch (Exception)
            {
                throw; // "throw;" preserves the stack trace; "throw e" would reset it
            }
        }
    }
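
On the read side, reading fixed-size chunks slices the gzip members apart, which is exactly what produces the "magic number" error for every block after the first. Instead, read the length prefix and then exactly that many bytes. A hedged sketch, pairing with the assumed 4-byte length prefix from the writing side:

```csharp
    // Sketch: read one length-prefixed compressed block; returns null at end
    // of file. Assumes each block was written as [4-byte length][payload].
    private static byte[] ReadCompressedBlock(Stream source)
    {
        var lengthPrefix = new byte[4];
        int read = source.Read(lengthPrefix, 0, 4);
        if (read == 0) return null;  // clean end of file
        if (read < 4) throw new InvalidDataException("Truncated length prefix.");

        int blockLength = BitConverter.ToInt32(lengthPrefix, 0);
        var data = new byte[blockLength];
        int offset = 0;
        while (offset < blockLength)  // Stream.Read may return fewer bytes
        {
            int n = source.Read(data, offset, blockLength - offset);
            if (n == 0) throw new InvalidDataException("Unexpected end of stream.");
            offset += n;
        }
        return data;
    }
```

Each byte[] returned this way is a complete gzip member, so it can be handed to the Decompress method above without any header errors.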
  • I don't understand how this works. Decompress doesn't call ReadDataFromFile, and ReadDataFromFile doesn't call Decompress. Seems like you've omitted some relevant code. – Robert Harvey Nov 18 '16 at 17:24
  • You will need to prefix the compressed segments with their length. Also, you need to ensure they are written in order. What you want becomes easier with PLINQ: `WriteSegments(ReadSegments().AsParallel().AsOrdered().Select(bytes => Compress(bytes)));`. Dataflow actually is more awkward than PLINQ for simple pipelines. – usr Nov 18 '16 at 17:33
  • All of this runs in separate threads – Alex Nov 18 '16 at 17:38
  • Your code: `var decompressed = new byte[sourceBlock.Data.Length];` - Are you sure that the decompressed data size is equal to the compressed data size? – Alexander Petrov Nov 18 '16 at 17:58
  • It did not help. I read that GZip requires a header, which ends up only in the first block, while all the other blocks come without it, and so I get the error. I tried handling the header separately, but that did not help either. – Alex Nov 20 '16 at 08:48

0 Answers