
For my project I need to read UInt16, UInt32, Bytes and Strings from a file. I started with a simple class I wrote myself, like this:

    private byte[] m_byteFile;   // entire file, read into memory up front
    private int m_readPos;       // current read offset into m_byteFile

    public FileReader(string path)  //constructor
    {
        if (!System.IO.File.Exists(path))
            throw new System.IO.FileNotFoundException("FileReader::File not found.", path);

        m_byteFile = System.IO.File.ReadAllBytes(path);
        m_readPos = 0;
    }
    public UInt16 getU16()   // basic function for reading a little-endian UInt16
    {
        if (m_readPos + 1 >= m_byteFile.Length)
            return 0;

        UInt16 ret = (UInt16)((m_byteFile[m_readPos + 0])
                            + (m_byteFile[m_readPos + 1] << 8));
        m_readPos += 2;
        return ret;
    }

I thought it might be better to use the existing BinaryReader instead, so I tried it, but I noticed that it is slower than my approach. Can somebody explain why that is, and whether there is another existing class I could use to load a file and read from it?
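Roughly what I tried with BinaryReader looks like this (just a sketch, not my exact code; path is the same file as above):

    // Sketch only: BinaryReader reads little-endian values straight from the FileStream
    using (var reader = new System.IO.BinaryReader(System.IO.File.OpenRead(path)))
    {
        UInt16 a = reader.ReadUInt16();       // two bytes, little-endian, like getU16
        UInt32 b = reader.ReadUInt32();       // four bytes
        byte[] chunk = reader.ReadBytes(16);  // 16 is an arbitrary example length
    }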

~Adura

  • How were you testing? Importantly, were you including the time taken to read all the data to start with in your code, given that the data is read piecemeal in BinaryReader? Also, do you *need* this code to be fast? – Jon Skeet Jun 18 '13 at 12:44
  • About the speed: I read images from a binary file, and when browsing in the "image viewer" you could feel the difference in how long it took to show all the images. The time it takes to read the file / create a BinaryReader object is negligible, as it is basically instant in both cases. I read up to 4000 bytes whenever a new image is selected, so I need it to be fast. The memory stream solution is ideal I think. – Adura Jun 18 '13 at 16:21
  • The point is that creating the BinaryReader *doesn't* read the file - it reads it as it goes. I very much doubt that File.ReadAllBytes is "basically instant" - IO never is. All you've done by using MemoryStream is front-load that. I would be interested to see how you're measuring the performance, as I suspect you *may* be fooling yourself. – Jon Skeet Jun 18 '13 at 16:27
  • With BinaryReader(new MemoryStream(File.ReadAllBytes(path))) that instruction takes 40ms on average. BinaryReader(File.Open(path, FileMode.Open)) in contrast takes 0.1ms, but how does that benefit me if the actual processing in the end takes longer? – Adura Jun 18 '13 at 17:11
  • My point is that you need to take *all* the time into account - and without seeing anything about your testing methodology, it's hard to know whether you've really helped anything. It's very easy to get this sort of thing wrong - particularly when files are involved, as they can be cached by the OS making a second run always seem faster. – Jon Skeet Jun 18 '13 at 17:18

1 Answer


You have all the data up front in an array in memory, whereas BinaryReader streams the bytes in one at a time from the source, which I guess is a file on disk. I guess you could speed it up by passing it a stream that reads from an in-memory array:

    Stream stream = new MemoryStream(byteArray);
    BinaryReader reader = new BinaryReader(stream);  // now reads from memory, not disk

Note that with this approach you need to load the entire file into memory at once, but I guess that's OK for you.
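Putting it together, something like this (a sketch only; ReadUInt16 and ReadUInt32 read little-endian values, the same byte order as your getU16):

    byte[] byteArray = System.IO.File.ReadAllBytes(path);   // one disk read up front
    using (var reader = new BinaryReader(new MemoryStream(byteArray)))
    {
        UInt16 first = reader.ReadUInt16();
        UInt32 second = reader.ReadUInt32();
        byte[] block = reader.ReadBytes(16);                 // arbitrary example length
    }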

Esailija
  • _This_ approach seems to deliver performance equal to my original attempt, and it already comes with the exception handling, so I guess I will use this. Thanks. – Adura Jun 18 '13 at 16:12