
I'm going to be doing random-access reading from a read-only binary file. The interface to ifstream seems simpler than filebuf; but is there any use-case where filebuf would give better performance?

More details: I have a file of fixed-length (48-byte) records, and will be doing random-access reads in sequence -- read 1 record, process, read 1 record (from elsewhere), process, .... (Traversing a tree.) The file never changes. Since the records are fixed-length, I may later use a "character-type" that is the 48-byte record, but I don't imagine that has any performance effect.

  • 1
    Order of a million in the file -- traversing the tree will require that order of 50 get read. (Nearest neighbor search through a kb tree.) And you might then think I should read the whole file into RAM, since the file is only ~50MB -- but there will actually be about 80 such files, so the memory usage of reading all of the files would be very large. – Clayton Davis May 10 '12 at 14:54
  • Oops -- a kd tree, not kb tree. Typo. – Clayton Davis May 11 '12 at 02:50

1 Answer


If you are on Linux, using mmap might get around the whole problem of reading the file bit by bit.

Or Boost's memory-mapped files: http://www.boost.org/doc/libs/1_52_0/libs/iostreams/doc/classes/mapped_file.html

nishantjr