1 megabyte isn't particularly big.
A binary format will be more compact and faster, especially if you write your own rather than using the .NET serialisation support, which adds a lot of overhead to the data.
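For example, a hand-rolled format for a list of vertices only needs a count followed by the raw values. A minimal sketch (the Vertex struct and the file layout here are just assumptions for illustration):

```csharp
// Minimal sketch of a hand-rolled binary format for a list of 2D vertices.
// The Vertex struct and the layout (count, then X/Y pairs) are assumptions.
using System.Collections.Generic;
using System.IO;

struct Vertex { public double X, Y; }

static class VertexFile
{
    public static void Save(string path, IList<Vertex> vertices)
    {
        using (var writer = new BinaryWriter(File.Create(path)))
        {
            writer.Write(vertices.Count);   // count first, so the reader can pre-size the list
            foreach (var v in vertices)
            {
                writer.Write(v.X);
                writer.Write(v.Y);
            }
        }
    }

    public static List<Vertex> Load(string path)
    {
        using (var reader = new BinaryReader(File.OpenRead(path)))
        {
            int count = reader.ReadInt32();
            var vertices = new List<Vertex>(count);
            for (int i = 0; i < count; i++)
                vertices.Add(new Vertex { X = reader.ReadDouble(), Y = reader.ReadDouble() });
            return vertices;
        }
    }
}
```

On load there is no string-to-number parsing at all, which is typically a large part of the cost of reading the same data from XML.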
If you want to stick with XML, you can usually improve performance significantly by using a brief, compact format (there's a sketch after this list):
- use short names for elements and attributes: e.g. v rather than vertexentry.
- use self-closing elements with the data in attributes, rather than CDATA or child elements, to hold single values. This usually works out more compact.
- if you have a list of simple values, consider using a single string value containing a comma-separated list, rather than lots of individual elements/attributes. e.g. use p="12,22" rather than x="12" y="22". This is less data to read, fewer items to parse, and halves the number of method calls needed to read values from the XML element/reader.
- only store useful precision. A double converted to a string uses a lot of digits. If you only need 3 decimal places of accuracy, only store 3 d.p.
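Putting those points together, the compact form could be written and read along these lines (the vs/v/p names, the 3-decimal precision and the Vertex struct are illustrative assumptions, not a prescribed schema):

```csharp
// Sketch of writing/reading the compact form described above.
// Element/attribute names and the Vertex struct are illustrative assumptions.
using System.Collections.Generic;
using System.Globalization;
using System.Xml;

struct Vertex { public double X, Y; }

static class CompactXml
{
    public static void Save(string path, IEnumerable<Vertex> vertices)
    {
        using (var writer = XmlWriter.Create(path))
        {
            writer.WriteStartElement("vs");            // short root name
            foreach (var v in vertices)
            {
                writer.WriteStartElement("v");         // becomes <v p="12.345,67.891" />
                string p = v.X.ToString("F3", CultureInfo.InvariantCulture) + "," +
                           v.Y.ToString("F3", CultureInfo.InvariantCulture);   // 3 d.p. only
                writer.WriteAttributeString("p", p);
                writer.WriteEndElement();              // written as a self-closing element
            }
            writer.WriteEndElement();
        }
    }

    public static List<Vertex> Load(string path)
    {
        var vertices = new List<Vertex>();
        using (var reader = XmlReader.Create(path))
        {
            while (reader.ReadToFollowing("v"))
            {
                // one attribute read and one Split per vertex, instead of two attribute reads
                string[] parts = reader.GetAttribute("p").Split(',');
                vertices.Add(new Vertex
                {
                    X = double.Parse(parts[0], CultureInfo.InvariantCulture),
                    Y = double.Parse(parts[1], CultureInfo.InvariantCulture)
                });
            }
        }
        return vertices;
    }
}
```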
Profile and optimise your loading code - you may find bottlenecks that have nothing to do with the XML itself. You may be able to defer some work, or do some of the data conversion on another thread, but beware of introducing a lot of complexity for small gains.
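A crude way to profile without any tooling is to time each stage with Stopwatch. The split into stages below (raw file read, XML parse, building your runtime structures) is just an assumed example of how your load might break down:

```csharp
// Quick-and-dirty stage timing with Stopwatch.
// BuildScene stands in for whatever conversion code you already have (an assumption).
using System;
using System.Diagnostics;
using System.IO;
using System.Xml;

static class LoadTiming
{
    public static void TimedLoad(string path)
    {
        var sw = Stopwatch.StartNew();

        string xml = File.ReadAllText(path);
        Console.WriteLine("read:  " + sw.ElapsedMilliseconds + " ms");

        sw.Restart();
        var doc = new XmlDocument();
        doc.LoadXml(xml);
        Console.WriteLine("parse: " + sw.ElapsedMilliseconds + " ms");

        sw.Restart();
        BuildScene(doc);
        Console.WriteLine("build: " + sw.ElapsedMilliseconds + " ms");
    }

    static void BuildScene(XmlDocument doc) { /* your conversion code here */ }
}
```

If most of the time turns out to be in the build stage rather than the parse, changing the file format won't help much.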
Finally, try different approaches - XmlDocument rather than XmlReader, a different XML library, or pre-loading the data into a MemoryStream. You may find improvements can be made there too.
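As one example of the pre-loading idea, you can read the whole file in a single pass and let the parser work from a MemoryStream, which separates the I/O cost from the parse cost (a sketch, and not guaranteed to be a win in every case):

```csharp
// Sketch: read the file in one go, then parse entirely from memory so the
// parser never waits on disk I/O mid-parse.
using System.IO;
using System.Xml;

static class PreloadedXml
{
    public static XmlDocument Load(string path)
    {
        byte[] bytes = File.ReadAllBytes(path);    // single sequential read
        using (var ms = new MemoryStream(bytes))
        using (var reader = XmlReader.Create(ms))
        {
            var doc = new XmlDocument();
            doc.Load(reader);                      // parse from memory
            return doc;
        }
    }
}
```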
Or just tell your boss it's because you don't have an eight-core Xeon with a terabyte of fast SSDs... :-)