So I have a game that needs to read in a 1,600 KB text file and then a 2,400 KB text file. Shrinking either file is not an option, but I can recreate the files as binary files or Excel files, since they just contain data that my game needs.
My question is whether changing the file type would make the data load faster, because at the moment it is taking quite a while to read these text files line by line. Based on other examples on this site, I believe I have made the text reading as fast as possible, but I was hoping y’all might have some other suggestions for formats to try.
Binary files are much faster to load than text files you have to deserialize, in most cases for the simple reason that they are usually smaller: you store exactly what you need for each piece of data, with no parsing on the way back in.
Check out BinaryReader and BinaryWriter.
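To make that concrete, here is a minimal round-trip sketch. The file name "units.dat" and the four fields are made-up stand-ins, not anything from your actual game data; the point is that each value is written as its raw bytes and read back with no string parsing.

```csharp
using System;
using System.IO;

class BinaryDataExample
{
    // Write a few typed values to a binary file, then read them back
    // in the same order. Field layout here is hypothetical.
    public static void Main()
    {
        using (var writer = new BinaryWriter(File.Open("units.dat", FileMode.Create)))
        {
            writer.Write(42);        // int: 4 bytes, no parsing needed on load
            writer.Write(3.5f);      // float: 4 bytes
            writer.Write(true);      // bool: 1 byte
            writer.Write("goblin");  // length-prefixed string
        }

        using (var reader = new BinaryReader(File.Open("units.dat", FileMode.Open)))
        {
            int id = reader.ReadInt32();
            float speed = reader.ReadSingle();
            bool hostile = reader.ReadBoolean();
            string name = reader.ReadString();
            Console.WriteLine($"{id} {hostile} {name}");
        }
    }
}
```

The one rule to remember is that the reads must happen in exactly the same order and with exactly the same types as the writes.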
Creating a format where you write just bits instead of whole bytes will make your data smaller still, and therefore faster to load. As an example, one generic way of thinking about that is reading and writing 7-bit encoded ints, so that small numbers take up less space than large ones.
private BinaryWriter m_writer;
private BinaryReader m_reader;

protected void write7BitEncodedInt(int value)
{
    // Write out an int 7 bits at a time. The high bit of the byte,
    // when on, tells the reader to continue reading more bytes.
    uint v = (uint) value; // support negative numbers
    while (v >= 0x80)
    {
        m_writer.Write((byte) (v | 0x80));
        v >>= 7;
    }
    m_writer.Write((byte) v); // final byte, high bit clear
}

protected int read7BitEncodedInt()
{
    // Read out an int 7 bits at a time. The high bit
    // of the byte when on means to continue reading more bytes.
    int count = 0;
    int shift = 0;
    byte b;
    do
    {
        b = m_reader.ReadByte();
        count |= (b & 0x7F) << shift;
        shift += 7;
    } while ((b & 0x80) != 0);
    return count;
}
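If you want to sanity-check the encoding, here is a self-contained round trip over a MemoryStream (a little harness of my own, not part of the snippet above) using standalone copies of the same encoder and decoder:

```csharp
using System;
using System.IO;

class VarIntDemo
{
    // Standalone copies of the 7-bit encoder/decoder, run against an
    // in-memory stream so the round trip is easy to verify.
    public static void Write7Bit(BinaryWriter w, int value)
    {
        uint v = (uint) value;
        while (v >= 0x80) { w.Write((byte) (v | 0x80)); v >>= 7; }
        w.Write((byte) v);
    }

    public static int Read7Bit(BinaryReader r)
    {
        int count = 0, shift = 0;
        byte b;
        do { b = r.ReadByte(); count |= (b & 0x7F) << shift; shift += 7; }
        while ((b & 0x80) != 0);
        return count;
    }

    public static void Main()
    {
        var ms = new MemoryStream();
        var writer = new BinaryWriter(ms);
        Write7Bit(writer, 100);      // fits in 1 byte instead of 4
        Write7Bit(writer, 100000);   // needs 3 bytes
        Console.WriteLine(ms.Length);        // prints 4

        ms.Position = 0;
        var reader = new BinaryReader(ms);
        Console.WriteLine(Read7Bit(reader)); // prints 100
        Console.WriteLine(Read7Bit(reader)); // prints 100000
    }
}
```

Two ints that would cost 8 bytes as raw Int32s come out at 4 bytes here, which is where the size win comes from.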
You don’t have to take it that far though ... just reading and writing ints, bools, etc. with the BinaryReader and BinaryWriter will get you a big performance improvement over parsing text.
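For your specific situation, the simple version would be a one-time converter that parses the slow text format once and writes a binary copy, plus a fast loader the game uses from then on. Everything here is an assumption about your data: I'm guessing at comma-separated "id,x,y" lines, and the file names are placeholders.

```csharp
using System.Collections.Generic;
using System.Globalization;
using System.IO;

static class DataFileConverter
{
    // One-time conversion: parse the text format (assumed "id,x,y" per
    // line) and write a compact binary version with a record count up front.
    public static void ConvertTextToBinary(string textPath, string binPath)
    {
        string[] lines = File.ReadAllLines(textPath);
        using (var writer = new BinaryWriter(File.Open(binPath, FileMode.Create)))
        {
            writer.Write(lines.Length); // record count
            foreach (string line in lines)
            {
                string[] parts = line.Split(',');
                writer.Write(int.Parse(parts[0]));
                writer.Write(float.Parse(parts[1], CultureInfo.InvariantCulture));
                writer.Write(float.Parse(parts[2], CultureInfo.InvariantCulture));
            }
        }
    }

    // Fast load path the game uses at startup: no string parsing at all.
    public static List<(int Id, float X, float Y)> LoadBinary(string binPath)
    {
        var records = new List<(int Id, float X, float Y)>();
        using (var reader = new BinaryReader(File.Open(binPath, FileMode.Open)))
        {
            int count = reader.ReadInt32();
            for (int i = 0; i < count; i++)
                records.Add((reader.ReadInt32(), reader.ReadSingle(), reader.ReadSingle()));
        }
        return records;
    }
}
```

You'd run the converter once (or as a build step whenever the text files change) and ship only the binary files with the game.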