Why does Mono do this?

So I was wondering why my BinaryWriter was allocating garbage while running, and it turns out Mono is apparently outright redirecting my BinaryWriter calls into BitConverter calls.

#if MONO
    OutStream.Write (Mono.Security.BitConverterLE.GetBytes (value), 0, 4);
#else
    uint TmpValue = *(uint *)&value;
    _buffer[0] = (byte) TmpValue;
    _buffer[1] = (byte) (TmpValue >> 8);
    _buffer[2] = (byte) (TmpValue >> 16);
    _buffer[3] = (byte) (TmpValue >> 24);
    OutStream.Write(_buffer, 0, 4);
#endif

Why on earth does it do this? It doesn’t do this for integers or practically any other data type (maybe one other). What’s the reasoning behind it? If I tell Unity that I want to use BinaryWriter, then I expect it to let me use BinaryWriter. I don’t see why it has to pick a slower, allocating serialization path when that’s not at all what I want. Just a tad annoying, that’s all.

PS: I tested the #else code and it works fine in Unity. It converts to a byte array without creating any garbage, and it deserializes fine with BinaryReader too.

If I had to guess, it’s probably because of the unsafe pointers, right? But Microsoft literally has the same code in the old BinaryWriter methods, so I don’t know.
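For reference, here’s roughly what the no-garbage path looks like when I pull it out into my own writer (just a minimal sketch; FloatWriter and the field names are mine, not anything from the BCL, and it needs “Allow unsafe code” enabled):

using System.IO;

// Minimal sketch: reuse one 4-byte buffer and reinterpret the float's bits,
// mirroring the non-Mono #else branch above.
public sealed class FloatWriter
{
    private readonly Stream _out;
    private readonly byte[] _buffer = new byte[4]; // reused, so no per-call garbage

    public FloatWriter(Stream output)
    {
        _out = output;
    }

    public unsafe void WriteSingle(float value)
    {
        uint bits = *(uint*)&value;      // reinterpret the float's bits as a uint
        _buffer[0] = (byte)bits;         // write little-endian, byte by byte
        _buffer[1] = (byte)(bits >> 8);
        _buffer[2] = (byte)(bits >> 16);
        _buffer[3] = (byte)(bits >> 24);
        _out.Write(_buffer, 0, 4);
    }
}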

Since Mono runs on more than just desktop PCs, it probably assumes that it cannot rely on the in-memory representation of the “double” type. I’m not sure how valid that assumption is.

// Writes a four-byte signed integer to this stream. The current position
// of the stream is advanced by four.
//
public virtual void Write(int value)
{
    _buffer[0] = (byte) value;
    _buffer[1] = (byte) (value >> 8);
    _buffer[2] = (byte) (value >> 16);
    _buffer[3] = (byte) (value >> 24);
    OutStream.Write(_buffer, 0, 4);
}

That can’t be right, because that’s how it writes standard integers (above): it forces little-endian. So you wouldn’t be able to use it on machines that still use big-endian format regardless. I think all the old consoles do.

Unless you’re talking about something different?

It’s not about endianness, but rather about how floating-point numbers are stored in memory. While most CPU architectures follow IEEE 754, there are still subtle differences for denormal and NaN values between different architectures, AFAIK.
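To make “how it’s stored in memory” concrete: the serializer ultimately just writes the raw bit pattern of the value, which you can inspect yourself. A quick throwaway sketch (BitConverter allocates, but that doesn’t matter for a demo):

using System;

class FloatBitsDemo
{
    static void Main()
    {
        PrintBits(1.5f);
        PrintBits(float.NaN); // NaN payload bits can vary between platforms/operations
        PrintBits(1e-42f);    // a denormal (subnormal) value
    }

    static void PrintBits(float value)
    {
        // These raw bytes are exactly what ends up in the stream.
        byte[] bytes = BitConverter.GetBytes(value);
        Console.WriteLine("{0,-12} -> {1}", value, BitConverter.ToString(bytes));
    }
}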

So generally speaking, if my clients have any modern Intel/AMD-based computer, I can pretty much assume it’s going to work fine?

Is there maybe a better alternative built into C# that would do float serialization/deserialization with no garbage being created?

I am not entirely sure! I guess so?
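If you want to avoid unsafe code entirely, one allocation-free trick that works on any Mono version is an explicit-layout struct that overlays a float and a uint; newer runtimes also have BitConverter.SingleToInt32Bits, if that’s available to you. A minimal sketch (the type and method names are just made up for the example):

using System.IO;
using System.Runtime.InteropServices;

// Overlay a float and a uint on the same 4 bytes, so we can get at the
// float's bit pattern without unsafe code and without allocating.
[StructLayout(LayoutKind.Explicit)]
struct FloatUnion
{
    [FieldOffset(0)] public float Value;
    [FieldOffset(0)] public uint Bits;
}

static class GarbageFreeFloat
{
    // One reusable scratch buffer; fine for a single-threaded sketch.
    private static readonly byte[] _scratch = new byte[4];

    public static void WriteSingle(Stream stream, float value)
    {
        FloatUnion u = new FloatUnion();
        u.Value = value;
        _scratch[0] = (byte)u.Bits;        // little-endian, to match BinaryWriter
        _scratch[1] = (byte)(u.Bits >> 8);
        _scratch[2] = (byte)(u.Bits >> 16);
        _scratch[3] = (byte)(u.Bits >> 24);
        stream.Write(_scratch, 0, 4);
    }

    public static float ReadSingle(Stream stream)
    {
        stream.Read(_scratch, 0, 4); // assumes 4 bytes are available; a real reader would loop
        FloatUnion u = new FloatUnion();
        u.Bits = (uint)(_scratch[0]
                      | _scratch[1] << 8
                      | _scratch[2] << 16
                      | _scratch[3] << 24);
        return u.Value;
    }
}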