On the Unity blog, the entry titled “Games Focus: Expanded scale for ambitious games” mentions cross-platform determinism. I am not sure what cross-platform determinism means here; there are at least three possible interpretations. I assume it is meant in the broader sense of determinism across instruction set architectures such as x86-64 and ARM, rather than in the narrower sense of determinism within a single instruction set architecture. Or does it refer to microarchitecture, for example Intel vs. AMD, along with different hardware generations from the same manufacturer? Or does “platform” correspond to the platforms listed in the Unity Build Settings?
What kind of performance penalty, compared to hardware SIMD, can we expect from platform-agnostic floating-point math? IEEE 754 only guarantees bit-exact (correctly rounded) results for a handful of basic operations such as addition, subtraction, multiplication, division, and square root; transcendental functions are left implementation-defined.
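To make the IEEE 754 point concrete, here is a minimal standalone C# sketch (not Unity-specific) that prints the raw bit patterns of a few results. The basic operations are required to be correctly rounded and should be bit-identical on any compliant platform, while Math.Sin typically delegates to the platform math library, so its low-order bits may differ between, say, an x86-64 and an ARM build:

```csharp
using System;

class DeterminismCheck
{
    // Show the exact 64-bit pattern of a double as hex.
    static string Bits(double d) =>
        BitConverter.DoubleToInt64Bits(d).ToString("X16");

    static void Main()
    {
        double a = 0.1, b = 0.3;

        // Correctly rounded per IEEE 754: bit-identical on any
        // compliant hardware, regardless of ISA or microarchitecture.
        Console.WriteLine($"a * b   = {Bits(a * b)}");
        Console.WriteLine($"sqrt(b) = {Bits(Math.Sqrt(b))}");

        // NOT guaranteed: transcendental functions are only recommended
        // (not required) to be correctly rounded, so results can vary
        // across platforms and runtimes.
        Console.WriteLine($"sin(a)  = {Bits(Math.Sin(a))}");
    }
}
```

If the first two lines match across builds but the sin line does not, that would illustrate why deterministic engines often replace the math library (or fall back to fixed-point or software floating point), which is where the performance cost relative to SIMD comes in.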