In GCC, compile-time constants are folded using software emulation of the target's floating-point arithmetic, so results match the target architecture even when the host doesn't support that format. Go seems to approximate this as "just evaluate untyped constants at high precision, then round once when converting to a float type," which is probably mostly fine, but having arithmetic differ between compile time and run time seems likely to be not fun.
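A minimal sketch of the compile-time/run-time divergence in Go (the variable names are mine; the behavior follows from the spec's requirement that untyped constants be evaluated with at least 256 bits of mantissa, then rounded once on conversion):

```go
package main

import "fmt"

func main() {
	// Compile time: the untyped constant expression 0.1 + 0.2 is evaluated
	// with high precision, then rounded once when assigned to a float64,
	// landing on the float64 nearest to the real number 0.3.
	var compileTime float64 = 0.1 + 0.2

	// Run time: each operand is rounded to float64 first, then added with
	// IEEE-754 double arithmetic, accumulating an extra rounding error.
	a, b := 0.1, 0.2
	runTime := a + b

	fmt.Println(compileTime == 0.3)        // true
	fmt.Println(runTime == 0.3)            // false: 0.30000000000000004
	fmt.Println(compileTime == runTime)    // false
}
```

So the "same" expression, 0.1 + 0.2, yields two different float64 values depending on whether it was folded by the compiler or evaluated by the hardware.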