Timestamp ints constantly get cast back and forth between seconds, microseconds, and nanoseconds; it's an absolute mess everywhere. These are, in effect, fixed-point numbers.
Everything sensor-related, countless logging and file operations, and anywhere wait_for or timeouts are used are typical use cases; and indirectly, whenever people need to do simple things like basic arithmetic on timestamps.
The code will either contain some precision loss or an unfixed bug related to one or more kinds of overflow.
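A minimal sketch of the kind of overflow bug meant here, simulated in Python with 32-bit wraparound (the values and names are hypothetical; in C this would be silent signed overflow):

```python
# Simulate a C int32_t so the wraparound is visible in Python.
INT32_MAX = 2**31 - 1

def to_int32(x):
    """Wrap an integer the way a C int32_t would."""
    x &= 0xFFFFFFFF
    return x - 2**32 if x > INT32_MAX else x

timeout_s = 3000                               # a 50-minute timeout, harmless on its own
timeout_us = to_int32(timeout_s * 1_000_000)   # seconds -> microseconds in 32 bits
print(timeout_us)                              # negative: the conversion silently overflowed
```

The conversion itself is trivial, which is exactly why nobody checks it; the bug only appears once the magnitude crosses the type's range.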
So int64 is half, and I was off by a factor of three doing it in my head... I was wrong, no doubt about that. But the order of magnitude is the problem: it should be millennia, or millions of years, not a few centuries.
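The arithmetic behind the "few centuries" figure, as a quick check (assuming a signed int64 counting nanoseconds from an epoch):

```python
# Range of a signed 64-bit nanosecond timestamp, in years either side of the epoch.
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # Julian year, close enough here
max_ns = 2**63 - 1

years_at_ns = max_ns / 1e9 / SECONDS_PER_YEAR
years_at_us = max_ns / 1e6 / SECONDS_PER_YEAR

print(round(years_at_ns))   # roughly 292 years at nanosecond resolution
print(round(years_at_us))   # roughly 292,000 years at microsecond resolution
```

So the range depends entirely on the unit the int64 happens to hold at that moment, which is the point: at nanoseconds you get centuries, at microseconds you get hundreds of millennia.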