
I was surprised to learn that Linux has a “2262 problem” because its 64-bit timestamps store nanoseconds rather than seconds. That seems like a huge problem without an easy solution. Yes, there are almost 250 years to fix it, but it seems like surprisingly bad planning. In any case, it’s an interesting thought exercise to imagine what the fix should be.
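
For concreteness, here is a quick sketch of the arithmetic in C (my own illustration, not kernel code): a signed 64-bit count of nanoseconds from the 1970 epoch runs out after roughly 292 years, which lands around 2262.

  /* Rough sketch, not kernel code: why a signed 64-bit nanosecond
     counter (the same shape as the kernel's ktime_t) runs out in 2262. */
  #include <stdio.h>
  #include <stdint.h>

  int main(void)
  {
      int64_t max_ns  = INT64_MAX;              /* largest signed 64-bit value */
      int64_t max_sec = max_ns / 1000000000LL;  /* ~9.2 billion seconds */
      double  years   = (double)max_sec / (365.25 * 24 * 3600);
      printf("~%.1f years after 1970 -> overflow around year %d\n",
             years, 1970 + (int)years);         /* prints roughly 2262 */
      return 0;
  }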


Do you think we'll still be using the same computer systems then?


We’re only about 60-70 years into the age of computing and have no idea how long legacy software systems will truly last. Also imagine use cases like exploratory or orbiting spacecraft, which might have useful shelf lives measured in triple-digit years.


Why is it bad planning? I can’t think of a better alternate plan.


It’s bad planning because 64-bit timestamps were supposed to provide coverage for “22 times the expected lifetime of the universe”, but by being the only OS to use those 64 bits to count nanoseconds, Linux spends roughly 30 of them (2^30 ≈ 10^9) on sub-second precision and only buys a couple more bits of range than a 32-bit seconds counter. What they could have done differently is not use nanoseconds for the epoch counter. Milliseconds seem like they would have offered the perfect balance.
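
To make the tradeoff concrete, here is a small sketch (my own arithmetic, not anything from the kernel) comparing how far a signed 64-bit epoch counter reaches at different resolutions:

  /* Sketch: range of a signed 64-bit epoch counter at various resolutions. */
  #include <stdio.h>
  #include <stdint.h>

  int main(void)
  {
      const char *unit[]    = { "seconds", "milliseconds", "microseconds", "nanoseconds" };
      int64_t ticks_per_s[] = { 1LL, 1000LL, 1000000LL, 1000000000LL };

      for (int i = 0; i < 4; i++) {
          double years = (double)(INT64_MAX / ticks_per_s[i]) / (365.25 * 24 * 3600);
          printf("%-12s -> ~%.3g years of range\n", unit[i], years);
      }
      return 0;
  }

This prints roughly 2.9e11 years for seconds, 2.9e8 for milliseconds, 2.9e5 for microseconds, and about 292 for nanoseconds, which is where 2262 comes from.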


Windows uses 100-nanosecond resolution for NTFS time stamps.
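
For comparison, a signed 64-bit count of 100-nanosecond ticks from the 1601 FILETIME epoch lasts about 29,000 years (again just a sketch of the arithmetic, not Windows code):

  /* Sketch: range of a signed 64-bit count of 100-nanosecond ticks
     (the Windows FILETIME layout, epoch 1601). */
  #include <stdio.h>
  #include <stdint.h>

  int main(void)
  {
      int64_t max_sec = INT64_MAX / 10000000LL;   /* 10^7 ticks per second */
      double  years   = (double)max_sec / (365.25 * 24 * 3600);
      printf("~%.0f years after 1601 -> roughly year %d\n",
             years, 1601 + (int)years);           /* about 30828 */
      return 0;
  }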



