Yes and no. The thing is that it is quite convenient to track time in computers with monotonic clocks, but humans do not define time markers based on monotonic clocks (at least when far-future times are considered) and only from time to time [re]define the relationship between time markers and monotonic clocks.
By definition you cannot unambiguously tell how many seconds in the future Christmas in a few years will be. However, it turns out that as long as you keep the monotonic clock synced, keep time marker definitions up to date, and have a reasonably accurate understanding of what local time means, you can rather easily tell whether `now` is "Christmas in 3 years, as defined at 123456789 Unix time".
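A minimal sketch of that check in Python with `zoneinfo` (the zone name and the 3-year offset are assumptions for illustration): the human marker is re-resolved against whatever tz rules are installed at comparison time, which is exactly why keeping those definitions up to date matters.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

DEFINED_AT = 123456789          # Unix time at which the marker was recorded
TZ = ZoneInfo("Europe/Berlin")  # assumed local zone of whoever recorded it

def marker_as_instant() -> datetime:
    defined = datetime.fromtimestamp(DEFINED_AT, tz=TZ)
    # "Christmas in 3 years" relative to the recording moment, in local terms;
    # the instant it maps to depends on the tz rules in force *now*.
    local_christmas = datetime(defined.year + 3, 12, 25, tzinfo=TZ)
    return local_christmas.astimezone(timezone.utc)

def is_it_that_christmas_yet() -> bool:
    return datetime.now(timezone.utc) >= marker_as_instant()
```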
We, humans, define time markers based on astronomical phenomena relative to this particular floating rock. A system based on time markers (calendar days, holidays, full moons, whatever) can represent human times, but it is very inconvenient for computers, because such markers are not necessarily well-defined. I mean, 123456789 Unix time is well-defined in itself even if it is not well-defined in relation to the real world, while "2025 May 3rd at noon" is simply not well-defined today. Timekeeping is extremely weird.
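To make the "well-defined" distinction concrete, a hypothetical illustration: the Unix value means the same thing with no external context, while the calendar phrase needs a location plus whatever offset/DST rules happen to be in force when it is resolved.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

unix_marker = 123456789               # meaningful with no external context
phrase = datetime(2025, 5, 3, 12, 0)  # "2025 May 3rd at noon" -- noon where?

for zone in ("Europe/Berlin", "America/New_York"):
    resolved = phrase.replace(tzinfo=ZoneInfo(zone))
    # Different instants per zone, and either could shift if the rules change
    # between now and then.
    print(zone, int(resolved.timestamp()))
```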
This is a trade-off. Either you store computer-consumable time markers (monotonic clock values) but have to do processing to figure out how they relate to real-world markers, or you store real-world markers and have to do processing to figure out how they relate to the monotonic clock. Sans DST, the first one is much more ergonomic.
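As a sketch of the two storage strategies (assumed zone and appointment, purely for illustration): both need the same tz rules, they just apply them at different points.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

TZ = ZoneInfo("Europe/Berlin")  # assumed zone for illustration

# Option 1: store the machine value; turning it back into a wall-clock reading
# means applying the (possibly updated) tz rules at read time.
stored_epoch = int(datetime(2026, 5, 3, 12, 0, tzinfo=TZ).timestamp())
as_wall_clock = datetime.fromtimestamp(stored_epoch, tz=TZ)

# Option 2: store the human marker; turning it into an instant needs the same
# rules, but the conversion happens every time you compare it with `now`.
stored_marker = ("2026-05-03 12:00", "Europe/Berlin")
as_instant = datetime.strptime(stored_marker[0], "%Y-%m-%d %H:%M").replace(
    tzinfo=ZoneInfo(stored_marker[1])
)
```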
DST is a pain, but it turns out it is a pain in both systems. If you have a future time marker and there is a DST adjustment between `now` and `then`, you have the very same problem: you don't know whether the DST adjustment was taken into account when the marker was created. The fun thing is that if you do control (or can infer) the adjustments, both systems become very similar, but the current one is much more ergonomic.
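The classic case of "was the adjustment taken into account?" is a wall-clock time that occurs twice when clocks fall back; a small sketch, assuming Europe/Berlin under its current rules (where 2025-10-26 03:00 jumps back to 02:00):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

tz = ZoneInfo("Europe/Berlin")
# 02:30 local time happens twice on 2025-10-26; without extra information
# (PEP 495's `fold` flag here) the marker alone cannot tell you which one.
first = datetime(2025, 10, 26, 2, 30, tzinfo=tz, fold=0)   # before the shift
second = datetime(2025, 10, 26, 2, 30, tzinfo=tz, fold=1)  # after the shift
print(int(first.timestamp()), int(second.timestamp()))     # one hour apart
```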