Depends what you're actually storing. There are plenty of cases where the timezone is not metadata; it defines how the datetime should be interpreted.
For example: your local mom and pop corner store's daily opening and closing times. Storing those in UTC is not correct because mom and pop don't open and close their store based on UTC time. They open and close it based on the local time zone.
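A quick sketch of why (Python's zoneinfo; the zone and hours here are made up): the same 9am local opening maps to different UTC instants depending on the date, so no single stored UTC value can represent "opens at 9 local".

    from datetime import datetime, time, timezone
    from zoneinfo import ZoneInfo

    STORE_TZ = ZoneInfo("America/New_York")  # hypothetical store location
    OPENING = time(9, 0)                     # "we open at 9am" -- local wall time

    for day in (datetime(2024, 1, 15), datetime(2024, 7, 15)):
        local_open = datetime.combine(day, OPENING, tzinfo=STORE_TZ)
        print(local_open.astimezone(timezone.utc))
    # 2024-01-15 14:00:00+00:00  (EST, UTC-5)
    # 2024-07-15 13:00:00+00:00  (EDT, UTC-4)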
You're conflating two different concepts here. The actual moment of opening and closing can be stored in UTC, because that is a proper point in time. But a schedule is an algorithm, not a time. You can use a DSL that looks like time notation to express that algorithm, but being a DSL it can only go so far.
You don't need to store the timezone anywhere; you just need to know the current local timezone when the stored UTC time is used. And that's why storing in UTC is better: it only takes one conversion to represent it in some arbitrary local time.
If you stored it as a local time (i.e., with a TZ), then if it's ever later translated to a different local time (different TZ), you're now dealing with all the quirks of two different timezones. It's a great way to be off by some multiple of 15 minutes, or even a day or two!
Heck, even if it's the same exact location, storing in local time can still require conversion if that location observes daylight saving time! You're never safe from needing to adapt to timezones, so storing datetimes in the most universal format is pretty much always the best thing to do.
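Rough sketch of that workflow (Python stdlib; the zones are arbitrary examples): the stored value is a single UTC instant, and each reader does exactly one conversion.

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    stored = datetime(2024, 7, 15, 13, 0, tzinfo=timezone.utc)  # what the DB holds

    # One conversion per reader, no cross-timezone arithmetic:
    print(stored.astimezone(ZoneInfo("America/New_York")))  # 2024-07-15 09:00:00-04:00
    print(stored.astimezone(ZoneInfo("Asia/Tokyo")))        # 2024-07-15 22:00:00+09:00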
NuGet has a convention of system packages that are empty if the target platform implements the functionality natively, and that provide an independent implementation on platforms that don't support it. As a result, you can unconditionally import the package on all platforms.
Local time without an explicit zone is unparseable; it's only human-readable because humans can resolve the ambiguity ad hoc. Parsing it as UTC is a reasonable default for a machine parser, and arguably the only workable one.
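Something like this, as a sketch (Python; the input string is invented): a timestamp with no offset parses as naive, and the parser pins it to UTC by policy instead of guessing a locale.

    from datetime import datetime, timezone

    raw = "2024-07-15T13:00:00"                   # no offset: ambiguous on its own
    parsed = datetime.fromisoformat(raw)          # naive datetime (tzinfo is None)
    pinned = parsed.replace(tzinfo=timezone.utc)  # interpret as UTC by convention
    print(pinned.isoformat())                     # 2024-07-15T13:00:00+00:00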
It's called "the world wide web" and it works on the principle that a webpage served by computer A can contain links that point to other pages served by computer B.
Whether that principle should have been sustained in the special case of "B = localhost" is a valid question. I think the consensus from the past 40 years has been "yes", probably based on the sheer number of unknown failure modes if the default were reversed to "no".
OWASP A01 addresses this: "Violation of the principle of least privilege or deny by default, where access should only be granted for particular capabilities, roles, or users, but is available to anyone."
Indeed, a deny-by-default policy results in unknown failure possibilities; that's inherent to safety.
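Deny by default is also cheap to get right in code. A sketch (the roles and actions here are invented): anything not explicitly granted, including an unknown role, falls through to "no".

    ALLOWED = {
        "admin":  {"read", "write", "delete"},
        "editor": {"read", "write"},
        "viewer": {"read"},
    }

    def is_allowed(role: str, action: str) -> bool:
        # Missing role or missing action both fall through to denial.
        return action in ALLOWED.get(role, set())

    assert is_allowed("editor", "write")
    assert not is_allowed("editor", "delete")   # not granted -> denied
    assert not is_allowed("stranger", "read")   # unknown role -> denied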
I completely agree with this; programs are too open most of the time.
But this also brings up a conundrum...
Programs that are wide open and insecure are typically very forgiving of user misconfigurations and misunderstandings, so they are the ones that end up widely adopted. Secure-by-default applications, on the other hand, take much more knowledge to use in most cases; even though they protect the end user better, they see less distribution unless forced by some other mechanism such as compliance.
2. start civil war
3. ???
4. civil war with stone axes