ISO 8601 date format. Not because it's from a standards body, but because it's simple, sensible, clearly defined, easy to recognize, and very effective.
Date field placement in any order other than most-significant-digits-first is not only counterintuitive, but needlessly complicated to work with. Omitting critical information like the century is ambiguous and confusing.
We don't live in isolated villages any more. Mixing and matching those problems by accepting all the world's various regional and personal date styles, especially with no reliable indication of which ones apply in any given case, leads to the hodgepodge of error-prone date madness that we have today.
The 2024-09-02 format should be taught in schools and required in official documents. Let the antiquated date styles fall into disuse outside of art and personal correspondence, like cursive writing.
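The most-significant-first ordering also means the strings sort chronologically with no parsing at all; a quick Python sketch to illustrate (the dates are just examples):

```python
# Dates written most-significant-first sort chronologically
# even as plain strings -- no parsing needed.
dates = ["2024-09-02", "1999-12-31", "2024-01-15"]
print(sorted(dates))  # ['1999-12-31', '2024-01-15', '2024-09-02']
```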
I love this standard. If you dig deeper into it, the standard also covers durations and intervals. E.g. "P1Y2M10DT2H30M" represents 1 year, 2 months, 10 days, 2 hours, and 30 minutes.
I recall once using the standard when writing a cron-style scheduler.
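Parsing those duration strings is simple enough to sketch by hand. A minimal Python example, assuming whole numbers only (a real library would also handle weeks, fractions, and signs):

```python
import re

# Minimal sketch of parsing an ISO 8601 duration like "P1Y2M10DT2H30M".
DURATION_RE = re.compile(
    r"^P(?:(?P<years>\d+)Y)?(?:(?P<months>\d+)M)?(?:(?P<days>\d+)D)?"
    r"(?:T(?:(?P<hours>\d+)H)?(?:(?P<minutes>\d+)M)?(?:(?P<seconds>\d+)S)?)?$"
)

def parse_duration(text):
    match = DURATION_RE.match(text)
    if not match:
        raise ValueError(f"not an ISO 8601 duration: {text}")
    return {name: int(value) for name, value in match.groupdict(default="0").items()}

print(parse_duration("P1Y2M10DT2H30M"))
# {'years': 1, 'months': 2, 'days': 10, 'hours': 2, 'minutes': 30, 'seconds': 0}
```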
I also like the POSIX "seconds since 1970" standard, but I feel that should only be used in RAM when performing operations (time differences, timers, etc.). It irks me when it's used for serialising to text/JSON/XML/CSV.
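Roughly the split I mean, as a Python sketch (the timestamp value is just an example):

```python
from datetime import datetime, timezone

# Epoch seconds are handy for arithmetic in memory...
start = 1725235200          # 2024-09-02T00:00:00Z
elapsed = 90 * 60           # 90 minutes, as a plain number of seconds

# ...but serialise as ISO 8601 so humans and other tools can read it.
end = datetime.fromtimestamp(start + elapsed, tz=timezone.utc)
print(end.isoformat())      # 2024-09-02T01:30:00+00:00
```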
Also: does Excel recognise a full ISO 8601 timestamp yet?
I've seen bugs where programmers tried to represent a date as epoch time in seconds or milliseconds in JSON. Something like "pay date" would be represented by a timestamp, and would pick up off-by-one errors because whatever time library the programmer was using would do a time zone conversion on the timestamp and then truncate it to a date.
If the programmer had used ISO 8601-style formatting, I don't think they would have included the time part, and the bug could have been avoided.
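Roughly the failure mode, as a sketch (the field, timestamp, and zone are made up for illustration):

```python
from datetime import datetime, timezone, timedelta

# Hypothetical "pay date" of 2024-09-02, stored as epoch seconds at UTC midnight.
pay_date_ts = 1725235200  # 2024-09-02T00:00:00Z

# A consumer in a UTC-5 zone converts to local time, then truncates to a date...
local = datetime.fromtimestamp(pay_date_ts, tz=timezone(timedelta(hours=-5)))
print(local.date())  # 2024-09-01 -- off by one

# ...whereas a plain ISO 8601 date string ("2024-09-02") has no time part to mangle.
```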
Use dates when you need dates and timestamps when you need timestamps!
That's an issue with the time library, not with timestamps. Timestamps are always in UTC; you do the conversion to local time only when displaying the value. There should be no possible off-by-one errors unless you are doing something really wrong.
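i.e. keep the stored value in UTC and only localise at the edge; a minimal Python sketch (same example timestamp as above):

```python
from datetime import datetime, timezone

ts = 1725235200  # stored/transferred value, always UTC

# Interpret the timestamp in UTC for any date logic...
print(datetime.fromtimestamp(ts, tz=timezone.utc).date())  # 2024-09-02

# ...and convert to local time only when displaying to the user.
print(datetime.fromtimestamp(ts).strftime("%Y-%m-%d %H:%M"))  # local wall-clock time
```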