this post was submitted on 14 Dec 2023
1018 points (99.2% liked)

xkcd


A community for a webcomic of romance, sarcasm, math, and language.

https://xkcd.com/2867

Alt text:

It's not just time zones and leap seconds. SI seconds on Earth are slower because of relativity, so there are time standards for space stuff (TCB, TCG) that use faster SI seconds than UTC/Unix time. T2 - T1 = [God doesn't know and the Devil isn't telling.]

[–] Cosmonaut_Collin@lemmy.world 49 points 1 year ago (3 children)

This is why we should just move to a universal time zone and stop with daylight saving time.

[–] nxdefiant@startrek.website 50 points 1 year ago* (last edited 1 year ago) (4 children)

We have that, it's called Unix time, and the only thing it doesn't account for is time dilation due to relativity.

it's perfect

[–] phoneymouse@lemmy.world 19 points 1 year ago (2 children)

If your system hasn’t been upgraded to 64-bit types by 2038, you’d deserve your overflow bug
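The 2038 problem is easy to demonstrate. A 32-bit signed `time_t` tops out at 2³¹ − 1 seconds after the Unix epoch; one second later it wraps negative. A quick sketch (Python is just doing the arithmetic here):

```python
from datetime import datetime, timezone

# Largest value a 32-bit signed time_t can hold
T_MAX_32 = 2**31 - 1  # 2,147,483,647 seconds since the Unix epoch

# The moment a 32-bit signed counter overflows
overflow_moment = datetime.fromtimestamp(T_MAX_32, tz=timezone.utc)
print(overflow_moment)  # 2038-01-19 03:14:07+00:00

# One second later the counter wraps to -2**31,
# which decodes to a date in 1901 -- the "Epochalypse"
wrapped = datetime.fromtimestamp(-2**31, tz=timezone.utc)
print(wrapped.year)  # 1901
```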

[–] Appoxo@lemmy.dbzer0.com 8 points 1 year ago (1 children)

Let's just make it 128-bit so it's not our problem anymore.
Hell, let's make it 256-bit because it sounds like AES-256

[–] phoneymouse@lemmy.world 16 points 1 year ago* (last edited 1 year ago) (2 children)

64 bits is already enough not to overflow for 292 billion years. That’s 21 times longer than the estimated age of the universe.
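The 292-billion-year figure checks out (assuming a Julian year of 365.25 days and the ~13.8-billion-year age of the universe):

```python
# Range of a signed 64-bit second counter, in years
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # 31,557,600

years = 2**63 / SECONDS_PER_YEAR
print(f"{years:.3e}")  # ~2.92e+11, i.e. about 292 billion years

AGE_OF_UNIVERSE_YEARS = 13.8e9
print(years / AGE_OF_UNIVERSE_YEARS)  # ~21 times the age of the universe
```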

[–] nybble41@programming.dev 13 points 1 year ago (1 children)

If you want one-second resolution, sure. If you want nanoseconds a 64-bit signed integer only gets you 292 years. With 128-bit integers you can get a range of over 5 billion years at zeptosecond (10^-21 second) resolution, which should be good enough for anyone. Because who doesn't need to precisely distinguish times one zeptosecond apart five billion years from now‽
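Both figures hold up under the same back-of-the-envelope arithmetic (Julian year assumed):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Signed 64-bit counter at nanosecond resolution
ns_range_years = 2**63 * 1e-9 / SECONDS_PER_YEAR
print(round(ns_range_years))  # 292 years

# Signed 128-bit counter at zeptosecond (1e-21 s) resolution
zs_range_years = 2**127 * 1e-21 / SECONDS_PER_YEAR
print(f"{zs_range_years:.2e}")  # ~5.39e+09 -- over 5 billion years
```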

[–] Hamartiogonic@sopuli.xyz 2 points 1 year ago* (last edited 1 year ago) (1 children)

If you run a realistic physical simulation of a star, and you include every subatomic particle in it, you're going to have to use very small time increments. Computers can't handle anywhere near that many particles yet, but mark my words, physicists of the future are going to want to run this simulation as soon as we have the computers to do it. Also, the simulation should predict events billions of years in the future, so you may need to build a new time tracking system to handle that.

[–] nybble41@programming.dev 6 points 1 year ago

Good point. You'd need at least 215 bits to represent all measurably distinct times (in multiples of the Planck time, approximately 10^-43 seconds) out to the projected heat death of the universe at 100 trillion (10^14) years. That should be sufficient for even the most detailed and lengthy simulation.
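Using the rough 10⁻⁴³ s figure for the Planck time from the comment, the 215-bit count follows directly:

```python
import math

PLANCK_TIME = 1e-43          # seconds, the rough figure used above
SECONDS_PER_YEAR = 365.25 * 24 * 3600
HEAT_DEATH_YEARS = 1e14      # end of star formation, ~100 trillion years

# Number of Planck-time ticks from now to heat death,
# and the bits needed to count them all
ticks = HEAT_DEATH_YEARS * SECONDS_PER_YEAR / PLANCK_TIME
bits = math.ceil(math.log2(ticks))
print(bits)  # 215
```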

[–] Faresh@lemmy.ml 7 points 1 year ago* (last edited 1 year ago)

With a 128-bit integer you can represent 340 undecillion (or sextillion in long-scale notation) seconds, which is equivalent to 10 nonillion (or quintillion, long scale) years. The universe will long since have stopped being able to support life by then, because stars will have stopped forming: counting from the birth of the universe, enough time will have passed for that to happen a hundred quadrillion (a hundred thousand billion, long scale) times over.
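Those figures can be sanity-checked the same way (short-scale names in the comments; Julian year assumed):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

total_seconds = 2**128
print(f"{total_seconds:.2e}")  # ~3.40e+38 s -- "340 undecillion", short scale

total_years = total_seconds / SECONDS_PER_YEAR
print(f"{total_years:.2e}")  # ~1.08e+31 years -- ~10 nonillion

STAR_FORMATION_ENDS = 1e14  # years, same cutoff as the heat-death estimate
print(f"{total_years / STAR_FORMATION_ENDS:.0e}")  # ~1e+17 -- a hundred quadrillion times over
```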

[–] Isthisreddit@lemmy.world 4 points 1 year ago

Cries in vintage computer collection tears.

You are a monster, phoneymouse

I love the word "Epochalypse", from the wiki page you linked

[–] EatYouWell@lemmy.world 2 points 1 year ago

I thought that's what datetime was based off of, tbh.

[–] Cosmonaut_Collin@lemmy.world 1 points 1 year ago

I know, but it's not standard anywhere in the world.

Give it a few more decades

[–] The_Lurker@lemmy.world 4 points 1 year ago

Swatch's Internet Beats are making more and more sense every time daylight saving time forces a timezone change. Why are we still using base-12 time anyway?
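For the curious, Swatch's .beat time is easy to sketch: the day is split into 1000 "beats" of 86.4 seconds each, with midnight anchored to Biel Mean Time (UTC+1) everywhere on Earth, and no DST ever. A minimal conversion, assuming that convention (the function name is mine):

```python
from datetime import datetime, timezone, timedelta

def internet_beats(dt_utc: datetime) -> float:
    """Convert an aware datetime to Swatch .beats: 1000 beats per day,
    day boundary anchored to Biel Mean Time (UTC+1), no DST."""
    bmt = dt_utc.astimezone(timezone(timedelta(hours=1)))
    seconds_into_day = bmt.hour * 3600 + bmt.minute * 60 + bmt.second
    return seconds_into_day / 86.4  # 86400 s/day / 1000 beats

# Midnight UTC is 1 a.m. in Biel, i.e. beat ~41.7 -- same for everyone
print(internet_beats(datetime(2023, 12, 14, 0, 0, 0, tzinfo=timezone.utc)))
```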