this post was submitted on 27 Dec 2023
147 points (69.6% liked)

I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.

This one is about why it was a mistake to call 1024 bytes a kilobyte. It's about a 20-minute read, so thank you very much in advance if you find the time to read it.

Feedback is very much welcome. Thank you.

[–] PsychedSy@sh.itjust.works 7 points 10 months ago (9 children)

Kilo was used outside the decimal power rules for data storage/memory because memory could only come in binary powers at those scales. Well, that's the standard we went with, anyway.

They didn't 'retcon' the use of kilo for other units; they went with the closest power of two. When hard drive manufacturers decided to use powers of ten, it confused people, and the usage eventually got standardized by making kB a power of ten and KiB a power of two.
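
For anyone skimming, the two conventions being argued over work out like this (a quick illustrative sketch; the function names are mine, not from either standard):

```python
# The two conventions at issue: SI "kilo" (decimal) vs. IEC "kibi" (binary).

def to_kb(n_bytes: int) -> float:
    """Decimal kilobytes: 1 kB = 1000 bytes (SI prefix)."""
    return n_bytes / 1000

def to_kib(n_bytes: int) -> float:
    """Binary kibibytes: 1 KiB = 1024 bytes (IEC prefix)."""
    return n_bytes / 1024

size = 65536  # a 64 KiB buffer; memory sizes naturally come in powers of two
print(to_kb(size))   # 65.536
print(to_kib(size))  # 64.0
```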

From the looks of it you aren't familiar with the situation.

[–] Eyron@lemmy.world -2 points 10 months ago* (last edited 10 months ago) (8 children)

This is all explained in the post we're commenting on. The standard "kilo" prefix, from the metric system, predates modern computing and even the definition of a byte: 1700s vs 1900s. It seems very odd to argue that the older definition is the one trying to retcon.

The binary usage in software was/is common, but it's definitely more recent, and it causes a lot of confusion because it doesn't match the older and broader standard. Computers are very good at numbers; they never should have tried to hijack the existing prefix, especially when it was already defined by existing international standards. One might be able to argue that the US hadn't really adopted the metric system at that point in computing's development, but the usage of 1000 to define kilo is clearly older than the usage of 1024 to define the kilobyte. The main new thing here (within the last 100 years) is that 1024 bytes is a kibibyte.

Kibi is the retcon. Not kilo.

[–] PsychedSy@sh.itjust.works 4 points 10 months ago* (last edited 10 months ago) (4 children)

I'm not sure if you just didn't read or what. It seems like you understand the history but are insistent on awkward characterizations of the situation.

Kibi is the retcon. Not kilo.

I mean kibi is the retcon because it made all previous software wrong.

They didn't modify the use of kilo for other units - they used it as an awkward approximation with bytes. No other units were harmed in the making of these units.

And they didn't hijack it - they used the closest approximation and it stuck. Nobody gave a fuck until they bought a 300GB HD with 277GB of free space.
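
A rough sketch of the arithmetic behind that kind of complaint (the exact free-space figure varies with filesystem overhead, so these numbers are illustrative):

```python
# A "300 GB" drive is sold as 300 * 10^9 bytes; an OS that divides by
# 2^30 but still labels the result "GB" reports a smaller number.

advertised_bytes = 300 * 10**9
reported_gib = advertised_bytes / 2**30
print(round(reported_gib, 1))  # 279.4 -- before any filesystem overhead
```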

[–] Eyron@lemmy.world 1 points 10 months ago (1 children)

To me, your attempt at defending it, or calling it a retcon, is an awkward characterization. Even in your last reply, you're now calling it an approximation. Dividing by 1024 is an approximation? Did computers have trouble dividing by 1000? Did it lead to a benefit in the 640KB/384KB split of the conventional memory model? Does it lead to a benefit today?

Somehow, every other computer measurement avoids this binary prefix problem. Some, like you, seem to try to defend it as the more practical choice compared to the "standard" choice every other unit uses (e.g., a 1.536 Mbps T1, or "54" Mbps 802.11g).

The confusion this continues to cause does waste quite a bit of time and money today. Vendors continue to show both units on the same specs sheets (open up a page to buy a computer/server). News still reports differences as bloat. Customers still complain to customer support, which goes up to management, and down to project management and development. It'd be one thing if this didn't waste time or cause confusion, but we're still doing it today. It's long past time to move on.

"Kilo" had meant 1000 for centuries before computer science existed. Things that need binary units have an option to use, but it's probably not needed, even in computer science. Trying to call kilo/kibi a retcon just seems to be defending the 1024 usage today, despite the fact that nearly nothing else (even in computers) uses the binary prefixes.
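
One way to see why the 1024 usage ages badly: the gap between each binary step (2^10) and decimal step (10^3) compounds with every prefix. A quick sketch:

```python
# The mismatch grows with each prefix: 2.4% at kilo/kibi, ~10% by tera/tebi.
for i, name in enumerate(["kilo/kibi", "mega/mebi", "giga/gibi", "tera/tebi"], 1):
    overshoot = 2**(10 * i) / 10**(3 * i) - 1
    print(f"{name}: {overshoot:.1%}")
```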

[–] PsychedSy@sh.itjust.works 2 points 10 months ago

I don't think it's more practical. I think it's what emerged from researchers trying to refer to concepts. I prefer the clarified prefixes.
