I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.
This one is about why it was a mistake to call 1024 bytes a kilobyte. It's roughly a 20-minute read, so thank you very much in advance if you find the time to read it.
Feedback is very much welcome. Thank you.
Yes, computers are base 2, but we can still make up whatever rules we want about them. We could even make up a rule that says everything a computer does is to be interpreted in base 10, but that it may only use the lowest two values of any given digit. It would be a total mess and make no sense whatsoever, but we could define those rules. A quick sketch of that thought experiment follows below.
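To make that thought experiment concrete, here's a small Python sketch (mine, not from the post, and purely illustrative): it reads the same bit pattern once as ordinary base 2 and once under the made-up rule of "base 10, but only the two lowest digit values are allowed".

```python
# Toy illustration: the same bit pattern read two ways --
# as ordinary base 2, and under the made-up rule
# "everything is base 10, but each digit may only be 0 or 1".

def as_base2(bits: str) -> int:
    """Normal binary interpretation."""
    return int(bits, 2)

def as_restricted_base10(bits: str) -> int:
    """The made-up rule: read the digits as decimal."""
    assert set(bits) <= {"0", "1"}, "only the lowest two digit values allowed"
    return int(bits, 10)

pattern = "1010"
print(as_base2(pattern))              # 10
print(as_restricted_base10(pattern))  # 1010 -- same bits, different rules
```

Both rules are internally consistent; the second one is just pointlessly awkward, which is the point of the thought experiment.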