I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.

This one is about why it was a mistake to call 1024 bytes a kilobyte. It's about a 20-minute read, so thank you very much in advance if you find the time to read it.

Feedback is very much welcome. Thank you.

  • Lmaydev@programming.dev
    1 year ago

    It’s actually a decimal vs. binary thing.

    1000 and 1024 take the same number of bytes to store, so 1024 makes more sense to a computer.

    It has nothing to do with metric, as computers don’t use that. It’s also not really about units.
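The decimal-vs-binary distinction the comment describes can be sketched in a few lines. This is a minimal illustration, not from any particular library: the function names are made up, and the point is just that SI "kilo" means 10³ while the binary prefix "kibi" (KiB) means 2¹⁰ = 1024.

```python
def to_kilobytes(n_bytes: int) -> float:
    """Decimal (SI) kilobytes: 1 kB = 1000 bytes."""
    return n_bytes / 1000

def to_kibibytes(n_bytes: int) -> float:
    """Binary kibibytes: 1 KiB = 1024 bytes."""
    return n_bytes / 1024

# The same byte count looks different under each convention,
# which is why a "1 MB" file shows up as roughly 976.6 KiB.
size = 1_000_000
print(to_kilobytes(size))  # 1000.0
print(to_kibibytes(size))  # 976.5625
```

This gap (about 2.4% at the kilo scale) compounds at each prefix level, which is why a "1 TB" drive appears noticeably smaller when reported in binary units.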