When did all software become "bits"?



I saw a blog entry I could really identify with today:

Did I miss a meeting of the "inappropriate computer terminology usage club" or something? Someone at work keeps referring to software as "bits" as in "the bits are on the server."

I thought it was strange, but now other people at work are also saying it. According to these folks, my archives, disk images, and tarballs are no longer meaningful descriptors...

They are now BITS.

And the really funny thing about this is that these people consider "bits" to be software, NOT data (which leads me to wonder: what do they believe those little 1s and 0s that make up our corporate data are called?)

Who started this? Was it some vendor sales rep or something? WHO, in the name of Dennis Ritchie, Alan Kay, and Marc Blank, came up with such a misuse of the term? I haven't seen such a mangling of computer terminology since the first day of my CS 101 class, many, many years ago, when the instructor stated that data was a "raw collection of information".

Check out the full blog entry here.

I too have noticed this stupid terminology.  The first time I saw it was when someone was announcing an updated version of some software, with a note something like this:

Replace all of your May, 2006 bits with the latest bits.

How stupid is that?

I don't care if the person is talking about a snippet of code or something small.  If it's a snippet, then call it a snippet.

I will never, ever call software "bits".  That terminology would work best in a gender-neutral kind of society, if you catch my drift.

Entry #155

