What’s the Diff: Megabits and Megabytes


What is the difference between a megabit and a megabyte? The answer is obvious to computer people—it’s “a factor of eight,” since there are eight bits in a single byte. But there’s a lot more to the answer, too, involving how data moves, how it’s stored, and the history of computing.

What Are Megabits?

“Megabit” is a term we use most often when talking about the speed of our internet connection. Megabits per second, or Mbps, is a measurement of data transfer speed. 1 Mbps is one million bits per second.

Take internet service providers, for example. My cable provider has upped my maximum download speed from 25 to 75 to 150 Mbps over the years. Fiber optic connections (Verizon’s FIOS, Google Fiber) can be much faster, where you can get the service.
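
To make the unit concrete, here’s a minimal Python sketch; the helper names and the sample speeds (the same 25/75/150 Mbps tiers mentioned above) are just for illustration.

```python
# A quick sketch of what an advertised speed like "150 Mbps" actually means.
# The speeds below are illustrative examples, not a benchmark of any ISP.

def mbps_to_bits_per_second(mbps: float) -> float:
    """1 Mbps is one million bits per second."""
    return mbps * 1_000_000

def mbps_to_bytes_per_second(mbps: float) -> float:
    """Eight bits make a byte, so divide the bit rate by eight."""
    return mbps_to_bits_per_second(mbps) / 8

for speed in (25, 75, 150):
    print(f"{speed} Mbps = {mbps_to_bits_per_second(speed):,.0f} bits/s "
          f"= {mbps_to_bytes_per_second(speed):,.0f} bytes/s")
```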

What Is a Megabyte?

“Megabyte” is a measurement most often used to describe both hard drive space and memory storage capacity, though the term of art we throw around most frequently these days is the next order of magnitude, the gigabyte (GB). My computer has 8GB of RAM, for example, and 512GB of storage capacity.

How to Measure Megabits and Megabytes

A bit is a single piece of information, expressed at its most elementary in the computer as a binary 0 or 1. Bits are organized into units of data eight digits long—that is, a byte. Kilobytes, megabytes, gigabytes, terabytes, petabytes: each unit of measurement is 1,000 times the one before it.
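
That ladder of units maps directly to code. Here’s a minimal sketch using the decimal (powers of 1,000) convention described above; the unit table is just for illustration.

```python
# The decimal unit ladder: each step up is 1,000 times the last,
# and every byte is eight bits.

BITS_PER_BYTE = 8

UNITS = {
    "kilobyte (KB)": 1_000,
    "megabyte (MB)": 1_000_000,
    "gigabyte (GB)": 1_000_000_000,
    "terabyte (TB)": 1_000_000_000_000,
    "petabyte (PB)": 1_000_000_000_000_000,
}

for name, size_in_bytes in UNITS.items():
    print(f"1 {name} = {size_in_bytes:,} bytes "
          f"= {size_in_bytes * BITS_PER_BYTE:,} bits")
```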

So why does network bandwidth get measured in megabits, while storage gets measured in megabytes? There are plenty of theories about why. I haven’t found a definitive answer yet, but the most reasonable explanation I’ve heard from networking engineers is that a bit is the lowest common denominator, if you will: the smallest meaningful unit for describing network transfer speed, as in bits per second. It’s like measuring the flow rate of the plumbing in your house.

As to why data is assembled in bytes, Wikipedia cites the popularity of IBM’s System/360 as one likely reason: The computer used a then-novel 8-bit data format. IBM defined computing for a generation of engineers, so it’s the standard that moved forward. The old marketing adage was, “No one ever got fired for buying IBM.”

Plausible? Absolutely. Is it the only reason? Well, Wikipedia hardly presents an authoritative case, and you’ll find a lot of conjecture but few hard answers if you look elsewhere on the internet.

Which means aliens are behind it all, as far as I’m concerned.

What Does It All Mean?

Anyway, here we stand today, with the delineation clear: bandwidth is measured in bits, storage capacity in bytes. Simple enough, but it can get confusing when we mix the two. If your network upload speed is 8 Mbps, the absolute most you can upload is 1MB of data from your hard drive per second. Megabits versus megabytes: keep the distinction in mind as you watch how fast data moves over your network or to the internet.
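
Here’s that arithmetic as a minimal Python sketch. The 8 Mbps link and the 500MB file are illustrative numbers, and the estimate ignores real-world protocol overhead.

```python
# Mixing the units: divide a link speed in megabits per second by eight
# to get megabytes per second, then estimate a best-case transfer time.

def mbps_to_mb_per_second(mbps: float) -> float:
    """8 megabits per second moves at most 1 megabyte per second."""
    return mbps / 8

def transfer_seconds(file_size_mb: float, link_speed_mbps: float) -> float:
    """Best-case time to move a file, ignoring protocol overhead."""
    return file_size_mb / mbps_to_mb_per_second(link_speed_mbps)

upload_speed_mbps = 8   # the example from the paragraph above
file_size_mb = 500      # e.g., a 500MB video (illustrative)

print(f"{upload_speed_mbps} Mbps = {mbps_to_mb_per_second(upload_speed_mbps)} MB/s")
print(f"A {file_size_mb}MB file takes about "
      f"{transfer_seconds(file_size_mb, upload_speed_mbps):,.0f} seconds")
```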


About Peter Cohen

Peter will never give you up, never let you down, never run around or desert you. Follow Peter on his web site: peter-cohen.com | Twitter: @flargh | LinkedIn: Peter Cohen