The difference between megabytes and megabits

Megabit (Mbit) and megabyte (MB) sound confusingly similar, which is why the two are so often mixed up. TECHBOOK explains what the terms mean and how to use them correctly.

Mbit and MB, megabit and megabyte: you come across these two terms again and again when dealing with technology. And since smartphones are now an essential part of almost everyone's everyday life, most people will have encountered the words at some point. Although the two terms sound similar and share the prefix "mega", they mean different things.

What is a megabyte (MB)?

Simply put, megabytes describe the size of a digital file. The basic unit of this system of measurement is the byte. At the beginning of computer history, files were just a few bytes in size; as technology advanced, file sizes grew as well. That is why 1024 bytes are combined into one kilobyte (kB) and 1024 kilobytes into one megabyte (MB), much like meters add up to kilometers.
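To make these ratios concrete, here is a minimal Python sketch of the conversion described above, using the 1024-based factors from the text (1 kB = 1024 bytes, 1 MB = 1024 kB); the function name and the example file size are purely illustrative.

```python
def bytes_to_megabytes(size_in_bytes: int) -> float:
    """Convert a file size in bytes to megabytes (1 MB = 1024 * 1024 bytes)."""
    kilobytes = size_in_bytes / 1024   # bytes -> kilobytes
    megabytes = kilobytes / 1024       # kilobytes -> megabytes
    return megabytes

# Example: a photo of 3,145,728 bytes is exactly 3 MB.
print(bytes_to_megabytes(3_145_728))  # 3.0
```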

Nowadays, photos taken with a smartphone are several MB in size and text files several kB, so files in the plain byte range hardly ever appear in the everyday life of ordinary PC users. Megabytes and the next larger unit, the gigabyte (1 GB = 1024 MB), are more present than ever. Whether it is the data allowance on a smartphone, the download size of games or the storage capacity of hard drives, in all these cases we are talking about bytes, or rather megabytes, gigabytes and even terabytes.


What is a megabit (Mbit)?

Like megabytes, megabits (or bits) are a unit of measurement for an amount of data. The ratio is 1:8, because in today's computer systems one byte usually consists of 8 bits. In contrast to bytes, bits are also used to describe information content. "Bit" is short for "binary digit", a digit that represents the smallest possible distinction between two states.

Computer information always boils down to two states: "on" and "off", or 1 and 0. A bit is the carrier of exactly this kind of information. Bits are combined into bytes so that even the smallest amounts of data can be stored. Historically, 8 bits were required to represent a single character of text, which is why this grouping became the standard for a byte. A bit is the smallest unit of information and storage, while a byte is the smallest practical amount of data and is therefore used to describe the available space on a storage medium.
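As a short Python illustration of this relationship (not part of the original article), the character "A" occupies exactly one byte, which in turn consists of eight binary digits:

```python
character = "A"
byte_value = character.encode("ascii")   # stored as one byte: b'A'
bits = format(byte_value[0], "08b")      # the same byte written out as 8 bits
print(len(byte_value), bits)             # 1 01000001
```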

While memory sizes are specified in bytes, as already mentioned, bits are mainly used in connection with networks to indicate data transmission speeds, and then always in bits per second, because bytes are sent through the Internet bit by bit.

In everyday life, Internet users usually encounter bits in connection with the speed of DSL connections, here as Mbit/s or Mbps, i.e. megabits per second. This is where the greatest potential for confusing megabytes and megabits lies: a "DSL 50,000" line promises 50 Mbit/s, not 50 megabytes per second, as some still assume. Mbit/s first has to be converted to be really meaningful for most people.


How do I convert Mbit/s to MB/s?

Very few people can picture what megabits per second mean in practice, so converting to megabytes per second makes sense. Since a byte consists of eight bits, this is not too difficult: simply divide the Internet speed (e.g. 100 Mbit/s) by eight and you get the maximum amount of data, in megabytes, that the line can download per second under ideal conditions. In this example, that would be a maximum of 12.5 megabytes per second.
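The same rule of thumb as a minimal Python sketch; the function and variable names are illustrative only, and the result is the theoretical maximum, not the speed you will see in practice.

```python
def mbit_per_s_to_mb_per_s(speed_mbit_s: float) -> float:
    """Convert a connection speed from Mbit/s to MB/s (1 byte = 8 bits)."""
    return speed_mbit_s / 8

print(mbit_per_s_to_mb_per_s(100))  # 12.5  (a 100 Mbit/s line)
print(mbit_per_s_to_mb_per_s(50))   # 6.25  (a "DSL 50,000" line)
```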
