Difference Between Megabyte and Megabit

Megabit and megabyte are two units used to measure amounts of information in computer and network systems.

Megabit

The bit is the basic unit of information in computing and telecommunications. The name is a contraction of "binary digit". A bit can assume only two values, 0 and 1. The megabit is a multiple of this basic unit.

Mega is the prefix for one million (×10^6) in the International System of Units (SI). Therefore, a megabit is equal to one million bits. The megabit is denoted by the symbols Mbit or Mb.

1 megabit (Mb) = 1,000,000 bits = 10^6 bits = 1,000 kilobits (kb)

Megabits are most often used to express the data transfer rates of computer networks; 100 Mbps means 100 megabits per second.
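Because rates are quoted in megabits per second while file sizes are usually given in megabytes, converting between the two is a frequent task. The following is a minimal Python sketch of that conversion; the file size and link speed are assumed example values chosen only for illustration:

    # A minimal sketch: estimating transfer time from a rate in Mbps.
    # The 500 MB file size and 100 Mbps link speed are assumed example values.

    BITS_PER_BYTE = 8

    def mbps_to_mb_per_s(rate_mbps: float) -> float:
        """Convert megabits per second to megabytes per second."""
        return rate_mbps / BITS_PER_BYTE

    file_size_mb = 500.0     # file size in megabytes (MB)
    link_speed_mbps = 100.0  # link speed in megabits per second (Mbps)

    speed_mb_per_s = mbps_to_mb_per_s(link_speed_mbps)  # 12.5 MB/s
    transfer_time_s = file_size_mb / speed_mb_per_s     # 40.0 seconds

    print(f"{link_speed_mbps} Mbps = {speed_mb_per_s} MB/s")
    print(f"A {file_size_mb} MB file takes about {transfer_time_s} s to transfer")

At 100 Mbps the effective throughput is only 12.5 megabytes per second, which is why a 100 megabit connection does not transfer 100 megabytes of data each second.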

Megabyte

A byte, as used in computing and telecommunications, is a group of 8 bits.

As noted earlier, mega denotes a multiple of one million; therefore, a megabyte is one million bytes. Since each byte contains 8 bits, a megabyte is equivalent to 8 million bits, or 8 megabits. The symbols MB and MByte are used to denote the megabyte.

1 megabyte (MB) = 1,000,000 bytes = 10^6 bytes = 8 megabits = 8 × 10^6 bits
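These relationships are simple multiplications, as the short Python sketch below makes explicit (the function names are chosen for this example):

    # A minimal sketch of the megabyte conversions above (SI decimal units).

    BITS_PER_BYTE = 8
    BYTES_PER_MEGABYTE = 10**6  # mega = 10^6

    def megabytes_to_bytes(mb: float) -> float:
        return mb * BYTES_PER_MEGABYTE

    def megabytes_to_megabits(mb: float) -> float:
        return mb * BITS_PER_BYTE

    print(megabytes_to_bytes(1))     # 1000000 (bytes)
    print(megabytes_to_megabits(1))  # 8 (megabits)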

What is the difference between Megabyte and Megabit?

• 1 megabyte is equal to 8 megabits.

• 1 megabit is 1/8 of a megabyte, or 125 kilobytes (the sketch after this list checks these conversions).

• Megabyte uses the symbol MB, with an upper-case B; megabit uses the symbol Mb, with a lower-case b.

• The base units are the bit and the byte; mega is only an SI (International System of Units) prefix denoting a multiple of one million.
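As a closing sanity check, the conversions listed above can be verified in a few lines of Python (a sketch assuming the decimal SI definitions used throughout this article):

    # Sanity-checking the bullet points above (SI decimal definitions).

    BITS_PER_BYTE = 8
    BITS_PER_MEGABIT = 10**6
    BYTES_PER_KILOBYTE = 10**3

    megabyte_in_bits = 10**6 * BITS_PER_BYTE  # 1 MB = 8 * 10^6 bits

    assert megabyte_in_bits == 8 * BITS_PER_MEGABIT                       # 1 MB = 8 Mb
    assert BITS_PER_MEGABIT == megabyte_in_bits / 8                       # 1 Mb = 1/8 MB
    assert BITS_PER_MEGABIT / BITS_PER_BYTE == 125 * BYTES_PER_KILOBYTE   # 1 Mb = 125 kB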