
Megabyte

A megabyte is a unit of measurement for computer storage, memory and information; while its exact definition varies, it is approximately equal to one million bytes. The abbreviation for megabyte is MB.

Three definitions of 1 MB are in use (compared numerically in the sketch after the list):

  1. 1 000 000 bytes or 10⁶ bytes - this is the definition used by telecommunications engineers and storage manufacturers among others. It is consistent with the SI prefix "mega" and is endorsed by international standards bodies.
  2. 1 048 576 bytes - 1024 times 1024, or 2²⁰ bytes. This definition is often used in computer science and computer programming, when talking about the size of files or computer memory. The reason is that computers use the binary numeral system internally.
  3. 1 024 000 bytes - 1024 times 1000. This is an (erroneous) mixed definition used by floppy disk manufacturers; the "1.44 MB" floppy disk, for example, actually holds 1.44 × 1 024 000 = 1 474 560 bytes.
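
A minimal Python sketch comparing the three values (variable names are illustrative):

    MB_SI     = 10**6         # definition 1: SI prefix, 1 000 000 bytes
    MB_BINARY = 2**20         # definition 2: 1024 * 1024 = 1 048 576 bytes
    MB_FLOPPY = 1024 * 1000   # definition 3: 1 024 000 bytes

    for name, value in [("SI", MB_SI), ("binary", MB_BINARY), ("floppy", MB_FLOPPY)]:
        print(f"{name:>6}: {value:>9} bytes ({value / MB_SI:.4f} million)")

The binary megabyte is about 4.9% larger than the SI megabyte, and the floppy megabyte about 2.4% larger.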

The definitions of the kilobyte (either 1000 = 10³ or 1024 = 2¹⁰ bytes) and of the gigabyte (either 10⁹ or 2³⁰ bytes) have similar ambiguities.
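
The gap between the decimal and binary readings grows at each prefix step, as this short Python sketch shows:

    # Percentage by which the binary value (2 to the 10n) exceeds
    # the decimal value (10 to the 3n) for each prefix.
    for n, name in enumerate(["kilo", "mega", "giga", "tera"], start=1):
        decimal, binary = 10**(3 * n), 2**(10 * n)
        print(f"{name}: +{100 * (binary / decimal - 1):.1f}%")

This prints about 2.4% for kilo, 4.9% for mega, 7.4% for giga, and 10.0% for tera.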

To reduce the confusion and to distinguish between meanings (1) and (2) above, the International Electrotechnical Commission (IEC) adopted an international standard in December 1998 that reserves the term megabyte for 10⁶ bytes and introduces the new term mebibyte (abbreviated MiB) for 2²⁰ bytes. Similarly, the terms kibibyte (KiB, equal to 2¹⁰ bytes) and gibibyte (GiB, equal to 2³⁰ bytes) were introduced. These naming conventions, while strongly endorsed by the IEEE and the CIPM, have not yet been widely accepted and are simply ignored by most people.
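
For illustration, here is a small Python formatter using the IEC binary prefixes; the function name and output format are hypothetical, not taken from any standard library:

    def format_iec(n_bytes):
        # Walk down from the largest prefix to the smallest.
        for prefix, power in [("GiB", 30), ("MiB", 20), ("KiB", 10)]:
            if n_bytes >= 2**power:
                return f"{n_bytes / 2**power:.2f} {prefix}"
        return f"{n_bytes} B"

    print(format_iec(2**20))   # 1.00 MiB -- exactly 1 048 576 bytes
    print(format_iec(10**6))   # 976.56 KiB -- one SI megabyte, in binary terms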

Note the distinction between a megabyte (about one million bytes) and a megabit (about one million bits). A megabit is abbreviated as Mbit (preferably) or as Mb, with a lower-case "b". There are eight bits in one byte, so a megabyte (MB) is eight times as large as a megabit (Mb or Mbit). Megabits are often used where a serial bitstream is the item of interest, particularly in communications and in specifying the internal data rate of a computer hard drive. In these contexts, one megabit is almost invariably defined as 10⁶ bits. In practice, the abbreviation Mb is frequently encountered as a mistaken notation for MB; in most cases, the context indicates which unit of measure was intended.
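
As a worked example of the eight-to-one relationship, this Python sketch estimates how long an idealized 8 Mbit/s link takes to transfer a 10 MB file, with both units read in their decimal senses (the figures are illustrative, and all protocol overhead is ignored):

    file_bytes   = 10 * 10**6        # 10 MB, decimal sense
    link_bits_ps = 8 * 10**6         # 8 Mbit/s, decimal sense
    bits_to_send = file_bytes * 8    # one byte = eight bits
    print(f"{bits_to_send / link_bits_ps:.0f} s")   # -> 10 s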

Similarly, a Gb or Gbit is a gigabit and a kb or kbit is a kilobit; these abbreviations, too, are often written with the wrong case of "b".
