
Gigabyte

A gigabyte is a unit of computer data equal to approximately one thousand million bytes (one billion bytes in American usage), that is, roughly 1024 or 1000 megabytes. The abbreviation for gigabyte is GB.

Because of irregularities in the definition and usage of the kilobyte, the exact number of bytes in a gigabyte can be either of the following:

  1. 1 073 741 824 bytes - 1024 × 1024 × 1024, or 2³⁰. This is the definition used in computer science, computer programming and nearly all computer operating systems.
  2. 1 000 000 000 bytes, or 10⁹ - this is the SI definition used by telecommunications engineers and some storage manufacturers (the two values are compared in the sketch below).
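As a minimal sketch (in Python, with illustrative names) of how far apart these two definitions are:

  # Compare the two common definitions of a gigabyte.
  BINARY_GB = 2 ** 30    # meaning (1): 1 073 741 824 bytes
  DECIMAL_GB = 10 ** 9   # meaning (2): 1 000 000 000 bytes (SI)

  difference = BINARY_GB - DECIMAL_GB
  print(f"{difference:,} bytes ({difference / DECIMAL_GB:.1%} more)")
  # Output: 73,741,824 bytes (7.4% more)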

See integral data type.

A terabyte is equal to 1024 or 1000 gigabytes, depending on which definition is used; the gap between the binary and decimal values widens with each step up (2⁴⁰ bytes is nearly 10% more than 10¹² bytes).

In speech, gigabyte is often shortened to gig, as in "This hard drive has 10 gigs". The initial g in giga- is usually pronounced hard, as in girl, not soft, as in giant; in the 1985 movie Back to the Future, however, it was pronounced "jiga". A gigabit, which should not be confused with a gigabyte, is one eighth of a gigabyte and is mainly used to describe bandwidth; for example, 2 gigabit/s is the speed of current Fibre Channel interfaces.
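As a minimal sketch of the bit-to-byte conversion just described (the function name is illustrative, and bandwidth is assumed to use decimal SI units, as is conventional):

  # Convert link speed from gigabits per second to gigabytes per second.
  def gigabits_to_gigabytes(gigabits: float) -> float:
      return gigabits / 8  # a byte is 8 bits

  print(gigabits_to_gigabytes(2.0))  # 2 gigabit/s Fibre Channel -> 0.25 GB/s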

To clarify meaning (1) above, the International Electrotechnical Commission (IEC), a standards body, in 1997 proposed short contractions of the International System of Units (SI) prefixes with the word "binary". Under this scheme, meaning (1) would be called a gibibyte (GiB). This naming convention has not yet been widely adopted.
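A small sketch of how the same capacity would be labelled under the two conventions (the helper name is hypothetical):

  # Report one capacity in decimal gigabytes (GB) and binary gibibytes (GiB).
  def describe(capacity_bytes: int) -> str:
      gb = capacity_bytes / 10 ** 9    # SI definition, meaning (2)
      gib = capacity_bytes / 2 ** 30   # IEC binary prefix, meaning (1)
      return f"{gb:.2f} GB = {gib:.2f} GiB"

  print(describe(10 * 10 ** 9))  # a "10 GB" drive is about 9.31 GiB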
