
Zero

This article is about the number zero, the first and most common use of the word. For other, extended uses, see zero (disambiguation).

Zero (0) is a number that precedes the number one and follows negative numbers.

Zero means nothing, null, void or an absence of value. For example, if the number of your brothers is zero, then you have no brothers. If the difference between the number of pieces in two piles is zero, it means the two piles have the same number of pieces.

In certain calendars it is common usage to omit the year zero when extending the calendar to years prior to its introduction: see proleptic Gregorian calendar and proleptic Julian calendar.

Table of contents
1 History
2 Mathematics
3 Computer science
4 See also
5 References

History

The numeral or digit zero is used in positional numeral systems, in which the value of a digit depends on its position. Successive positions correspond to successively higher powers of the base, so the digit zero is used to mark an empty position and give their correct values to the preceding and following digits.
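For example, in base ten the zero in 205 holds the tens place open:

```latex
205 = 2 \cdot 10^2 + 0 \cdot 10^1 + 5 \cdot 10^0
```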

By about 300 BCE the Babylonians, who had long used a positional sexagesimal numeral system, had begun to use two slanted wedges to mark an empty position. However, this symbol was a mere placeholder and had no function beyond that. The use of zero as a number in its own right was a relatively late addition to mathematics, first introduced by Indian mathematicians. An early study of zero by Brahmagupta dates to 628 CE.

Zero was also used as a numeral in Pre-Columbian Mesoamerica, by the Olmec and subsequent civilizations; see also: Maya numerals.

Mathematics

Zero (0) is both a number and a numeral. The natural number following zero is one and no natural number precedes zero. Zero may or may not be counted as a natural number, depending on the definition of natural numbers.

In set theory, the number zero is the size of the empty set: if you do not have any apples, then you have zero apples. In fact, in certain axiomatic developments of mathematics from set theory, zero is defined to be the empty set.
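In the von Neumann construction, for instance, each natural number is defined as the set of all smaller natural numbers, so zero, having no predecessors, is the empty set:

```latex
0 = \varnothing, \qquad 1 = \{0\} = \{\varnothing\}, \qquad 2 = \{0, 1\} = \{\varnothing, \{\varnothing\}\}
```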

The following are some basic rules for dealing with the number zero. These rules apply for any complex number x, unless otherwise stated.
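Among the most commonly cited such rules are the identity laws for addition and the absorbing law for multiplication:

```latex
\begin{aligned}
x + 0 &= x \\
x - 0 &= x \\
x \cdot 0 &= 0 \\
0 / x &= 0 \quad (x \neq 0) \\
x / 0 &\;\text{is undefined} \\
x^0 &= 1 \quad (x \neq 0)
\end{aligned}
```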

Extended use of zero in mathematics

Computer science

Counting from 1 or 0?

Human beings usually count things from one, not zero, yet in computer science zero has become the usual starting point. In many older programming languages, such as Fortran, an array is indexed from 1 by default, which matches everyday counting. As programming languages have developed, it has become more common for arrays to be indexed from 0 by default. The reason is that a zero-based index is exactly the element's offset from the start of the array; with a one-based index, one must be subtracted on every access to obtain the correct offset, for example when computing the memory location of a specific element, as in the sketch below.
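A minimal C sketch of that offset arithmetic (the array and its contents are purely illustrative):

```c
#include <stdio.h>

int main(void) {
    int a[3] = {10, 20, 30};

    /* Zero-based indexing: a[i] is the element i positions past
       the start of the array, i.e. a[i] == *(a + i). */
    printf("%d\n", a[0]);     /* first element, offset 0 */
    printf("%d\n", *(a + 2)); /* third element, offset 2 */

    /* With one-based indexing, every access would first have to
       compute the offset as i - 1. */
    return 0;
}
```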

Distinguishing Zero from O

If your zero is centre-dotted and letter-O is not, or if letter-O looks almost rectangular but zero looks more like an American football stood on end (or the reverse), you're probably looking at a modern character display (though the dotted zero seems to have originated as an option on IBM 3270 controllers). If your zero is slashed but letter-O is not, you're probably looking at an old-style ASCII graphic set descended from the default typewheel on the venerable ASR-33 Teletype (Scandinavians, for whom Ø is a letter, curse this arrangement).

If letter-O has a slash across it and the zero does not, your display is tuned for a very old convention used at IBM and a few other early mainframe makers (Scandinavians curse this arrangement even more, because it means two of their letters collide). Some Burroughs/Unisys equipment displays a zero with a reversed slash. And yet another convention common on early line printers left zero unornamented but added a tail or hook to the letter-O so that it resembled an inverted Q or cursive capital letter-O.

The typeface used on European vehicle registration plates distinguishes the two symbols by making the O rather egg-shaped and the zero more rectangular, but above all by breaking the stroke of the zero at the upper right, so that its outline is no longer a closed curve.

"Zero" as a verb

In computing, zero is the default digit, denoting none and serving as a common initial value. To zero (also zeroise or zeroize) a set of data means to set every bit in the data to zero (off). The term is usually applied to small pieces of data, such as bits or words, especially in the construction "zero out".
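In C, for instance, a buffer is commonly zeroed with memset; a minimal sketch (the buffer name and size are illustrative):

```c
#include <string.h>

void clear_buffer(void) {
    unsigned char buf[64];

    /* Set every byte, and therefore every bit, of buf to zero. */
    memset(buf, 0, sizeof buf);
}
```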

Zero also means to erase, to discard all data from. This is often said of disks and directories, where "zeroing" need not involve actually writing zeroes throughout the area being zeroed. One may speak of something being "logically zeroed" rather than "physically zeroed".

In the C programming language, a null pointer is written in source code as the constant 0 (or the macro NULL). It compares equal to zero, although the machine address it represents need not actually be zero.
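A short sketch of the usual null-pointer test in C:

```c
#include <stddef.h>
#include <stdio.h>

int main(void) {
    int *p = NULL; /* p points at no object */

    /* The null pointer compares equal to 0, so both tests work. */
    if (p == NULL)
        printf("p is a null pointer\n");
    if (!p)
        printf("p is a null pointer\n");
    return 0;
}
```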

See also

References


Sections of this article contain material from FOLDOC, used with permission.
