zero

zero, that number which, when added to any number, leaves the latter unchanged; its symbol is 0. The introduction of zero into the decimal system was the most significant achievement in the development of a number system in which calculation with large numbers was feasible. Without it, modern astronomy, physics, and chemistry as we know them would have been unthinkable. The lack of such a symbol was one of the serious drawbacks of Greek mathematics. Its existence in the West is probably due to the Arabs, who, having obtained it from India, passed it on to European mathematicians in the latter part of the Middle Ages. The Maya of Central America and probably the Babylonians also invented zero, but they used the symbol as a placeholder rather than as a true number; the Indians were the first to use zero as a number.

With the extension of the number system to negative as well as positive numbers, zero became the name for that position on the scale of integers between −1 and +1. It is used in this sense in speaking of zero degrees on the Fahrenheit and Celsius temperature scales; “absolute zero” is a term used by physicists and chemists to indicate the theoretically lowest possible temperature—a use reminiscent of zero as a symbol for nothing.

Unlike other numbers, zero has certain special properties in connection with the four fundamental operations. By definition, zero added to or subtracted from any number leaves the number unchanged. Any number multiplied by zero gives zero, and zero divided by any nonzero number is zero. Division by zero, however, is undefined; i.e., there is no number that can be the value of a number divided by zero.
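These rules may be summarized in modern algebraic notation; the letter a, standing for an arbitrary number, is introduced here purely for illustration:

\[
a + 0 = a, \qquad a - 0 = a, \qquad a \cdot 0 = 0, \qquad \frac{0}{a} = 0 \ \ (a \neq 0), \qquad \frac{a}{0} \ \text{is undefined.}
\]

The last rule follows from the others: if a divided by 0 were some number b, then a would have to equal b multiplied by 0, which is 0; hence no such quotient can exist when a is not zero.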

See C. Seife, Zero (2000).

The Columbia Electronic Encyclopedia, 6th ed. Copyright © 2024, Columbia University Press. All rights reserved.
