Year 2000 problem, Y2K problem, or millennium bug, in computer science, a design flaw in the hardware or software of a computer that caused erroneous results when working with dates beyond Dec. 31, 1999. In the 1960s and 70s, programmers who designed computer systems dropped the first two digits of a year when storing or processing dates in order to save what was then expensive and limited memory; such a system recorded the year 2000 as 00 and could not distinguish it from 1900. In sorting, comparison, and arithmetic operations, the year 2000 would be treated as if it were equivalent to 0 rather than 100, causing incorrect results. The algorithm used to calculate leap years was also in some cases invalid (the year 2000, unlike most century years, is a leap year because it is divisible by 400), creating an additional problem in calculating the correct date after Feb. 28, 2000. Because the designers of such computer systems expected them to be replaced before the beginning of the year 2000, the use of two-digit dates was not regarded as a problem. Thousands of older computer systems, called legacy systems, were still in use in the 1990s, however, particularly in the finance and insurance industries, creating a potential operational and financial nightmare, which was termed Doomsday 2000. In the late 1990s business, government, and other computer users spent thousands of hours and millions of dollars to correct the Year 2000 problem, and only minor problems were experienced after Jan. 1, 2000.
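The two failure modes described above can be sketched in a short example. This is an illustration only; the function names are hypothetical and do not come from any particular legacy system.

```python
def two_digit_elapsed(start_yy: int, end_yy: int) -> int:
    """Years elapsed, computed from two-digit years as a legacy
    system might; the year 2000 is stored as 0, not 100."""
    return end_yy - start_yy

def is_leap_buggy(year: int) -> bool:
    """A flawed shortcut some programs used: it omits the
    divisible-by-400 exception, so it wrongly rejects 2000."""
    return year % 4 == 0 and year % 100 != 0

def is_leap_correct(year: int) -> bool:
    """Full Gregorian rule: divisible by 4, except century years,
    which are leap only when divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Arithmetic on two-digit years: 1970 to 2000 should be 30 years,
# but with 2000 stored as 00 the result is negative.
print(two_digit_elapsed(70, 0))    # prints -70, not 30

# Leap-year handling: 2000 is a leap year, but the shortcut says no,
# so dates after Feb. 28, 2000 would be off by a day.
print(is_leap_buggy(2000))         # prints False (wrong)
print(is_leap_correct(2000))       # prints True
print(is_leap_correct(1900))       # prints False (1900 was not a leap year)
```

The negative result in the first case is exactly why sorting and comparison also failed: any record dated 00 sorted before every record from the 1900s.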
The Columbia Electronic Encyclopedia, 6th ed. Copyright © 2012, Columbia University Press. All rights reserved.