entropy (ĕnˈtrəpē), quantity specifying the amount of disorder or randomness in a system bearing energy or information. Originally defined in thermodynamics in terms of heat and temperature, entropy indicates the degree to which a given quantity of thermal energy is available for doing useful work—the greater the entropy, the less available the energy.

For example, consider a system composed of a hot body and a cold body; this system is ordered because the faster, more energetic molecules of the hot body are separated from the less energetic molecules of the cold body. If the bodies are placed in contact, heat will flow from the hot body to the cold one. This heat flow can be utilized by a heat engine (a device that converts thermal energy into mechanical energy, or work), but once the two bodies have reached the same temperature, no more work can be done. Furthermore, the combined lukewarm bodies cannot unmix themselves into hot and cold parts in order to repeat the process. Although no energy has been lost by the heat transfer, the energy can no longer be used to do work. Thus the entropy of the system has increased.

According to the second law of thermodynamics, during any process the change in entropy of a system and its surroundings is either zero or positive; in other words, the entropy of the universe as a whole tends toward a maximum. This means that although energy cannot vanish, because of the law of conservation of energy (see conservation laws), it tends to be degraded from useful forms to useless ones.

It should be noted that the second law of thermodynamics is statistical rather than exact; thus there is nothing to prevent the faster molecules from separating from the slower ones. However, such an occurrence is so improbable as to be practically impossible. In information theory the term entropy denotes the average information content, or uncertainty, of the data in a message: the more unpredictable the message, the greater its entropy.
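The hot-body/cold-body example can be made quantitative. The sketch below, a simplified model assuming two identical bodies of constant heat capacity, computes the entropy change when they equilibrate at the average of their temperatures; the function name and parameters are illustrative, not from the original article.

```python
import math

def entropy_change(t_hot, t_cold, heat_capacity=1.0):
    """Total entropy change (in units of the heat capacity) when two
    identical bodies at temperatures t_hot and t_cold (kelvin) are
    placed in contact and reach a common final temperature."""
    t_final = (t_hot + t_cold) / 2
    # For a body of constant heat capacity C, dS = C * ln(T_final / T_initial).
    ds_hot = heat_capacity * math.log(t_final / t_hot)    # negative: the hot body cools
    ds_cold = heat_capacity * math.log(t_final / t_cold)  # positive: the cold body warms
    return ds_hot + ds_cold

# The cold body's entropy gain outweighs the hot body's loss,
# so the total is positive, as the second law requires.
print(entropy_change(400.0, 200.0))
```

The result is positive for any two unequal temperatures, and zero only when the bodies start at the same temperature—the quantitative content of the statement that entropy never decreases in this process.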
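The information-theoretic sense of entropy can likewise be illustrated. The sketch below computes Shannon's formula H = −Σ p·log₂(p), the average number of bits of information per symbol for a given probability distribution; the function name is illustrative.

```python
import math

def shannon_entropy(probabilities):
    """Average information per symbol, in bits: H = -sum(p * log2(p)).
    A perfectly predictable message has entropy 0; maximum entropy
    occurs when every outcome is equally likely."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))  # a fair coin toss: 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # a biased coin: less than 1 bit
print(shannon_entropy([1.0]))       # a certain outcome: 0.0 bits
```

As with thermal entropy, greater disorder (less predictability) means greater entropy; a message that is entirely predictable carries no information at all.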