entropy

[en-truh-pee] /ˈɛn trə pi/
noun
1.
Thermodynamics.
  1. (on a macroscopic scale) a function of thermodynamic variables, such as temperature, pressure, or composition, that is a measure of the energy that is not available for work during a thermodynamic process. A closed system evolves toward a state of maximum entropy.
  2. (in statistical mechanics) a measure of the randomness of the microscopic constituents of a thermodynamic system. Symbol: S.
2.
(in data transmission and information theory) a measure of the loss of information in a transmitted signal or message.
3.
(in cosmology) a hypothetical tendency for the universe to attain a state of maximum homogeneity in which all matter is at a uniform temperature (heat death).
4.
a doctrine of inevitable social decline and degeneration.
Origin
< German Entropie (1865); see en-², -tropy
Related forms
entropic
[en-troh-pik, -trop-ik] /ɛnˈtroʊ pɪk, -ˈtrɒp ɪk/,
adjective
entropically, adverb
Examples from the web for entropy
  • My opinion is that entropy explains the state of my house.
  • Physics tells us energy is conserved, but entropy increases.
  • Scientists may recall in this connection the second principle of thermodynamics — the law of growing entropy.
  • Life tries to push entropy in the opposite direction.
  • This time of year, I nearly come to terms with entropy.
  • All systems are in a state of entropy and disintegration.
  • Even before the strike, the country had entered an advanced state of entropy.
  • At each stage of development, humans ask the mighty machine if it knows how to reverse entropy.
  • Some go as far as to say that's what life does: surf on entropy.
  • The second holds that the default direction of everything is toward entropy.
British Dictionary definitions for entropy

entropy

/ˈɛntrəpɪ/
noun (pl) -pies
1.
a thermodynamic quantity that changes in a reversible process by an amount equal to the heat absorbed or emitted divided by the thermodynamic temperature. It is measured in joules per kelvin. Symbol: S. See also law of thermodynamics
2.
a statistical measure of the disorder of a closed system, expressed by S = k log P + c, where P is the probability that a particular state of the system exists, k is the Boltzmann constant, and c is another constant (a numerical sketch follows these definitions)
3.
lack of pattern or organization; disorder
4.
a measure of the efficiency of a system, such as a code or language, in transmitting information
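
The two quantitative senses above can be made concrete with a short Python sketch. All numeric values below are hypothetical illustrations, not part of the entry, and the constant c in the statistical formula is simply set to zero.

import math

BOLTZMANN_K = 1.380649e-23   # Boltzmann constant, in joules per kelvin

# Sense 1: in a reversible process the entropy change equals the heat
# absorbed divided by the thermodynamic temperature (delta S = Q / T).
heat_absorbed = 500.0    # joules (hypothetical)
temperature = 300.0      # kelvin (hypothetical)
delta_S = heat_absorbed / temperature
print(round(delta_S, 3))           # 1.667 J/K

# Sense 2: the statistical form S = k log P + c, with P the probability
# that the particular state exists; the additive constant c fixes the
# zero point and is taken as 0 here.
P = 1e-20                # hypothetical probability of the state
c = 0.0
S = BOLTZMANN_K * math.log(P) + c
print(S)                           # about -6.4e-22 J/K relative to c
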
Word Origin
C19: from en-² + -trope
Word Origin and History for entropy
n.

1868, from German Entropie "measure of the disorder of a system," coined 1865 (on analogy of Energie) by German physicist Rudolph Clausius (1822-1888) from Greek entropia "a turning toward," from en "in" (see en- (2)) + trope "a turning" (see trope). Related: Entropic.

entropy in Medicine

entropy en·tro·py (ěn'trə-pē)
n.

  1. For a closed thermodynamic system, a quantitative measure of the amount of thermal energy not available to do work.

  2. A measure of the disorder or randomness in a closed system.


en·tro'pic (ěn-trō'pĭk, -trŏp'ĭk) adj.
entropy in Science
entropy
  (ěn'trə-pē)   
A measure of the amount of energy in a physical system not available to do work. As a physical system becomes more disordered, and its energy becomes more evenly distributed, that energy becomes less able to do work. For example, a car rolling along a road has kinetic energy that could do work (by carrying or colliding with something, for example); as friction slows it down and its energy is distributed to its surroundings as heat, it loses this ability. The amount of entropy is often thought of as the amount of disorder in a system. See also heat death.
entropy in Culture
entropy [(en-truh-pee)]

A measure of the disorder of any system, or of the unavailability of its heat energy for work. One way of stating the second law of thermodynamics — the principle that heat will not flow from a cold to a hot object spontaneously — is to say that the entropy of an isolated system can, at best, remain the same and will increase for most systems. Thus, the overall disorder of an isolated system must increase.

Note: Entropy is often used loosely to refer to the breakdown or disorganization of any system: “The committee meeting did nothing but increase the entropy.”
Note: In the nineteenth century, a popular scientific notion suggested that entropy was gradually increasing, and therefore the universe was running down and eventually all motion would cease. When people realized that this would not happen for billions of years, if it happened at all, concern about this notion generally disappeared.
entropy in Technology

theory
A measure of the disorder of a system. Systems tend to go from a state of order (low entropy) to a state of maximum disorder (high entropy).
The entropy of a system is related to the amount of information it contains. A highly ordered system can be described using fewer bits of information than a disordered one. For example, a string containing one million "0"s can be described using run-length encoding as [("0", 1000000)] whereas a string of random symbols (e.g. bits, or characters) will be much harder, if not impossible, to compress in this way.
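A minimal Python sketch of that comparison, using the standard itertools module; the (character, count) pairs simply mirror the example above and are not a standard encoding format.

from itertools import groupby
import random

def run_length_encode(s):
    # Collapse each run of repeated characters into a (character, count) pair.
    return [(ch, len(list(run))) for ch, run in groupby(s)]

ordered = "0" * 1000000
print(run_length_encode(ordered))          # [('0', 1000000)] -- one short pair

random.seed(0)
disordered = "".join(random.choice("01") for _ in range(30))
print(len(run_length_encode(disordered)))  # many pairs, so little or no saving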
Shannon's formula gives the entropy H(M) of a message M in bits:
H(M) = -log2 p(M)
where p(M) is the probability of message M.
(1998-11-23)
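
A small Python sketch of the formula above; the probabilities are hypothetical examples, not from the entry.

import math

def information_bits(p):
    # H(M) = -log2 p(M): the information, in bits, carried by a message of probability p.
    return -math.log2(p)

print(information_bits(0.5))      # 1.0 bit for a message sent half the time
print(information_bits(1 / 256))  # 8.0 bits for a 1-in-256 message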