Entropy (noun) Definition, Meaning & Examples

noun
  1. Thermodynamics.
    • (on a macroscopic scale) a function of thermodynamic variables, such as temperature, pressure, or composition, differing from energy in that energy is the capacity to do work while entropy is a measure of how much energy is unavailable for work. The less work a process yields, the greater the entropy, so a closed system that can no longer supply usable energy has reached maximum entropy.
    • (in statistical mechanics) a measure of the randomness of the microscopic constituents of a thermodynamic system. Symbol: S
  2. (in data transmission and information theory) a measure of the loss of information in a transmitted signal or message.
  3. (in cosmology) a hypothetical tendency for the universe to attain a state of maximum homogeneity in which all matter is at a uniform temperature (heat death).
  4. a state of disorder, or a tendency toward such a state; chaos.
  5. a doctrine of inevitable social decline and degeneration.
noun, plural -pies
  1. a thermodynamic quantity that changes in a reversible process by an amount equal to the heat absorbed or emitted divided by the thermodynamic temperature. It is measured in joules per kelvin.
  2. a statistical measure of the disorder of a closed system expressed by S = k log P + c where P is the probability that a particular state of the system exists, k is the Boltzmann constant, and c is another constant
  3. lack of pattern or organization; disorder
  4. a measure of the efficiency of a system, such as a code or language, in transmitting information
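The first sense in the list above — entropy change as heat absorbed or emitted divided by thermodynamic temperature — can be sketched in a few lines. The function name and the latent-heat figure in the example are illustrative assumptions, not part of the definition:

```python
def entropy_change(heat_joules: float, temperature_kelvin: float) -> float:
    """Return delta-S = Q / T for a reversible process at constant
    temperature, in joules per kelvin."""
    if temperature_kelvin <= 0:
        raise ValueError("thermodynamic temperature must be positive")
    return heat_joules / temperature_kelvin

# Melting 1 kg of ice at 273.15 K absorbs roughly 334,000 J of heat,
# so its entropy rises by about 1223 J/K:
print(entropy_change(334_000, 273.15))
```

Heat absorbed counts as positive Q (entropy rises); heat emitted as negative (entropy falls), matching "absorbed or emitted" in the definition.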
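The statistical sense, S = k log P + c, can be sketched the same way. Here log is taken as the natural logarithm, c defaults to zero, and the function name is an assumption for illustration:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant k, in J/K (exact in the SI since 2019)

def statistical_entropy(p: float, c: float = 0.0) -> float:
    """S = k log P + c, with P the probability that a particular
    state of the system exists and log the natural logarithm."""
    return K_B * math.log(p) + c

# With c = 0, a state of probability 1/2 contributes an entropy of
# magnitude k ln 2, on the order of 1e-23 J/K:
print(abs(statistical_entropy(0.5)))  # ~9.57e-24
```

Because k is tiny, entropies of single microscopic states are minuscule; macroscopic entropy values arise from the astronomical number of microstates involved.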
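The information-theory senses — a measure of information lost in transmission, and of a code's efficiency in carrying information — both rest on Shannon entropy, the average information per symbol of a message. A minimal sketch, with the function name chosen here for illustration:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol of the message, in bits."""
    counts = Counter(message)
    n = len(message)
    # Sum -p * log2(p) over each symbol's relative frequency p:
    return sum(-(cnt / n) * math.log2(cnt / n) for cnt in counts.values())

print(shannon_entropy("aaaa"))  # → 0.0 (a constant message carries no information)
print(shannon_entropy("abcd"))  # → 2.0 (four equally likely symbols: 2 bits each)
```

A low-entropy message is predictable and compressible; an efficient code pushes the average bits actually transmitted per symbol down toward this entropy.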