Entropy
The term for the opposite of information. It is a concept discovered fairly recently (in the 19th century) by physicists: a quantity that appeared naturally in the equations of thermodynamics. When physicists interpreted its meaning, they realized that it expresses the amount of information missing about the state of a dynamical system, which, as they increasingly recognize, is what our physical world happens to be.
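As a rough illustration (the usual Boltzmann-Gibbs / Shannon form, not given on this page): for a system whose state is one of several possibilities with probabilities p_i, the entropy is

    S = -k_B \sum_i p_i \ln p_i

which, up to the constant k_B and the choice of logarithm base, is exactly Shannon's measure of the information missing about which state the system is actually in.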
The basic theorems about such dynamical systems (notably the second law of thermodynamics) assert that in a closed system, entropy increases with time; that is, information is lost, and the system evolves toward states that are ever less distinguishable, less specific, and more probable.
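A standard textbook illustration (assuming equally probable microstates, a special case of the formula above): if a closed system can be in any one of W equally likely microstates, its entropy is

    S = k_B \ln W

and as the system evolves toward macrostates compatible with ever more microstates (larger W), the number of bits needed to single out the actual microstate grows; in other words, the information we hold about the system shrinks.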