Information

A term for the measure of the improbability of the known state of a system. The more information we have, the more restricted the space of possible actual system states is, relative to some reference distribution of possibilities.
In computers, the basic unit of information is the bit.
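The connection between improbability and bits can be made concrete with Shannon's formulas: the self-information of an event is the negative base-2 logarithm of its probability, and the entropy of a distribution is the expected self-information. The snippet below is a minimal illustration (the function names are ours, not from the source):

```python
import math

def self_information(p: float) -> float:
    """Self-information of an event with probability p, in bits."""
    return -math.log2(p)

def entropy(dist: list[float]) -> float:
    """Shannon entropy of a probability distribution, in bits."""
    return sum(-p * math.log2(p) for p in dist if p > 0)

# A fair coin flip carries exactly one bit of information.
print(self_information(0.5))   # 1.0
# A uniform choice among four states carries two bits.
print(entropy([0.25] * 4))     # 2.0
```

The less probable the observed state, the more information its observation carries, which matches the definition above.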
Fundamental theorems about entropy tell us that global information decreases: the second law of thermodynamics says that the entropy of a closed system can only grow. Locally, however, information can increase, by consuming the free energy available in some disequilibrium; life is such an engine, pumping entropy out of itself into its surroundings.
Note: information has meaning only inside an evaluative context; depending on how abstract or concrete the considered model is, information can thus be either a scientific or a moral concept.
See the Ethics of information.
Page in this topic: Centralization