



entropy

noun
- (communication theory) a numerical measure of the uncertainty of an outcome; "the signal contained thousands of bits of information"
  Synonyms: information, selective information
- (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work; "entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity"
  Synonyms: randomness, S
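The communication-theory sense is Shannon's entropy, the expected information content of a discrete distribution, H = -Σ p log₂ p, measured in bits. A minimal sketch (the function name is illustrative):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one bit of uncertainty per toss.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
```

A certain outcome (probability 1) has zero entropy, and entropy is maximized by the uniform distribution, matching the intuition of entropy as uncertainty.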