## Entropy is the opposite of information: a decrease in information corresponds to an increase in entropy. According to the second law of thermodynamics, the entropy of a closed system increases toward a maximum (total disorder); information can be dissipated but not gained.

## A well-structured, ordered situation has lower entropy and needs less information to describe it; conversely, adding information to a system lowers its entropy.

## Entropy is maximal when all possibilities are equally probable and independent of each other. Entropy is minimal when only one possibility can occur.
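Both extremes can be checked numerically with Shannon's formula H = -Σ pᵢ log₂ pᵢ. A minimal Python sketch (the function name is our own, not from the source):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero terms."""
    h = -sum(p * log2(p) for p in probs if p > 0)
    return h + 0.0  # normalize IEEE -0.0 to 0.0

# Maximum entropy: four equally probable outcomes -> log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Minimum entropy: one certain outcome -> 0 bits.
print(shannon_entropy([1.0]))  # 0.0

# Anything in between is strictly less than the uniform maximum.
print(shannon_entropy([0.9, 0.1]) < 1.0)  # True
```

A skewed distribution such as [0.9, 0.1] illustrates the middle ground: some uncertainty remains, but less than one full bit.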
