Entropy is, by the most popular definition, a measure of the "disorder" of a system. The scientific definition is the number of microscopic states (microstates) that correspond to a given macroscopic state. For example, if you toss 100 coins, there are many different combinations that yield, say, a total of 43 tails. The number of distinct configurations corresponding to that total of 43 tails measures the disorder of the system; formally, the entropy of that state is proportional to the logarithm of this count (Boltzmann's formula, S = k ln W).
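A quick way to make this concrete is to count the configurations directly. Here is a minimal Python sketch (the coin numbers come from the example above; math.comb is the standard-library binomial coefficient):

    import math

    # Number of microstates: ways to choose which 43 of the 100 coins land tails.
    microstates = math.comb(100, 43)  # binomial coefficient C(100, 43)
    print(f"configurations with exactly 43 tails: {microstates:.3e}")

    # Boltzmann entropy in units of k: S/k = ln(W)
    print(f"entropy of that state (in units of k): {math.log(microstates):.2f}")

That single macrostate ("43 tails") corresponds to roughly 6.8 x 10^28 microstates, which is why counting them, rather than listing them, is the only practical option.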

The fascinating thing about entropy is that natural processes always increase the total entropy; in other words, they lead to greater disorder. This is natural simply because there are many more random, disordered configurations than ordered ones for a given state. For example, if you have 100 coins in a box and you shake it, you will most likely get about half heads and half tails, because there are vastly more arrangements that give 50 tails than arrangements that give only 3 tails, as the sketch below shows.
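To see just how lopsided the counts are, this short sketch compares the two cases from the paragraph above (again plain Python, using only the numbers already in the example):

    import math

    even_split = math.comb(100, 50)  # arrangements giving exactly 50 tails
    rare_case = math.comb(100, 3)    # arrangements giving only 3 tails

    print(f"50 tails: {even_split:.3e} arrangements")
    print(f" 3 tails: {rare_case:.3e} arrangements")
    print(f"the even split has ~{even_split / rare_case:.1e} times more arrangements")

The even split owns about 10^23 times more microstates, so a shaken box lands near 50/50 not because of any force pushing it there, but simply because almost all of the possible configurations look like that.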

(see: the second law of thermodynamics)