Information is uncertainty, surprise, difficulty, and entropy.
—James Gleick
Everything tends toward entropy. Entropy is a measure of disorder and chaos, and it describes the state everything eventually drifts into. Sooner or later a closed system reaches maximum entropy: a state so random that it can no longer produce any meaningful work, because the energy left in it is completely dissipated. In short, it is pretty close to our everyday notion of chaos. Entropy is a concept used in many different fields, such as thermodynamics, physics, and information theory, and it always refers to a measure of disorder or randomness in a system.
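To make the information-theory side of that idea concrete, here is a minimal sketch (the function name and example distributions are my own, not from any particular library) that computes Shannon entropy, the standard measure of how unpredictable a discrete distribution is:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A heavily biased coin is far more predictable, so its entropy is lower.
print(shannon_entropy([0.99, 0.01]))  # ~0.08

# A certain outcome carries no surprise at all: 0 bits.
print(shannon_entropy([1.0]))         # 0.0
```

The more evenly spread (disordered) the probabilities are, the higher the entropy; the more concentrated (ordered) they are, the lower it gets.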