A recent post inspired me to post this.
This is the best description of entropy and information I've read.
Most of all, it highlights the subjective / relative foundations of these concepts.
Entropy and Information only exist relative to a decision about the set of states an observer cares to distinguish.
It also caused me to change my informal definition of entropy from a negative ("disorder") to a more positive one ("the number of things I might care to know").
The Second Law now tells me that the number of interesting things I don't know about is always increasing!
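For concreteness, the "set of states an observer cares to distinguish" idea can be read as a choice of coarse-graining over microstates; here is a minimal Python sketch of that reading (the eight microstates and the even/odd grouping are assumed purely for illustration):

```python
import math
from collections import defaultdict

def entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Eight equally likely microstates of some toy system.
microstates = {s: 1 / 8 for s in range(8)}

# Observer A distinguishes every microstate: 3 bits of entropy.
h_fine = entropy(microstates.values())

# Observer B only cares whether the state index is even or odd,
# so the eight microstates collapse into two macrostates: 1 bit.
coarse = defaultdict(float)
for s, p in microstates.items():
    coarse[s % 2] += p
h_coarse = entropy(coarse.values())

print(h_fine, h_coarse)  # 3.0 1.0 -- same system, different distinctions
```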
I offer a diverging view.
Information is the removal of uncertainty. If it does not remove uncertainty, it is not information.
Entropy is the existential phenomenon of potential distributing over the infinite manifold of negative potential.
Emergence is a potential outcome greater than the capacity found in the sum of any parts.
Modern humanity’s erroneous extrapolations:
- asserting P>=0 without accounting for the fact that in existential reality 0 is the infinite expanse of cosmic void, so the true mathematical description would be P>=-1
- confusing heat with entropy. Heat is the ultimate universal expression, since heat is a product of all work and all existence is winding down (after all). Entropy directs thermodynamics; thermodynamics is not the extent of entropy.
- entropy is NOT the number of possible states in a system. Entropy is the distribution of potential; the number of states is a boundary condition which uncalculated potential may reconfigure (the “cosmic ray”, or Murphy’s rule of component failure). Existential reality is interference and decay.
- entropy is not “loss”. Loss is the entropy less work achieved.
- this business about “in a closed system” is an example of how brilliant minds lie to themselves. No such thing exists anywhere accessible by Man. Even theoretically, the principles of decay and the “exogenous” influence of some unperceived factor reach into any “contained system” or “modeled system”; one self-deception of the scientist or engineer is to presume these speak for, or on behalf of, reality.
Emergence is the potential (the vector space of some capacity) “created” through some system of dynamics (work). “Some” includes the expressive space of all existential or theoretical reality. All emergent potential is “paid for” by burning available potential of some other kind. In nature, the natural forces induce work in their extremes. In natural systems these design for the “mitigation of uncertainty” [soft-form entropy], aka “intelligence.”
Entropy is the existential phenomenon of potential distributing over negative potential.
Information is the removal of uncertainty. If it does not remove uncertainty, it is not information.
Emergence is a potential outcome greater than the capacity found in the sum of any parts.
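On the “removal of uncertainty” reading, Shannon’s quantities make the point directly: information gained is prior uncertainty minus remaining uncertainty, so a message that removes no uncertainty carries no information. A minimal sketch (the four-outcome prior and the two messages are assumed purely for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Prior belief over four equally likely outcomes: 2 bits of uncertainty.
prior = [0.25, 0.25, 0.25, 0.25]

# A message that rules out two outcomes leaves 1 bit of uncertainty,
# so it carried 1 bit of information.
posterior = [0.5, 0.5, 0.0, 0.0]
print(entropy(prior) - entropy(posterior))  # 1.0

# A message that rules nothing out removes no uncertainty,
# and on this reading carries no information.
print(entropy(prior) - entropy(prior))  # 0.0
```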