An introduction to information theory and entropy


Tom Carter



http://cogs.csustan.edu/~tom/SFI-CSSS

Complex Systems Summer School

June, 2002

Our general topics:

The quotes

Science, wisdom, and counting

``Science is organized knowledge. Wisdom is organized life.''

- Immanuel Kant

``My own suspicion is that the universe is not only stranger than we suppose, but stranger than we can suppose.''

- J. B. S. Haldane

``Not everything that can be counted counts, and not everything that counts can be counted.''

- Albert Einstein (1879-1955)

``The laws of probability, so true in general, so fallacious in particular.''

- Edward Gibbon

Measuring complexity

Being different - or random

``The man who follows the crowd will usually get no further than the crowd. The man who walks alone is likely to find himself in places no one has ever been before. Creativity in living is not without its attendant difficulties, for peculiarity breeds contempt. And the unfortunate thing about being ahead of your time is that when people finally realize you were right, they'll say it was obvious all along. You have two choices in life: You can dissolve into the mainstream, or you can be distinct. To be distinct is to be different. To be different, you must strive to be what no one else but you can be.''

- Alan Ashley-Pitt

``Anyone who considers arithmetical methods of producing random digits is, of course, in a state of sin.''

- John von Neumann (1903-1957)
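Von Neumann's quip is about pseudo-random number generators: deterministic arithmetic that merely imitates randomness. As a minimal illustration (not from the original notes; the constants are the classic Numerical Recipes choices), a linear congruential generator in Python:

    # Linear congruential generator: x_{n+1} = (a*x_n + c) mod m.
    # Entirely deterministic arithmetic -- von Neumann's "state of sin".
    def lcg(seed, a=1664525, c=1013904223, m=2**32):
        x = seed
        while True:
            x = (a * x + c) % m
            yield x / m  # scale to [0, 1)

    gen = lcg(42)
    print([round(next(gen), 3) for _ in range(5)])

The same seed always reproduces the same "random" sequence, which is exactly von Neumann's point.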

Some probability background

Surprise, information, and miracles

``The opposite of a correct statement is a false statement. The opposite of a profound truth may well be another profound truth.''

- Niels Bohr (1885-1962)

``I heard someone tried the monkeys-on-typewriters bit trying for the plays of W. Shakespeare, but all they got was the collected works of Francis Bacon.''

- Bill Hirst

``There are only two ways to live your life. One is as though nothing is a miracle. The other is as though everything is a miracle.''

- Albert Einstein (1879-1955)

Basics of information theory

Information (and hope)

``In Cyberspace, the First Amendment is a local ordinance.''

- John Perry Barlow

``Groundless hope, like unconditional love, is the only kind worth having.''

- John Perry Barlow

``The most interesting facts are those which can be used several times, those which have a chance of recurring. ... Which, then, are the facts that have a chance of recurring? In the first place, simple facts.''

- H. Poincaré, 1908

Some entropy theory

Several questions probably come to mind at this point ...

H (or S) for Entropy

``The enthalpy is [often] written U. V is the volume, and Z is the partition function. P and Q are the position and momentum of a particle. R is the gas constant, and of course T is temperature. W is the number of ways of configuring our system (the number of states), and we have to keep X and Y in case we need more variables. Going back to the first half of the alphabet, A, F, and G are all different kinds of free energies (the last named for Gibbs). B is a virial coefficient or a magnetic field. I will be used as a symbol for information; J and L are angular momenta. K is Kelvin, which is the proper unit of T. M is magnetization, and N is a number, possibly Avogadro's, and O is too easily confused with 0. This leaves S ...'' and H. In Spikes they also eliminate H (e.g., as the Hamiltonian). I, on the other hand, along with Shannon and others, prefer to honor Hartley. Thus, H for entropy ...
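Whatever letter we settle on, the quantity itself is the Shannon entropy of a probability distribution p = (p_1, ..., p_n):

    H(p) = - \sum_i p_i \log_2 p_i    (measured in bits)

A minimal sketch in Python (the function name is mine, not from the notes):

    import math

    def entropy(p):
        """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits.
        Terms with p_i = 0 contribute nothing, since x log x -> 0."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
    print(entropy([1.0, 0.0]))   # 0.0 bits: no uncertainty at all
    print(entropy([0.25] * 4))   # 2.0 bits: uniform over four outcomes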

The Gibbs inequality
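The inequality in question: for any two probability distributions p and q over the same set of outcomes,

    \sum_i p_i \log_2 \frac{p_i}{q_i} \ge 0,

with equality if and only if p_i = q_i for all i. Equivalently, - \sum_i p_i \log_2 p_i \le - \sum_i p_i \log_2 q_i: the entropy of p is bounded above by the cross term with any other distribution q. (Stated here for orientation; the proof follows from \ln x \le x - 1.)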

We ought to note several things.

Thermodynamics

``A theory is the more impressive the greater the simplicity of its premises is, the more different kinds of things it relates, and the more extended is its area of applicability. Therefore the deep impression which classical thermodynamics made upon me. It is the only physical theory of universal content concerning which I am convinced that, within the framework of the applicability of its basic concepts, it will never be overthrown (for the special attention of those who are skeptics on principle).''

- A. Einstein, 1946

``Thermodynamics would hardly exist as a profitable discipline if it were not that the natural limit to the size of so many types of instruments which we now make in the laboratory falls in the region in which the measurements are still smooth.''

- P. W. Bridgman, 1941

A simple physical example (gases)

There are several ways to think about this example.

Language, and putting things together

``An essential distinction between language and experience is that language separates out from the living matrix little bundles and freezes them; in doing this it produces something totally unlike experience, but nevertheless useful.''

- P. W. Bridgman, 1936

``One is led to a new notion of unbroken wholeness which denies the classical analyzability of the world into separately and independently existing parts. The inseparable quantum interconnectedness of the whole universe is the fundamental reality.''

- David Bohm

Shannon's communication theory

[Figure: Shannon's communication model]
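The model runs: information source → encoder/transmitter → channel (where noise enters) → receiver/decoder → destination. The simplest nontrivial noisy channel is the binary symmetric channel, which flips each transmitted bit independently with probability f. A minimal Python sketch (names and numbers are my own illustration):

    import random

    def bsc(bits, f, rng=random.Random(0)):
        """Binary symmetric channel: flip each bit with probability f."""
        return [b ^ (rng.random() < f) for b in bits]

    sent = [1, 0, 1, 1, 0, 0, 1, 0]
    print(bsc(sent, f=0.1))  # a few bits may arrive flipped

Its capacity is C = 1 - H_2(f) bits per channel use, where H_2 is the binary entropy function.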

Tools

``It is a recurring experience of scientific progress that what was yesterday an object of study, of interest in its own right, becomes today something to be taken for granted, something understood and reliable, something known and familiar - a tool for further research and discovery.''

-J. R. Oppenheimer, 1953

``Nature uses only the longest threads to weave her patterns, so that each small piece of her fabric reveals the organization of the entire tapestry.''

- Richard Feynman

Application to Biology (analyzing genomes)

[Figure: Entropy of E. coli and of a random sequence; window 6561, slide-step 81]

[Figure: Fourier transform of the E. coli entropy; window 6561, slide-step 81]

[Figure: Fourier transform of the random-sequence entropy; window 6561, slide-step 81]
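The plots were produced by sliding a window of length 6561 along the sequence in steps of 81 and estimating an entropy within each window. A sketch of that kind of scan, using a simple plug-in frequency estimate (the original computation may have differed in detail):

    from collections import Counter
    import math

    def windowed_entropy(seq, window=6561, step=81):
        """Slide a window along seq; estimate per-symbol entropy in each
        window from the observed symbol frequencies."""
        out = []
        for start in range(0, len(seq) - window + 1, step):
            counts = Counter(seq[start:start + window])
            out.append(-sum((c / window) * math.log2(c / window)
                            for c in counts.values()))
        return out

    # A uniform four-letter sequence sits near 2 bits/symbol:
    print(windowed_entropy("ACGT" * 5000)[:3])

A genome with local structure shows dips below the 2-bit ceiling; a random sequence stays flat near it, which is what the figures above contrast.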

Application to Physics (lasers)

Some other measures

Some additional material

What follows are some additional examples, and expanded discussion of some topics ...

Examples using Bayes' Theorem
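Bayes' theorem relates a conditional probability to its reverse: P(A|B) = P(B|A) P(A) / P(B). A standard worked example (the numbers are illustrative, not from the original notes): a diagnostic test for a condition with 1% prevalence, 99% sensitivity, and a 5% false-positive rate:

    # P(disease | positive test) via Bayes' theorem.
    p_d = 0.01        # prior: prevalence of the condition
    p_pos_d = 0.99    # P(positive | disease): sensitivity
    p_pos_nd = 0.05   # P(positive | no disease): false-positive rate

    p_pos = p_pos_d * p_d + p_pos_nd * (1 - p_d)  # total probability
    p_d_pos = p_pos_d * p_d / p_pos               # posterior
    print(round(p_d_pos, 3))  # ~0.167: most positives are false positives

The surprise that a positive result still leaves only about a one-in-six chance of disease is the classic illustration of how strongly the prior matters.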

Analog channels
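For a continuous (analog) channel of bandwidth W hertz corrupted by additive white Gaussian noise, Shannon's capacity formula is

    C = W \log_2 \left( 1 + \frac{S}{N} \right)

bits per second, where S/N is the signal-to-noise power ratio. This is the analog counterpart of the discrete channel capacity above.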


