He started with Claude Shannon considering the mathematical theory of communication; he then leapt backwards to jungle drums and the redundancy necessary to send information through a very noisy channel, and moved quickly on through writing, dictionaries, Babbage's Analytical Engine, telegraphy and telephony. This brought us back to Shannon, who worked with Alan Turing; we considered bits and Turing machines.
And then the book got more difficult and more magical, because information theory seeped into biology. Genes are messages. In a sense life is all about information transfer. Richard Dawkins entered with 'The Selfish Gene': we are just vehicles for the transmission of genetic information. If the meaning of our lives is merely to transmit information, why does that have to be in the form of DNA? Richard Dawkins also invented the concept of the meme. Shakespeare may have had no great-grandchildren, and thus his genetic inheritance is dead, but his memes have spread around the world: memetically he is the father of us all. And thus life has an alternative meaning to the passing on of genes: it could be about passing on your memes. Life is all about information transfer.
But so is the universe. Shannon realised the correspondence between information and entropy: the forms of the mathematical formulae are the same. As the second law of thermodynamics ('entropy increases') governs the direction of time, so the universe can be interpreted in terms of information. Perhaps, Gleick speculates, the observation that the universe is particulate, made of ever smaller individual particles, is because information is particulate with the irreducible fundamental particle of information being the bit.
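The correspondence Gleick describes can be written out explicitly (my notation, not the book's): Shannon's entropy of a source with symbol probabilities p_i and Gibbs's thermodynamic entropy share the same form, differing only in the base of the logarithm and Boltzmann's constant:

```latex
H = -\sum_i p_i \log_2 p_i \qquad\text{(Shannon entropy, bits)}
```

```latex
S = -k_B \sum_i p_i \ln p_i \qquad\text{(Gibbs entropy, J/K)}
```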
I got lost when we came to quantum entanglement, qubits, and teleportation. Apparently the Einstein-Podolsky-Rosen paradox turned out to describe a real phenomenon.
Then, after I had swum out of my depth, shallow water rescued me with a gallop through blogs and wikis and the sense that we are now being drowned by the information flood.
Wow! What a journey. A wonderful read.
April 2011; 426 pages
Gleick, J. (2011) The Information: A History, a Theory, a Flood. London: Fourth Estate
These are the notes I made as I travelled:
- "The Muses are the daughters of Mnemosyne" (p25)
- "The persistence of writing made it possible to impose structure on what was known about the world" (p36)
- On p53 he tells us that the writer of the first English dictionary, Cawdrey, wrote "wordes in one sentence and words in the next" on his title page but the facsimile of the title page on p52 uses 'words' only.
- The word spell "first meant to speak or to utter. Then it meant to read, slowly, letter by letter" (p53): a link with the concept of a magic spell?
- The concept of feedback, positive or negative, is quintessentially about information (p238)
- Once psychologists had cottoned onto the idea of information they could model perception as a channel carrying information from the outside world to the brain. They started measuring "the likelihood that listeners would hear a word correctly when they knew it was just one of a few alternatives" and "the effect of trying to understand two conversations at once" (p258)
- Whereas physics works with laws, molecular biology is understood in terms of algorithms (p299)
- Samuel Butler claimed a hen is an egg's way of making another egg (p302). Daniel Dennett in 1995 said 'A scholar is just a library's way of making another library' (p303)
- "The history of life begins with the accidental appearance of molecules complex enough to serve as building blocks - replicators. The replicator is an information carrier. It survives and spreads by copying itself. The copies must be coherent and reliable but need not be perfect; on the contrary, for evolution to proceed, errors must appear." Alexander Cairns-Smith suggested that, before DNA, "replicators appeared in sticky layers of clay crystals: complex models of silicate minerals." (p304)
- Ideas have 'spreading power', 'infectivity'. Ideas evolve. (p311) Dawkins meets Sperry meets Gladwell's Tipping Point. The infectivity of ideas is demonstrated by fashions and by viral videos. The evolution of ideas is demonstrated by Chinese Whispers (although this also shows that idea copying is so unreliable that it would be unlikely to lead to evolution in the biological sense).
- Gregory Chaitin defines randomness in terms of algorithm length: the longer the algorithm needed to generate a sequence, the more random the sequence is (and the more information the sequence contains). "Looking for patterns - seeking the order amid chaos - is what scientists do, too." (p332) "This is what science always seeks: a simple theory that accounts for a large set of facts and allows for prediction of events still to come. It is the famous Occam's razor." (pp332-333)
- Matter falling into a black hole contains information (p357). Hawking radiation has zero information (p358). "If the black hole evaporates, where does the information go? According to quantum mechanics, information may never be destroyed", because otherwise the laws of physics are not reversible in time on a microscopic scale (p358). Hawking initially thought that the information escaped into another universe (p358) but later conceded that this does not happen (p359), although I don't quite understand how he proved this.
- Information as entropy implies that thought requires energy although the thermodynamics of computation shows that the energy is only used up during erasure: "Forgetting takes work." (p362)
- "It remains difficult to know when and how much to trust the wisdom of crowds ... to be distinguished from the madness of crowds as chronicled in 1841 by Charles Mackay, who declared that people 'go mad in herds' .... Crowds turn all too quickly into mobs, with their time-honored manifestations: manias, bubbles, lynch mobs, flash mobs, crusades, mass hysteria, herd mentality, goose-stepping, conformity, groupthink - all potentially magnified by network effects and studied under the rubric of information cascades." (p420)
- In 2008 Google's warning system for flu based on web searches for 'flu' "discovered outbreaks a week sooner than the Centers for Disease Control and Prevention" (p421)
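Chaitin's algorithm-length definition of randomness (the note from p332 above) can be felt with a quick experiment. A general-purpose compressor is only a crude stand-in for "shortest generating algorithm", and the sequences here are my own illustrative choices, but the contrast is exactly his point:

```python
import random
import zlib

# An ordered sequence: the short rule "repeat '01' 500 times" generates it,
# so in Chaitin's sense it carries little information.
ordered = b"01" * 500

# A disordered sequence: pseudo-random bytes (seeded for repeatability)
# admit no description much shorter than the sequence itself.
rng = random.Random(42)
disordered = bytes(rng.randrange(256) for _ in range(1000))

# The ordered input shrinks dramatically; the random one barely at all.
print(len(zlib.compress(ordered)))
print(len(zlib.compress(disordered)))
```

The more a sequence resists compression, the closer it is to random, and the more information it holds per symbol.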
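The "forgetting takes work" note (p362) is Landauer's principle: erasing one bit must dissipate at least k_B T ln 2 of energy, while reversible computation can in principle cost nothing. A back-of-envelope sketch (room temperature is my illustrative choice, not the book's):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact since the 2019 SI)
T = 300.0           # an illustrative room temperature in kelvin

# Landauer's principle: minimum energy dissipated when one bit is erased.
E_min = k_B * T * math.log(2)
print(f"Minimum cost of forgetting one bit at {T:.0f} K: {E_min:.2e} J")
```

Roughly 3e-21 joules per bit: tiny, but non-zero, which is why erasure rather than computation itself sets the thermodynamic floor.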
Other great books in this area include:
- Six degrees about small world networks by Duncan Watts
- Sync by Steven Strogatz
- At Home in the Universe by Stuart Kauffman about fitness landscapes
- How Nature Works by Per Bak about sandpiles and self-organized criticality; an excellent explanation of complexity science
- Deep Simplicity by John Gribbin which is a brilliant introduction to this whole field
- Smart Swarm by Peter Miller
- The Information by James Gleick although his Chaos (not reviewed on this blog) is perhaps better
Other books not reviewed on this blog on this topic include:
- The Wisdom of Crowds by James Surowiecki
- Tipping Point by Malcolm Gladwell about fads
- Ubiquity by Mark Buchanan, which is brilliant on fractals and power laws
- Critical Mass by Philip Ball, which is a brilliant explanation of phase changes