'The Information' by James Gleick pieces together a narrative out of the history of humans and science to present the idea of 'Information' as it has come to be in the scientific community today and how it has been transforming the world.
The Information
James Gleick
Vintage Books - Random House
ISBN 978-1-4000-9623-7
If someone told you that the sun would rise in the east and set in the west tomorrow, you would just shrug at this trite fact. Why? Because it is inconsequential to you. But if somebody told you that a certain politician holding public office owned his mistake in public, apologized for it and, against all expectations, set it right rather than passing the soiled baton to his successor, it might pique your curiosity. Why is the first uninteresting and the second intriguing? Because the second piece of 'news' is unprecedented; it carries something you didn't know. For all we know, an honest politician might not exist; only honest politics does. Or, simply, it contains more information for you.
While the layperson is often satisfied with imprecise answers - for example, "How hot is it?" "Very hot!" (the implication is in the articulation, tone, and gesticulation) - scientists are dissatisfied with the vagueness of such expressions. Many of them asked questions like: how much information is there in it? How does one measure information? And... what is information? A whole new branch of applied mathematics (engineering) named 'Information Theory' sprang forth to answer these questions - mostly due to the work of one person: Claude Shannon. Soon, myriad other streams assimilated its concepts in their own ways and drew fresh breath. The world has been transformed.
The idea of information is coupled closely with the question of its transmission. James Gleick starts off with what one might consider an obscure and primitive but robust method of communication - so primitive that the civilized world couldn't make sense of it for a long time - the talking drums of Africa. Though primitive, the talking drums had unknowingly reached into an abstract domain of communication that the civilized world of that time had not discovered: the encoding of a spoken language into a readily transferable form while removing much of its ambiguity.
From there, it is a whirlwind tour through the ages - though not a chronological one - from cuneiform, sympathetic magnetic needles and the alphabet, to the world's first dictionary, the Analytical Engine of Charles Babbage, the telegraph, Boole's logic, Morse code, telephones, Shannon's information theory and Turing's universal machine, Kolmogorov and his study of probability, quantum cryptography, and the code that makes us what we are - DNA. The focus is less on the persons and more on the events that led them to their contributions.
He attempts a herculean task as he tries to connect the dots through the ages. In doing so, he sheds some light on the kind of world that existed in those times and on how technology sometimes took small steps, and sometimes giant leaps, thanks to the work of a few people far ahead of their time. The technology that reached the masses is the technology that eventually evolved; and, owing to its acceptance, that technology changed society itself. One of the first instances was the printing press; then came telegraphy. The press inadvertently helped crystallize the written language and gave dictionaries the reputation and task they bear to this day, but it also made the language more sternly rigid. When the telegraph was introduced, people thought and claimed that it was the pinnacle of all advancement - so much so that news agencies named themselves after the very word: The Telegraph. The telephone disruptively wiped out the telegraph, and today wireless communication and the powerful pocket-sized computers we call cell phones are doing the same to their fixed-line counterparts.
He talks about memes as the abstract sibling of the gene. He traces the origins of the six degrees of separation to an age-old meme that has kept reintroducing itself in different forms from time to time. He sees memes as a means to virality just as virality is a means for memes to replicate - much as genes are alleged to do through natural selection.
The book touches upon familiar as well as unfamiliar personalities related by their works, dug up from the annals for readers to appreciate the massive collaboration across time and how giants have always stood on the shoulders of those who came before. Boole built upon what philosophers had made of logic; Shannon melded Boole's logic with electrical circuitry. Shannon, in turn, built upon the work of Nyquist and Hartley and added his own life's work, on which we now stand and tease out further details.
It pays homage to Charles Babbage and his Difference Engine, which - together with inspiration from Jacquard's loom - laid the foundation for the Analytical Engine (which was never completely built), and to how the prodigious Ada Lovelace delved mathematically into what would later be called 'algorithms' and 'programming'. Then there is John Wilkins (later a founder of the Royal Society), who had a deep insight into the abstract. In him, one sees a beautiful blend of the religious and the mathematical.
..he expressed the problem (of communication) this way: "How a Man may with the greatest Swiftness and Speed, discover his Intentions to one that is far distant from him."..."There is nothing (we say) so swift as Thought," he noted. As a clergyman, he observed that the swiftest motion of all must belong to angels and spirits. If only a man could send an angel on an errand, he could dispatch business at any distance... No wonder, Wilkins wrote, that angels are called messengers. As a mathematician, he considered the problem from another side. He set out to determine how a restricted set of symbols - perhaps just two, three, or five - might be made to stand for a whole alphabet. They would have to be used in combination...Two symbols. In groups of five. "Yield thirty two Differences."... Wilkins was reaching for a conception of information in its purest, most general form. Writing was only a special case: "For in the general we must note, that whatever is capable of a competent Difference, perceptible to any Sense, may be a sufficient Means whereby to express the Cogitations." A difference could be "two Bells of different Notes"; or "any Object of Sight, whether Flame, Smoak, &c."; or trumpets, cannons, or drums. Any difference meant a binary choice. ...Here, in this arcane and anonymous treatise of 1641, the essential idea of information theory poked to the surface of human thought, saw its shadow, and disappeared again for three hundred years.
From a vantage point many centuries in the future, we readily recognize this as binary encoding; yet, while the concept is just as valid for organisms, we still struggle to understand how the body carries out such messaging through differences among its own elements. A legacy that we have barely touched!
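Wilkins's arithmetic is easy to restate in modern terms: two symbols in groups of five yield 2^5 = 32 "Differences", enough to cover a 26-letter alphabet. A minimal sketch of such a five-bit encoding (my own illustration, not from the book; the particular codebook is arbitrary):

```python
from itertools import product
from string import ascii_lowercase

# Two symbols in groups of five "yield thirty two Differences": 2**5 = 32.
codes = ["".join(bits) for bits in product("01", repeat=5)]
assert len(codes) == 32

# Assign the first 26 code groups to the letters a..z.
codebook = dict(zip(ascii_lowercase, codes))  # 'a' -> '00000', 'b' -> '00001', ...

def encode(word):
    """Spell a word as a stream of five-bit groups."""
    return "".join(codebook[ch] for ch in word)

print(encode("wilkins"))  # 7 letters -> 35 bits
```

Five-bit codes of exactly this kind were later used in practice (Baudot's telegraph code, for instance), which is part of why Wilkins's 1641 observation reads as so prescient.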
Gleick explains how entropy was borrowed from thermodynamics and applied to communication, and how the two are deeply related through Maxwell's demon. This was something I had only heard of before but had never understood why.
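Shannon made the measure precise: the entropy of a source, H = -Σ p·log₂(p), counts the average number of bits per symbol. A minimal sketch (my own illustration, not from the book's text):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one full bit per toss...
print(entropy([0.5, 0.5]))    # 1.0
# ...while a heavily biased coin carries far less: its outcome is
# mostly predictable, so each toss tells you almost nothing new.
print(entropy([0.99, 0.01]))  # roughly 0.08
```

This is the same intuition as the honest-politician example earlier: the improbable message is the informative one.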
He flirts with how various domains have taken note of information theory and applied it in new ways that Shannon himself was hesitant to endorse. Biology took it in and redefined how organisms are created - the code of life - a new way to look at DNA: not just a chemical soup, but something abstracted out of the physical - information, algorithm, recipe - the gene. As Sydney Brenner put it, the study of such systems has to be an amalgamation of the physical and the abstract:
One would like to be able to fuse the two - to be able to move between the molecular hardware and the logical software of how it's all organized, without feeling they are different sciences.
He flirts with Dawkins's idea of DNA being the master and everything else serving its ends - "They are in you and in me; they created us, body and mind; and their preservation is the ultimate rationale for our existence." Just like a hen being the egg's way of creating another egg, or "A scholar is the library's way of making another library". Information theory and its implications moved John Archibald Wheeler enough to claim 'It from Bit'.
With clever writing, it is easy for the reader to believe in the narration (and hence, also fall for the narrative fallacy). So, when the author says of Shannon that "He had more than his share of playfulness, and as a child he had a large portion of loneliness too, which along with his tinkerer's ingenuity helped motivate his barbed-wire telegraph", there might be an element of sensationalization in it. How wordplay may affect the reader is better understood in contrasts. (I discuss one here.)
Today, the world is awash with bits. As we continue the march into the future, we keep adding tomes to the mythical Library of Babel - the library that "enshrines all the information. Yet no knowledge can be discovered here, precisely because all knowledge is there, shelved side by side with all falsehood. In the mirrored galleries, on the countless shelves, can be found everything and nothing. There can be no more perfect case of information glut". This is hardly deniable as fake news and propaganda flood the internet. And yet, we are always looking for 'a bit more information' - so much so that it has propagated another meme:
Never before has a generation so diligently recorded themselves accomplishing so little.
Despite this, he ends on an optimistic note. He puts faith in our knack for survival and for finding meaning, even when we are terrible at it.
I would like to think of information theory and the language of bits as another language - a systematic and causal development not far removed from how other languages have evolved, with a systematic vocabulary and grammar - a narrative which none of my professors professed or seemed to possess. Through such a book, I hoped to understand my craft a little better and to get better at the craft itself. It was with this intent that I had tried to find a book on the history of information theory. While not exactly that, this book was revelatory and transformative in many respects. For one, it reformed my stand on language.
Second, I am not very adept at the abstract. So, while I understand the value of 'x' in mathematics, I find it pointless to deal with it merely for the sake of finding 'x'. I am more of a visualizer: I would rather take the abstract into a physical or conceptual realm where it applies, watch it work, and then return it to the abstract. This is also how I tend to approach problems (and I think most people do): map a problem from an unfamiliar field into one we are comfortable with, solve it there, and place the solution back into its original domain. This, I think, is what leads to metaphors and comparative examples. But this book helps one see why dealing explicitly in the abstract can free us from the cruft of the physical.
Then come De Morgan and Boole, who invigorate the field of logic, until then the dominion of philosophers. "Until now logic had belonged to philosophy. Boole was claiming possession on behalf of mathematics. In doing so, he devised a new form of encoding."
The symbols were like little capsules, protecting delicate cargo from the wind and fog of everyday communication. How much safer to write:
1-x = y(1-z) + z(1-y) + (1-y)(1-z)
than the real-language proposition for which, in a typical Boolean example, it stood:
Unclean beasts are all which divide the hoof without chewing the cud, all which chew the cud without dividing the hoof, and all which neither divide the hoof nor chew the cud.
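Boole's arithmetic of 0s and 1s can be checked mechanically, which is precisely its charm. A minimal sketch (my own illustration, not from the book): treating 1 as true, 0 as false, multiplication as AND, and (1 - v) as NOT, the symbolic capsule above collapses to NOT(y AND z) - a beast is unclean unless it both divides the hoof and chews the cud:

```python
from itertools import product

# Boole's encoding: 1 is "true", 0 is "false"; multiplication is AND,
# and (1 - v) is NOT.  Let y = "divides the hoof", z = "chews the cud",
# and x = "clean", so (1 - x) is "unclean".
for y, z in product((0, 1), repeat=2):
    unclean = y * (1 - z) + z * (1 - y) + (1 - y) * (1 - z)
    # The right-hand side reduces to NOT(y AND z): unclean unless the
    # beast both divides the hoof and chews the cud.
    assert unclean == 1 - y * z
print("identity holds for all four cases")
```

Four lines of arithmetic verify what the "real-language proposition" takes a paragraph of careful reading to untangle - the cargo arrives intact.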
The book offers a very succinct roundup of a field that took the world by storm. It is well written, though I would not assume it to be an easy read for the uninitiated - particularly the parts on Shannon's information theory and the theory of random numbers. While James Gleick has made an effort to break these down into simple terms, the background needed to follow the concepts beyond a certain point requires more academic groundwork. For those already in the field, the book serves as a wonderful family history.