By Yvette Leung ’14 & Saima Mir ’14, thurj Staff
Where would we be without computers, televisions, phones, or, tracing further back in time, pens and paper? That is, where would we be without information? In his latest bestseller, The Information: A History, A Theory, A Flood, James Gleick journeys through the ancient origins and explosive development of information, information science, and information technology. Our society, our lives, and our universe were embedded in symbolic codes long before sophisticated notions of coding were ever developed. From the DNA that encodes our genetic information to the fundamental particles that make up nebulae in space, everything can be conceived of as an information processor. Looking into the distant past, the art of tonal African drumming unified a continent of diverse languages simply by assigning distinct messages to different tones. From this humble starting point, the information revolution rapidly accelerated, driving the evolution of language from the oral to the written word.
At the very nucleus of this ever-sweeping information phenomenon lies Claude Shannon’s information theory. In Shannon’s hands, information, much like the physical concepts of force and energy, underwent a rebirth: he placed the concept on non-semantic, concretely quantifiable grounds. He popularized its basic unit, the “bit,” or, more technically, the binary digit. It is this small unit that becomes universally encompassing in our modern view; it can describe anything and give rise to everything. Gleick ties Shannon’s wartime cryptanalytic work to earlier developments in communication. In African drum languages and minimalist alphabet systems alike, redundancy (the repetition of information) was critical for comprehension. For Shannon, redundancy solved the problem of communicating over long distances: boosting a signal’s power amplifies noise along with it, whereas building redundancy into the signal allows errors to be corrected at the receiving end. This mirrors the strategy of the African drummers, whose messages travel not through louder drums but through redundant tonal drumbeats.
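Shannon’s tradeoff can be made concrete with a toy model. The sketch below is an illustration of the general idea only, not anything from Gleick’s book: it sends bits through a simulated noisy channel, encoding each bit by simple repetition and decoding by majority vote. The flip probability and repetition factors are arbitrary choices for the demonstration.

```python
import random

def transmit(bits, flip_p, repeat):
    """Send bits over a noisy channel using a repetition code.

    Each bit is repeated `repeat` times; the channel flips each copy
    independently with probability `flip_p`; the receiver decodes by
    majority vote over the copies.
    """
    decoded = []
    for b in bits:
        votes = [b ^ (random.random() < flip_p) for _ in range(repeat)]
        decoded.append(1 if sum(votes) > repeat // 2 else 0)
    return decoded

random.seed(0)
msg = [random.randint(0, 1) for _ in range(10_000)]
for r in (1, 3, 9):
    out = transmit(msg, flip_p=0.1, repeat=r)
    errors = sum(a != b for a, b in zip(msg, out))
    print(f"repeat={r}: error rate {errors / len(msg):.3f}")
```

Adding redundancy, like the drummers’ repeated tonal phrases, drives the error rate down without any increase in signal power.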
Contrary to the layman’s conception that the conveyance of meaning is the objective of communication, Gleick invokes Shannon’s formulation that “the fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point” (222). Gleick is careful, however, to note the true breadth of this view, since this conceptualization of communication was the turning point that allowed for the transmission not only of the spoken and written word but also of other visual and auditory forms, such as art and music. Seemingly impersonal communication and information technology swiftly became critical social constructs.
With a keen eye toward history, Gleick weaves biographical narratives of inventors such as Charles Babbage and Alan Turing into the ongoing thread of information’s role in the world and how its bits came to be. Gleick exposes the apparent paradox that “the written word—the persistent word—[is] a prerequisite for conscious thought as we understand it” (37). Although we may easily imagine that rational thought is required for written language, Gleick states just the opposite: written language is essential in shaping our ability to understand and interpret information. Focusing on the written word, he further portrays the historical development of information as resting not on redundancy alone but also on the statistical structure of language, from which the very idea of the bit was developed. Indeed, James Merrill made a strong point when he reduced the message “If you can read this message, you can get a good job with high pay” to “If u cn rd ths msg u cn gt a gd jb w hi pa,” accidentally stumbling upon an alternative to redundancy. Since the very idea of a bit is formulated from probabilities and the tradeoff between error correction and efficiency, Merrill demonstrated that redundancy can be stripped away to communicate the same information at greater speed. Translated into modern terms, less redundancy means more information and more bits transmitted at a lower price. In fact, much of information’s past boils down to a tension between redundancy and precision.
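The economics of Merrill’s abbreviation can be sketched with a back-of-the-envelope calculation. The snippet below is a rough illustration, not a method from the book: it estimates the empirical character-level entropy of each version of the sentence (lowercased and unpunctuated for simplicity). The abbreviated form is far shorter, so its total estimated bit cost is lower even though each remaining character is less predictable.

```python
from collections import Counter
from math import log2

def entropy_per_char(text):
    """Empirical Shannon entropy, in bits per character, of a string."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

full = "if you can read this message you can get a good job with high pay"
stripped = "if u cn rd ths msg u cn gt a gd jb w hi pa"

for label, text in (("full", full), ("stripped", stripped)):
    h = entropy_per_char(text)
    print(f"{label}: {len(text)} chars, {h:.2f} bits/char, "
          f"~{len(text) * h:.0f} bits total")
```

Stripping the redundant letters leaves a message a reader can still decode, at a fraction of the transmission cost.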
Gleick draws attention to the link between information theory and biology, beginning with Schrödinger’s likening of the genetic code to Morse code. With Watson and Crick’s identification of a potential genetic code, Shannon’s notion of coding became universal. To correct for interfering biological “noise,” the genetic code, like Shannon’s minimalist, noise-remedying codes, is redundant; the pauses of Morse code find their counterpart in the stop codons of genes. Slowly, distinct discoveries began to converge, as information theory evolved from a strictly mathematical and engineering conception into a fundamental premise on which the world turns.
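The redundancy Gleick describes is visible directly in the standard genetic code, where several RNA codons map to the same amino acid. The sketch below uses a hand-picked fragment of the standard codon table (just the four glycine codons and the three stop codons) to show that a mutation in a codon’s third base can leave the encoded amino acid unchanged, a biological analogue of Shannon’s error-tolerant coding.

```python
# A small, hand-picked fragment of the standard genetic code (RNA codons).
# The four glycine codons illustrate redundancy; the three stop codons are
# the "pauses" likened to those of Morse code.
CODON_TABLE = {
    "GGU": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",
    "UAA": "STOP", "UAG": "STOP", "UGA": "STOP",
}

def third_base_variants(codon):
    """All codons obtained by substituting the third base."""
    return [codon[:2] + base for base in "ACGU"]

# Every third-base mutation of GGU still encodes glycine: the "noise" of
# a point mutation is absorbed by the code's redundancy.
for variant in third_base_variants("GGU"):
    print(variant, "->", CODON_TABLE[variant])
```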
Leading us back to the present, Gleick concedes, perhaps to the disappointment of many, that “no form of digital communication has earned more mockery than the service known as Twitter—banality shrink-wrapped” (419). Noting the emerging tension between the seeming coldness of information theory and the human demand for meaning, Gleick remarks, “Infinite possibility is good, not bad. Meaningless disorder is to be challenged, not feared” (419). Information theory, though independent of meaning by construction, is nonetheless tied to it. Expounding on the Twitter example, Gleick notes that what is trite can still be meaningful, as is evident from firsthand accounts of social and political protests halfway around the world. Google, moreover, unites information theory and meaning: its PageRank algorithm cuts through the problem of information overload and banality.
At its finest, Gleick’s work imparts cultural weight to a technical field that is often taken for granted. The Information is not merely a contextualization of information science; it is a transformative view of the Information Age, leaving no dimension of this ongoing era untouched.