At a technology conference last year, Google’s outgoing CEO Eric Schmidt tried to put our current “information explosion” into historical perspective. Today, he said, we create as much information in 48 hours—five billion gigabytes worth—as was created “between the dawn of civilization and 2003.” It’s an astonishing comparison, and it seems to illuminate something important about the times we live in. But the harder you look at Schmidt’s numbers, the fuzzier they become. What does it mean to create information? When we measure information, what exactly are we measuring? What the heck is “information,” anyway?

None of those questions, it turns out, is easy to answer. Wikipedia isn’t much help. “As a concept,” it tells us, “information has many meanings,” which are “closely related to notions of constraint, communication, control, data, form, instruction, knowledge, meaning, mental stimulus, pattern, perception, and representation.” It might have been simpler to list the notions that information isn’t related to. Dictionaries are a little clearer. They suggest that information is more or less synonymous with knowledge. But that definition no longer seems sufficient. What does a gigabyte of knowledge look like? The fact is, although we live in an information age, we don’t really know what information even means.
Into the breach steps the gifted science writer James Gleick. In his formidable new book, The Information, Gleick explains how we’ve progressed from seeing information as the expression of human thought and emotion to looking at it as a commodity that can be processed, like wheat or plutonium. It’s a long, complicated, and important story, beginning with tribal drummers and ending with quantum physics, and in Gleick’s hands it’s also a mesmerizing one. Wisely, he avoids getting bogged down in the arcane formulas and equations of information theory—though (fair warning) there are quite a few of those—but rather situates his tale in the remarkable lives and discoveries of a series of brilliant mathematicians, logicians, and engineers.
There’s the eccentric English polymath Charles Babbage, who in the middle of the 19th century designed an elaborate calculating machine, the Analytical Engine, which anticipated the modern computer. There’s Ada Byron, Countess of Lovelace, the poet’s daughter, who, inspired by Babbage’s work, came up with the idea of the software algorithm. There’s the great philosopher-mathematician Bertrand Russell, who imagined that the language of mathematics would provide a perfect system of logic. And there’s the troubled Austrian theorist Kurt Gödel, who dismantled Russell’s dream by showing that mathematics is as prone to paradoxes and mysteries as any other language.
The star of Gleick’s story is a shy, gangly Midwesterner named Claude Shannon. As a boy growing up in a northern Michigan town in the 1920s, Shannon became obsessed with the mechanics of transmitting information. He turned a barbed-wire fence near his home into a makeshift telegraph system, using it to exchange coded messages with a friend a half mile away. After earning a doctorate from MIT, he joined Bell Labs as a researcher. In 1948, the same year Bell Labs unveiled the transistor, Shannon published a groundbreaking monograph titled “A Mathematical Theory of Communication.” The paper was, as Gleick writes, “a fulcrum around which the world began to turn.”
Human beings, Shannon saw, communicate through codes—the strings of letters that form words and sentences, the dots and dashes of telegraph messages, the patterns of electrical waves flowing down telephone lines. Information is a logical arrangement of symbols, and those symbols, regardless of their meaning, can be translated into the symbols of mathematics. Building on that insight, Shannon showed that information can be quantified. He adopted the term “bit”—indicating a single binary choice: yes or no, on or off, one or zero—as the fundamental unit of information, crediting the coinage to his Bell Labs colleague John Tukey. He realized, as well, that there is a great deal of redundant information—extraneous bits—in human communication. The message “Where are you?” can be boiled down to “whr r u?” and remain understandable to its recipient. Prune away the redundancy, through mathematical analysis, and you can transmit more information more quickly and at a much lower cost.
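To see how the counting works, consider a rough sketch (an illustration of the standard entropy formula, not an example drawn from Gleick’s book): the quantity H = -Σ p log₂ p gives the average number of bits each symbol of a message carries, and the stripped-down “whr r u?” needs fewer bits in total than the full question.

```python
# Illustrative sketch of Shannon's measure of information: entropy,
# H = -sum(p * log2(p)), is the average number of bits per symbol.
import math
from collections import Counter

def entropy_bits_per_char(message: str) -> float:
    """Shannon entropy of the message's character distribution, in bits per character."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

for text in ["where are you?", "whr r u?"]:
    per_char = entropy_bits_per_char(text)
    print(f"{text!r}: {per_char:.2f} bits/char, ~{per_char * len(text):.0f} bits in total")
```

Run on those two messages, the pruned version comes out at less than half the bits of the original, which is exactly the kind of saving Shannon’s analysis promised.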
The impact of Shannon’s insights would be hard to overstate. They enabled phone companies to route more conversations through their wires, dramatically reducing the cost of communication and turning the telephone into a universal appliance. They paved the way for high-speed digital computers, software programming, mass data storage, and the Internet. Compression algorithms derived from Shannon’s work have become essential to modern media; they squeeze the music we listen to, the films we watch, the words we read. When you send a tweet, Google a keyword, or stream a Netflix movie, you are harvesting what Shannon sowed.
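The saving is easy to observe with any off-the-shelf lossless compressor. Here is a brief sketch using Python’s standard zlib module (my choice for illustration; no particular codec is named above): text full of repetition shrinks dramatically, while random bytes, which contain no redundancy to prune, barely shrink at all.

```python
# Redundancy is what a lossless compressor removes: repetitive text
# compresses well, while random bytes (no redundancy) hardly compress.
import os
import zlib

redundant = b"where are you? where are you? where are you? " * 20
incompressible = os.urandom(len(redundant))  # random bytes of the same length

for label, data in [("repetitive text", redundant), ("random bytes", incompressible)]:
    compressed = zlib.compress(data, 9)  # level 9 = maximum compression
    print(f"{label}: {len(data)} bytes -> {len(compressed)} bytes")
```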
But information theory turned out to have applications far removed from communications systems. When, in the early 1950s, James Watson and Francis Crick discovered that genetic information was transmitted through a four-letter code—the nucleotide bases designated A, C, G, and T—biologists and geneticists began to draw on Shannon’s theory to decipher the secrets of life. Physicists, too, started to sense that the matter of the universe may be nothing more than the physical manifestation of information, that the most fundamental particles may be carriers and transmitters of messages. The bit, Gleick reports, could well turn out to be the basic unit of existence. The entire universe may be nothing more than “a cosmic information-processing machine.”
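The arithmetic behind that borrowing is simple. Treating the four bases as four equally likely symbols (an assumption made here purely for illustration), each one carries log₂ 4 = 2 bits, so a stretch of DNA can be packed at two bits per letter, as in this hypothetical sketch:

```python
# Hypothetical illustration: with a four-letter alphabet, each base
# carries log2(4) = 2 bits, so DNA packs at two bits per nucleotide.
import math

BASES = "ACGT"
BITS_PER_BASE = int(math.log2(len(BASES)))  # = 2

def pack(sequence: str) -> int:
    """Pack a DNA string into an integer, two bits per base."""
    value = 0
    for base in sequence:
        value = (value << BITS_PER_BASE) | BASES.index(base)
    return value

def unpack(value: int, length: int) -> str:
    """Recover the original sequence from the packed integer."""
    out = []
    for _ in range(length):
        out.append(BASES[value & 0b11])  # low two bits give the last base
        value >>= BITS_PER_BASE
    return "".join(reversed(out))

seq = "GATTACA"
packed = pack(seq)
print(f"{seq} -> {len(seq) * BITS_PER_BASE} bits, value {packed:b}")
print("round trip:", unpack(packed, len(seq)))
```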
Gleick is at his best when he’s looking backward. As his story moves into the present day, it starts to sputter. Vignettes on memes, domain names, Wikipedia, and search engines feel scattered and familiar. And when he delves into the role of information theory in quantum mechanics, where qubits take the place of bits, his explanatory powers slacken. In describing what happens to information trapped in a black hole, Gleick refers to how one physicist “applied a formalism of quantum indeterminacy—the ‘sum of histories’ path integrals of Richard Feynman—to the very topology of spacetime and declared, in effect, that black holes are never unambiguously black.” Though I have faith that that means something, I can’t prove it.
As a celebration of human ingenuity, The Information is a deeply hopeful book. But it ends on an ambivalent note. The mathematical analysis of information, Gleick points out, entails the “ruthless sacrifice” of meaning, the very thing that “gives information its value and its purpose.” To the number-crunchers and code-wranglers who design our world-engirdling information networks, a message’s meaning is beside the point. A bit is a bit is a bit. As Shannon himself dryly noted, meaning is “irrelevant to the engineering problem.” We hear an echo of that idea in Eric Schmidt’s suggestion that centuries of culture can be compressed into a few billion gigabytes of data.
Even some of Shannon’s contemporaries expressed fears that his theories might end up warping our understanding of knowledge and creativity. The physicist Heinz von Foerster worried that, in separating meaning from message, Shannon risked reducing communication to a series of “beep beeps.” Information, he argued, can only be understood as a product of the human search for meaning—it resides not “in the beeps” but in the mind. Von Foerster’s warning is more important now than ever. The danger in taking a mathematical view of information, with its stress on maximizing the speed of communication, is that it encourages us to value efficiency over expressiveness, quantity over quality. What information theorists call redundancy, it’s worth remembering, is also the stuff of poetry.
Nicholas Carr is the author, most recently, of The Shallows: What the Internet Is Doing to Our Brains.