I just read the New Yorker’s article titled “Claude Shannon, the Father of the Information Age, Turns 1100100.” The evocation of Shannon’s career took me back decades, to the academic year 1967-1968, when I was an eighteen-year-old freshman at the Institut Polytechnique de Conakry (Guinea). I was in Propedeutics, which then designated the first year of the four-year university curriculum. Freshmen fell into three categories:
- Propedeutics A (Maths, Physics)
- Propedeutics B (experimental sciences: chemistry, biology)
- Propedeutics C (literature, linguistics, humanities)
Based on my baccalaureate transcript (Série A) from my high school in Labé (Fuuta-Jalon), I was automatically placed in Propedeutics C. There I took the class taught by the Belgian professor of linguistics, Ms. Claire Van Arenberg. She brilliantly exposed our young minds to Claude Shannon’s concepts and some of their implications. It was all theoretical, of course. But her explanations registered front and center in my mind, and they stuck there, never dimming or fading out.
Fast forward some fifteen years, to January 22, 1982. I arrived at JFK International Airport aboard the regular Pan Am flight from Dakar to New York, on my way to the University of Texas at Austin as an assistant professor and a recipient of a Fulbright-Hays research fellowship in sociolinguistics. Upon settling down in the heart of the Lone Star State, my first shopping trophy was a tablet-size Sinclair computer with 64 KB of RAM. It was a disappointment, so I quickly returned it. Passing over Radio Shack’s Tandy desktop computer, I purchased a 128K Apple IIc with an external monitor, and connected it to a dot-matrix printer and a 9.6 kbit/s modem. The two peripherals fetched hundreds of dollars, but to me they were worth their high price. For in Conakry I had toiled for years as co-publisher of Guinea’s journal Miriya, Revue des sciences économiques et sociales. Preparing each issue was a real pain: armed with a typewriter, a pair of scissors, and a pot of glue, we had to literally cut and paste words and letters during the pre- and post-print phases. Consequently, the minute I saw a full-screen word processor in action in Austin, I was sold. Today, while I no longer have the peripheral devices, I still own the Apple IIc with AppleWorks and its staple applications (word processing, database, spreadsheet). And I can still turn it on and run it…
I find it fascinating that Shannon’s fundamental unit, the bit, also belongs to everyday English. The word predates the digital revolution: a bit is a small piece of something, and the bit was once a coin denomination, a usage that survives in the American expression “two bits” for a quarter-dollar.
The New Yorker’s article pays tribute to Shannon’s creative genius. It unwittingly speaks for me. And it inherently expresses my intellectual debt and deep gratitude to the Father of the Information Age. It is a fitting salute from one of America’s premier journalistic and literary publications. I enjoyed reading it and I wholeheartedly second it.
Tierno S. Bah
Claude Shannon, the Father of the Information Age, Turns 1100100
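The numeral in the title is binary: 1100100 decodes to 100, marking Shannon’s centennial (he was born in 1916; the article appeared in 2016). A one-line check in Python:

```python
# Decode the binary numeral from the article's title.
age = int("1100100", 2)  # parse the string as a base-2 integer
print(age)  # 100
```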
Twelve years ago, Robert McEliece, a mathematician and engineer at Caltech, won the Claude E. Shannon Award, the highest honor in the field of information theory. During his acceptance lecture, at an international symposium in Chicago, he discussed the prize’s namesake, who died in 2001. Someday, McEliece imagined, many millennia in the future, the hundred-and-sixty-sixth edition of the Encyclopedia Galactica—a fictional compendium first conceived by Isaac Asimov—would contain the following biographical note:
Claude Shannon: Born on the planet Earth (Sol III) in the year 1916 A.D. Generally regarded as the father of the information age, he formulated the notion of channel capacity in 1948 A.D. Within several decades, mathematicians and engineers had devised practical ways to communicate reliably at data rates within one per cent of the Shannon limit.
As is sometimes the case with encyclopedias, the crisply worded entry didn’t quite do justice to its subject’s legacy. That humdrum phrase—“channel capacity”—refers to the maximum rate at which data can travel through a given medium without losing integrity. The Shannon limit, as it came to be known, is different for telephone wires than for fibre-optic cables, and, like absolute zero or the speed of light, it is devilishly hard to reach in the real world. But providing a means to compute this limit was perhaps the lesser of Shannon’s great breakthroughs. First and foremost, he introduced the notion that information could be quantified at all. In “A Mathematical Theory of Communication,” his legendary paper from 1948, Shannon proposed that data should be measured in bits—discrete values of zero or one. (He gave credit for the word’s invention to his colleague John Tukey, at what was then Bell Telephone Laboratories, who coined it as a contraction of the phrase “binary digit.”)
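For the canonical noisy channel (the additive-white-Gaussian-noise case, a standard statement not spelled out in the article), the limit Shannon derived takes the form

\[
C = B \log_2\!\left(1 + \frac{S}{N}\right)
\]

where \(C\) is the channel capacity in bits per second, \(B\) the bandwidth in hertz, and \(S/N\) the signal-to-noise power ratio.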
“It would be cheesy to compare him to Einstein,” James Gleick, the author of “The Information,” told me, before submitting to temptation. “Einstein looms large, and rightly so. But we’re not living in the relativity age, we’re living in the information age. It’s Shannon whose fingerprints are on every electronic device we own, every computer screen we gaze into, every means of digital communication. He’s one of these people who so transform the world that, after the transformation, the old world is forgotten.” That old world, Gleick said, treated information as “vague and unimportant,” as something to be relegated to “an information desk at the library.” The new world, Shannon’s world, exalted information; information was everywhere. “He created a whole field from scratch, from the brow of Zeus,” David Forney, an electrical engineer and adjunct professor at M.I.T., said. Almost immediately, the bit became a sensation: scientists tried to measure birdsong with bits, and human speech, and nerve impulses. (In 1956, Shannon wrote a disapproving editorial about this phenomenon, called “The Bandwagon.”)
Although Shannon worked largely with analog technology, he also has some claim as the father of the digital age, whose ancestral ideas date back not only to his 1948 paper but also to his master’s thesis, published a decade earlier. The thesis melded George Boole’s nineteenth-century Boolean algebra (based on the variables true and false, denoted by the binary one and zero) with the relays and switches of electronic circuitry. The computer scientist and sometime historian Herman Goldstine hyperbolically deemed it “one of the most important master’s theses ever written,” arguing that “it changed circuit design from an art to a science.” Neil Sloane, a retired Bell Labs mathematician as well as the co-editor of Shannon’s collected papers and the founder of the On-Line Encyclopedia of Integer Sequences, agreed. “Of course, Shannon’s main work was in communication theory, without which we would still be waiting for telegrams,” Sloane said. But circuit design, he added, seemed to be Shannon’s great love. “He loved little machines. He loved the tinkering.”
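The thesis’s central insight can be sketched in a few lines of Python rather than relays (function names are mine, chosen for illustration): two switches wired in series compute Boolean AND, two wired in parallel compute OR.

```python
# Shannon's mapping from circuits to Boolean algebra, sketched in Python.

def series(a: bool, b: bool) -> bool:
    """Current flows through two switches in series only if both are closed (AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Current flows through two switches in parallel if either is closed (OR)."""
    return a or b

def two_way_light(a: bool, b: bool) -> bool:
    """A hallway two-way light switch: on when exactly one switch is flipped (XOR),
    built only from series/parallel combinations."""
    return parallel(series(a, not b), series(not a, b))

print(two_way_light(True, False))  # True: exactly one switch flipped
print(two_way_light(True, True))   # False: flipping the second switch turns it off
```

Designing such circuits by truth table rather than by trial and error is what Goldstine meant by turning circuit design “from an art to a science.”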
For instance, Shannon built a machine that did arithmetic with Roman numerals, naming it THROBAC I, for Thrifty Roman-Numeral Backward-Looking Computer. He built a flame-throwing trumpet and a rocket-powered Frisbee. He built a chess-playing automaton that, after its opponent moved, made witty remarks. Inspired by the late artificial-intelligence pioneer Marvin Minsky, he designed what was dubbed the Ultimate Machine: flick the switch to “On” and a box opens up; out comes a mechanical hand, which flicks the switch back to “Off” and retreats inside the box. Shannon’s home, in Winchester, Massachusetts (Entropy House, he called it), was full of his gizmos, and his garage contained at least thirty idiosyncratic unicycles—one without pedals, one with a square tire, and a particularly confounding unicycle built for two. Among the questions he sought to answer was, What’s the smallest unicycle anybody could ride? “He had a few that were a little too small,” Elwyn Berlekamp, a professor emeritus of mathematics at Berkeley and a co-author of Shannon’s last paper, told me. Shannon sat on Berlekamp’s thesis committee at M.I.T., and in return he asked Berlekamp to teach him how to juggle with four balls. “He claimed his hands were too small, which was true—they were smaller than most people’s—so he had trouble holding the four balls to start,” Berlekamp said. But Shannon succeeded in mastering the technique, and he pursued further investigations with his Jugglometer. “He was hacking reality,” the digital philosopher Amber Case said.
By 1960, however, like the hand of that sly machine, Shannon had retreated. He no longer participated much in the field that he had created, publishing only rarely. Yet he still tinkered, in the time he might have spent cultivating the big reputation that scientists of his stature tend to seek. In 1973, the Institute of Electrical and Electronics Engineers christened the Shannon Award by bestowing it on the man himself, at the International Symposium on Information Theory in Ashkelon, Israel. Shannon had a bad case of nerves, but he pulled himself together and delivered a fine lecture on feedback, then dropped off the scene again. In 1985, at the International Symposium in Brighton, England, the Shannon Award went to the University of Southern California’s Solomon Golomb. As the story goes, Golomb began his lecture by recounting a terrifying nightmare from the night before: he’d dreamed that he was about to deliver his presentation, and who should turn up in the front row but Claude Shannon. And then, there before Golomb in the flesh, and in the front row, was Shannon. His reappearance (including a bit of juggling at the banquet) was the talk of the symposium, but he never attended again.
The New Yorker