Derbtown

From Turing to Twitter

November 28, 2013

Ceruzzi, Paul E. Computing: A Concise History. Cambridge, MA: The MIT Press (Essential Knowledge Series), 2012.

This time last year all I was hearing about was MOOCs—Massive Open Online Courses, in which university-level instruction, sometimes by big-name lecturers, is provided free over the Internet to anyone who wants it. Some visionaries were talking about MOOCs eventually bankrupting traditional universities.

Apparently that’s not going to happen. There is a niche for MOOCs, but it’s much smaller than advertised. The big players are still in there, though, and Massachusetts Institute of Technology is the biggest. MIT has been putting undergraduate courses online since 2002, and last year it partnered with Harvard, Berkeley, and other universities in a full-scale MOOC venture named edX.

The handbooks in the MIT Press Essential Knowledge Series have a MOOCish look to them, and I’m guessing they’re meant as supplementary reading for MOOCs.

That, I think, is the context for Paul Ceruzzi’s history of computing. The main topic is covered in a brisk 150 pages or so of plain text, illustrated with a scattering of black-and-white photographs and diagrams. Even this book’s cover is all in black, white, and gray, though other volumes in the series are a tad more colorfully bound. The book, which comes in paperback and e-book, is modestly priced.

The problem for anyone writing a history of computing is, as Ceruzzi says, that “New developments transform the field while one is writing, thus rendering obsolete any attempt to construct a coherent narrative.” Five years ago almost no one had heard of Twitter; now it seems to be a cornerstone of our civilization.

Ceruzzi tackles this problem by organizing his material around four big themes that have dominated the history of computing so far and should, he says, be manifest in future developments.

The first big theme is the digital paradigm. We forget how new this is. Within living memory the boldest new technologies were all analog: radio, movies, vinyl disks, the slide rule, then radar and TV. Digital information—Morse code, the abacus—was old hat. The unexpected triumph of the digital paradigm across the past seven decades is a key part of this story.

Second is the convergence of different technologies for communication, calculation, data storage, and the control of operations. Smart phones, in which have converged the camera, phonograph, computer, radio, and TV, show convergence at its most dramatic, but the principle was present in the earliest computers, which could both store data like a book and manipulate it like a calculator.

Ceruzzi’s third theme is the accelerating sophistication of underlying technologies, encapsulated in Moore’s Law: The storage capacity of computer memory chips doubles every eighteen months.

The first computer I ever worked with was a LEO 326 owned by Britain’s telephone monopoly. LEO was a bungalow-size behemoth with 32 kilobytes of memory. That was in 1969, which is 29 Moore’s Law cycles ago; so applying the law, an equivalent machine today should sport 17 terabytes of memory—not far off for a comparable installation. Other technological indices—processor speed, channel capacity, price—show similar trends (inverse, in the case of price).
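
For readers inclined to check that arithmetic, here is a rough sketch in Python, assuming the eighteen-month doubling period and the 1969 starting point given above:

    # Back-of-the-envelope Moore's Law projection: start from the 32 KB
    # LEO 326 of 1969, double the capacity every eighteen months, and see
    # where a comparable machine lands in 2013.
    start_year = 1969
    end_year = 2013
    doubling_period = 1.5  # years per doubling

    cycles = (end_year - start_year) / doubling_period  # about 29 cycles
    projected_kb = 32 * 2 ** round(cycles)              # 32 KB doubled 29 times
    projected_tb = projected_kb / 1e9                    # 1 TB = 10**9 KB

    print(f"Cycles since 1969: {cycles:.1f}")            # 29.3
    print(f"Projected memory: {projected_tb:.0f} TB")    # roughly 17 TB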

Fourth and last of these themes is what we now call the “user interface”: the way we interact with our digital, convergent, ever more sophisticated gadgets. This theme opens up into a broad terrain of speculation, at the far borders of which dwell the prophets of the singularity, an imagined point of future time at which the gadgets replace us altogether.

Ceruzzi resists the temptation to speculate. He sticks to the history, showing how user interface issues have been with us from the beginning—in the design of antiaircraft guns during World War II, for instance.

An engineer’s job was not finished once a machine was designed; he or she then had to fit that device into a human context—in that case, of newly recruited soldiers and sailors who had little background in advanced technology yet who were being asked to operate sophisticated radar and other electronic devices.

Recent developments, down to Facebook and, yes, Twitter, are adequately covered, with frequent reminders of continuity in those four major themes. Why, for example, are some websites more successful than others? Simplicity and lack of clutter, says Ceruzzi, as illustrated by the Google search screen.

Two other Web sites [sic] that consistently rank among the most visited, Wikipedia and Craigslist, also have a text-oriented design with few frills…. These designs may seem a long way from the human factors work done during World War II on antiaircraft fire control devices, but they are the spiritual descendants.

The author also steers clear of the longstanding but petty controversy about which was the first modern electronic computer. The closest he gets is a reminder, when discussing the 1946 ENIAC machine, that the “C” in that acronym stands for “computer,” a word that up to that point (and still in the dictionaries of my childhood) referred to a human being—usually female—employed to carry out repetitive computations on mechanical adding machines.

Computing: A Concise History is neither more nor less than what it promises: an outline handbook on its topic, comprehensive yet parsimonious. The only fault I can find is somewhat of a bias toward hardware as against software. The author might have squeezed in a little more on the history of programming languages. It is possible, though, that my own bias as an old programmer (first language: ALGOL) is showing here.

I also regret that there is no mention of the LEO 326, a noble machine in its time and a triumph of British engineering. It was the last of a line originated by forward-looking managers in, of all places, the Lyons chain of tea shops. The assembler language was named Intercode, a cause for much ribald humor among programmers. The processor included a speaker so that you could hear your code executing. The operating system was named GEORGE. You never forget your first.
