November 28, 2013

LEO 326


Ceruzzi, Paul E. Computing: A Concise History. Cambridge, MA: The MIT Press (Essential Knowledge Series), 2012.

This time last year all I was hearing about was MOOCs (Massive Open Online Courses), in which university-level instruction, sometimes by big-name lecturers, is provided free over the Internet to anyone who wants it. Some visionaries were talking about MOOCs eventually bankrupting traditional universities.

Apparently that’s not going to happen. There is a niche for MOOCs, but it’s much smaller than advertised. The big players are still in there, though, and Massachusetts Institute of Technology is the biggest. MIT has been putting undergraduate courses online since 2002, and last year they partnered with Harvard, Berkeley, and other universities in a full-scale MOOC venture named edX.

The handbooks in the MIT Press Essential Knowledge Series have a MOOCish look to them, and I’m guessing they’re meant as supplementary reading for MOOCs.


That, I think, is the context for Paul Ceruzzi’s history of computing. The main topic is covered in a brisk 150 pages or so of plain text, illustrated with a scattering of black-and-white photographs and diagrams. Even this book’s cover is all in black, white, and gray, though other volumes in the series are a tad more colorfully bound. The book, which comes in paperback and e-book, is modestly priced.

The problem for anyone writing a history of computing is, as Ceruzzi says, that “new developments transform the field while one is writing, thus rendering obsolete any attempt to construct a coherent narrative.” Five years ago almost no one had heard of Twitter; now it seems to be a cornerstone of our civilization.

Ceruzzi tackles this problem by organizing his material around four big themes that have dominated the history of computing so far and should, he says, be manifest in future developments.

The first big theme is the digital paradigm. We forget how new this is. Within living memory the boldest new technologies were all analog: radio, movies, vinyl disks, the slide rule, then radar and TV. Digital information (Morse code, the abacus) was old hat. The unexpected triumph of the digital paradigm across the past seven decades is a key part of this story.

Second is the convergence of different technologies for communication, calculation, data storage, and the control of operations. Smart phones, in which have converged the camera, phonograph, computer, radio, and TV, show convergence at its most dramatic, but the principle was present in the earliest computers, which could both store data like a book and manipulate it like a calculator.

Ceruzzi’s third theme is the accelerating sophistication of underlying technologies, encapsulated in Moore’s Law: The storage capacity of computer memory chips doubles every eighteen months.

The first computer I ever worked with was a LEO 326 owned by Britain’s telephone monopoly. LEO was a bungalow-size behemoth with 32 kilobytes of memory. That was in 1969, which is 29 Moore’s Law cycles ago; so applying the law, an equivalent machine today should sport 17 terabytes of memory, which is not far off for a comparable installation. Other technological indices (processor speed, channel capacity, price) show similar trends (inverse, in the case of price).
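For readers who want to check that back-of-the-envelope figure, here is a minimal sketch of the arithmetic, assuming only the numbers quoted above (32 kilobytes in 1969, an 18-month doubling period, and 2013 as the endpoint); it is an illustration, not anything from Ceruzzi’s book.

```python
# Rough Moore's Law check, using only the figures quoted in the text.
start_bytes = 32 * 1024            # LEO 326: 32 kilobytes of memory (1969)
months_elapsed = (2013 - 1969) * 12
cycles = months_elapsed // 18      # complete 18-month doubling cycles -> 29

today_bytes = start_bytes * 2 ** cycles
print(f"{cycles} cycles -> about {today_bytes / 1e12:.1f} terabytes")
# prints: 29 cycles -> about 17.6 terabytes
```

Twenty-nine doublings of 32 kilobytes works out to roughly 17.6 terabytes, which matches the 17-terabyte figure in the paragraph above.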

