Computing

Paul E. Ceruzzi

With subplots featuring IBM, Microsoft, Apple, Facebook, and Twitter, the history of computing can be told as the story of hardware and software, the Internet, or “smart” handheld devices. Computer historian Paul Ceruzzi offers a broader and more useful perspective in this compact, accessible overview of the invention and development of digital technology. He identifies four major threads that run through the whole of computing’s technological development: digitization, the coding of information, computation, and control in binary form, ones and zeros; the convergence of multiple streams of techniques, devices, and machines that yields more than the sum of their parts; Moore’s Law, which describes the steady advance of electronic technology; and the human-machine interface.

Ceruzzi walks us through the development of the punch card, devised for use in the 1890 US Census, and explains how a Bell Labs mathematician coined the term “digital” in 1942 to describe a fast calculation technique used in anti-aircraft weaponry. He discusses the ENIAC, built for scientific and military applications; the UNIVAC, the first commercial general-purpose computer; and the ARPANET, the forerunner of the Internet. Ceruzzi’s account traces the world-transforming evolution of the computer from a room-sized assemblage of machinery to a “minicomputer” to a desktop computer to a smartphone that fits in your pocket. He describes how the silicon chip made possible ever-smaller devices that store ever-greater amounts of data. He visits Silicon Valley, the epicenter of this innovation, and brings the story up to date with accounts of the Internet, the World Wide Web, and social networking.