In 2001, I ate lunch with Doug Engelbart—the guy who invented the mouse. Too bad I didn’t know anything about him at the time.
I was attending the annual meeting of my professional association, the Society for the History of Technology, in Pasadena, California. Turned out that Mr. Engelbart was an invited speaker at the conference, scheduled to talk later in the day when I first encountered him. I spied this person whom I didn’t know (my society is small, and I know most of the folks who come to the meetings) seated at a table by himself eating lunch. I felt bad that he didn’t have any company, so I sat across from him and started talking. He was quite friendly and modest, and though I asked him about himself, he didn’t tell me much. Only later did I realize that I had been talking to a legend in new media history.
Had I known more, perhaps I would have asked him whether he believed he had realized his goal of making life less complex through the use of computers. To be sure, he devised technologies and techniques that made it much easier for the average Joe (and even the unaverage Josephine who works professionally with computers) to interact with machines and to create new tools. But even in the world of high tech, have his tools made things less complex?
One can certainly point to examples in which his vision has come true, such as the devices we use to retrieve information and do repeated tasks. And of course, who would give up his or her word processor (with the graphical interface we now take for granted) for a typewriter?
But in some cases, the use of computers has made people think they can do complex things more easily when, in fact, they can’t. Consider technologies such as nuclear power plants, which are inherently complex and which sociologist Charles Perrow says are among several technologies that are essentially unknowable. Worse than that, they cannot be designed to avoid having accidents. In fact, they are destined to have what he calls “normal accidents.”
Normal accidents occur in systems in which “the parts are highly interactive, or ‘tightly coupled,’ and the interaction amplifies the effects in incomprehensible, unpredictable, unanticipated, and unpreventable ways.” (Charles Perrow, “Normal Accident at Three Mile Island,” Society 18, no. 5 (1981): 17-26; also see Charles Perrow, Normal Accidents: Living with High-Risk Technologies [New York: Basic Books, 1984].) He argues that no human or computer can anticipate all the interactions that can possibly occur in such a system, leading to inevitable accidents. Many of these accidents already have occurred, some with tragic consequences, such as at the Three Mile Island nuclear power plant in 1979, at the Bhopal chemical plant in 1984, in electric power systems (which collapsed in parts of the US in 1965, 1971, and 2003), and so on.
While one can quibble with some of Perrow’s arguments, he suggests persuasively (to my mind, at least) that no matter how hard one tries, humans are unlikely to understand the consequences of every interaction among a large number of components in a system. Even the fanciest computer needs to be programmed by a human being, and that human can’t imagine every way in which a physical system’s components may intensify a mistake or defeat the best efforts of a human operator.
So, Doug, I wish I had known then what I know now about your work, so we could have had a more engaging talk 13 years ago. I take the blame for my ignorance. Sorry about that. Let’s hope, though, that the next time I eat lunch with Charles Perrow, I’ll be able to ask him whether he thinks your work has made technology less complex and less prone to screw up.