Wednesday, March 26, 2008

Explaining technology failure

Technology failure is often spectacular and devastating -- witness Bhopal, Three Mile Island, Chernobyl, the Challenger disaster, and the DC-10 failures of the 1970s. But in addition to being a particularly important cause of human suffering, technology failures are often very complicated social outcomes that involve a number of different kinds of factors. And this makes them interesting topics for social science study.

It is fairly common to attribute spectacular failures to a small number of causes -- for example, faulty design, operator error, or a conjunction of unfortunate but singly non-fatal accidents. What sociologists who have studied technology failures have added is the insight that the root causes of disastrous failures can often be traced back to deficiencies in the social organizations within which the technologies are designed, used, and controlled (Charles Perrow, Normal Accidents: Living with High-Risk Technologies). Technology failures are commonly the result of specific organizational defects; so technology failure is often, even usually, a social outcome, not simply a technical or mechanical misadventure. (Dietrich Dorner's The Logic of Failure: Recognizing and Avoiding Error in Complex Situations is a fascinating treatment of a number of cases of failure; Eliot Cohen's Military Misfortunes: The Anatomy of Failure in War provides an equally interesting treatment of military failures -- for example, the American failure to suppress submarine attacks on merchant shipping off the US coast in the early part of World War II.)

First, a few examples. The Challenger space shuttle was destroyed as a result of O-rings in the rocket booster units that became brittle because of the low launch temperature -- evidently an example of faulty design. But various observers have asked the more fundamental question: what features of the science-engineering-launch command process in place within NASA, and between NASA and its aerospace suppliers, led it to break down so profoundly (Diane Vaughan, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA)? What organizational defects made it possible for this extended group of talented scientists and engineers to come to the decision to launch over the specific warnings brought forward by the booster manufacturer's team about the danger of a cold-temperature launch? Edward Tufte attributes the failure to poor scientific communication (Visual Explanations: Images and Quantities, Evidence and Narrative); Morton Thiokol engineer Roger Boisjoly attributes it to an excessively hierarchical and deferential relation between the engineers and the launch decision-makers. Either way, features of the NASA decision-making process -- social-organizational features -- played a critical role.

Bhopal represents another important case. Catastrophic failure of a Union Carbide pesticide plant in Bhopal, India in 1984 led to a release of a highly toxic gas. The toxic cloud passed into the densely populated city of Bhopal. Half a million people were affected, and between 16,000 and 30,000 people died as a result. A chemical plant is a complex physical system. But even more, it is operated and maintained by a complex social organization, involving training, supervision, and operational assessment and oversight. In his careful case study of Bhopal, Paul Shrivastava maintains that this disaster was caused by a set of persistent and recurring organizational failures, especially in the areas of training and supervision of operators (Bhopal: Anatomy of Crisis).

Close studies of the nuclear disasters at Chernobyl and Three Mile Island have been equally fruitful in shedding light on the characteristics of social, political, and business organization that played a role in causing these disasters. The stories are different in the two cases; but in each case, it turns out that social factors, including both organizational features internal to the nuclear plants and political features in the surrounding environment, played a role in the occurrence and eventual degree of destruction associated with the disasters.

These cases illustrate several important points. First, technology failures and disasters almost always involve a crucial social dimension -- in the form of the organizations and systems through which the technology is developed, deployed, and maintained, and the larger social environment within which the technology is situated. Technology systems are social systems. Second, technology failures therefore constitute an important subject matter for sociological and organizational research. Sociologists can shed light on the ways in which a complex technology might fail. And third, and most importantly, the design of safe systems -- particularly systems that have the potential for creating great harms -- needs to be an interdisciplinary effort. The perspectives of sociologists and organizational theorists need to be incorporated as deeply as those of industrial and systems engineers into the design of systems that will preserve a high degree of safety. This is an important realization for high-profile, high-risk industries -- aviation, chemicals, nuclear power. But it is also fundamental for other important social institutions, especially hospitals and health systems. Safe technologies will only exist when they are embedded in safe, fault-tolerant organizations and institutions. And all of this means, in turn, that there is an urgent need for a sociology of safety.
