In 1964 Jim Marshall, a defensive player for the Minnesota Vikings, made a famous mistake: he recovered a San Francisco 49ers fumble and ran it into the end zone – at the wrong end of the field. In the early 1990s the US Congress made a mistake by ordering continued development of the Osprey VTOL aircraft. Did these two “actors” do the same sort of thing? Is an organization’s mistake similar to an individual’s mistake? At a superficial level it is easy enough to agree that these are the same kinds of things. In each case the wrong outcome resulted from a series of apparently intentional and calculated actions. But a closer look makes it clear that they are not so similar after all.
An individual’s mistake arises in a situation where a quasi-rational actor decides on an action based on the consequences he or she hopes to bring about. The actor is “intentional” — he or she has a plan for bringing about a desired consequence or benefit, has calculated a sequence of actions designed to achieve the goal, and has estimated the circumstances within which the action will unfold over time. A mistake happens when the actor miscalculates something that he or she should have calculated correctly — the way one action can be expected to lead to an intermediate outcome, the features of the environment in which the action is to be carried out, the predictable events that might interfere with the sequence of actions and their intended outcome. Miscalculation is the essence of individual mistakes. The actor is a unified perceiver and planner who chooses a sequence of actions designed to achieve the goal, but miscalculates some part of the underlying assumptions guiding the action.
Is miscalculation the primary source of mistakes when a complex organization’s strategy goes awry? Sometimes. Lyndon Johnson miscalculated the goals and reasoning of Ho Chi Minh and escalated US involvement in the Vietnam War. But the most interesting causes of organizational mistakes have little to do with miscalculation. The reason for this is that organizations, unlike individuals, are not unified perceivers, planners, and actors. Instead, organizations are loose configurations of lower-level actors who are only weakly coordinated by a single managing intelligence – a top-level executive. Loose linkages across sub-units of an organization raise the possibility that each sub-unit is approximately rational, and yet the aggregate result of the complex interaction is quite different from what was intended by the key executive. In the case of the design of the Ford Pinto, the top corporate executive did not intend to release a vehicle design that endangered passenger safety, and yet a series of loose linkages across units led to exactly that outcome.
Several key organizational dysfunctions have been identified that contribute to organizational mistakes even when each sub-unit is acting rationally. Dysfunctions that have been discussed in earlier posts can all lead to organizational failures: principal-agent problems, conflicting cognitive frameworks, conflicting local priorities, external pressures on decision makers, and poor communication and information-sharing. (A New Social Ontology of Government discusses these dysfunctions in greater detail.)
It is clear, then, that an organization’s mistakes are often quite different from the mistakes made by a reasonably rational individual. They often derive from dysfunctions that appear to be systemic in organizations, and from the important fact that organizations are unavoidably dis-unified. Intentions, information, belief formation, cognitive framing, and coordination of underlying assumptions all depend on separate teams of decision makers and actors, and large organizations often miss the mark with their decision processes precisely because of this fact. Sources of bad collective or corporate decisions include conflicting priorities and interpretations of the action environment, principal-agent problems, imperfect communication and information-sharing, slow “updating” of knowledge of the action environment, and unintended consequences of one line of action that interfere with other actions. In the end the organization fails to accomplish its goal, and from the outside the result looks like a series of incomprehensible blunders.
The public diagnosis of governmental and corporate “mistakes” is often a simple one — “mistakes were made” — with the implication that more intelligent or experienced managers would have been more successful. But this impression is often mistaken. Intelligent people in different parts of the organization made resourceful and resilient efforts to carry out their part of the plan. And yet the compound of these sub-actions turns out to be stunningly ineffective. Dien Bien Phu was a military disaster for the French army in Indochina. And yet there were reasons for each intermediate decision that led to the eventual debacle.
This suggests that citizens and policy makers need to think about organizational errors differently from mistakes made by individuals. Organizations need to be made more “disaster-resistant”, so that the dysfunctions mentioned here have less likelihood of resulting in a catastrophic failure. “Be more careful” is not useful advice. Instead, organizational designers and leaders need to take specific measures to soften the potential impact of information failure, conflicting cognitive frames, and conflicting priorities in different parts of the organization. Redundancy is one potential source of resilience. Better training in procedures and cognitive frameworks is another. (For example, accidents have occurred in nuclear fuel processing plants because workers were not taught about the effect of holding-vessel geometry on the critical mass of liquids containing dissolved radioactive materials; Atomic Accidents.) And we need to bear in mind always that the loose linkages and weak forms of intentionality that are unavoidable features of large organizations pose permanent risks for effective organizational action.