Monday, December 18, 2023

Mistakes by organizations


In 1964 Jim Marshall, a defensive player for the Minnesota Vikings, made a famous mistake: he recovered a San Francisco 49ers fumble and ran it into the end zone, at the wrong end of the field. In the early 1990s the US Congress made a mistake by ordering continued development of the Osprey VTOL aircraft. Did these two "actors" do the same sort of thing? Is an organization's mistake similar to an individual's mistake? At a superficial level it is easy enough to agree that they are: in each case the wrong outcome resulted from a series of apparently intentional and calculated actions. But a closer look makes it clear that they are not so similar after all.

An individual's mistake arises in a situation where a quasi-rational actor decides on an action based on the consequences he or she hopes to bring about. The actor is "intentional": he or she has a plan for bringing about a desired consequence or benefit, has calculated a sequence of actions designed to achieve the goal, and has estimated the circumstances within which he or she acts over time. A mistake happens when the actor miscalculates something that he or she should have calculated correctly: the way one action can be expected to lead to an intermediate outcome, the features of the environment in which the action is to be carried out, the predictable events that might interfere with the sequence of actions and their intended outcome. Miscalculation is the essence of individual mistakes. The actor is a unified perceiver, planner, and agent who chooses a sequence of actions designed to achieve the goal, but miscalculates some part of the underlying assumptions guiding the action.

Is miscalculation the primary source of mistakes when a complex organization's strategy goes awry? Sometimes. Lyndon Johnson miscalculated the goals and reasoning of Ho Chi Minh and escalated US involvement in the Vietnam War. But the most interesting causes of organizational mistakes have little to do with miscalculation. The reason is that organizations, unlike individuals, are not unified perceivers, planners, and actors. Instead, organizations are loose configurations of lower-level actors who are only weakly coordinated by a single managing intelligence, the top-level executive. Loose linkages across the sub-units of an organization raise the possibility that each sub-unit is approximately rational, and yet the aggregate result of their complex interaction is quite different from what the key executive intended. In the case of the design of the Ford Pinto, the top corporate executive did not intend to release a vehicle design that endangered passenger safety, and yet a series of loose linkages across units led to exactly that outcome.
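The logic of this point can be shown in a toy sketch (a hypothetical illustration of loose linkage, not a reconstruction of the actual Pinto decision process; all names and numbers are invented). Each sub-unit below acts rationally against its own local objective, and yet the composed result defeats the executive's safety goal.

```python
# Toy sketch of "locally rational, globally wrong" decision-making.
# All names and numbers are hypothetical illustration, not historical data.

fuel_tank_designs = [
    {"name": "reinforced", "unit_cost": 11.00, "rear_impact_safe": True},
    {"name": "baseline",   "unit_cost": 0.00,  "rear_impact_safe": False},
]

# Cost-engineering team: scored only on unit cost, so it rationally
# chooses the cheapest design.
chosen = min(fuel_tank_designs, key=lambda d: d["unit_cost"])

# Safety team: produces an accurate crash-test report, but (poor
# information-sharing) the report never feeds into the cost decision.
crash_report = {d["name"]: d["rear_impact_safe"] for d in fuel_tank_designs}

# The executive intended a safe vehicle; the aggregate outcome differs.
print("design shipped:", chosen["name"])                 # baseline
print("safety goal met:", crash_report[chosen["name"]])  # False
```

No sub-unit in this sketch miscalculates anything; the failure lives entirely in the linkages between them.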

Several key organizational dysfunctions have been identified that contribute to organizational mistakes even when each sub-unit is acting rationally. Dysfunctions discussed in earlier posts can all lead to organizational failures: principal-agent problems, conflicting cognitive frameworks, conflicting local priorities, external pressures on decision makers, and poor communication and information-sharing. (A New Social Ontology of Government discusses these dysfunctions in greater detail.)

It is clear, then, that an organization's mistakes are often quite different from the mistakes made by a reasonably rational individual. They often derive from dysfunctions that appear to be systemic in organizations, and from the important fact that organizations are unavoidably dis-unified. Intentions, information, belief formation, cognitive framing, and coordination of underlying assumptions all depend on separate teams of decision makers and actors, and large organizations often miss the mark with their decision processes precisely because of this fact. Sources of bad collective or corporate decisions include conflicting priorities and interpretations of the action environment, principal-agent problems, imperfect communication and information-sharing, slow "updating" of knowledge of the action environment, and unintended consequences of one line of action that interfere with other lines of action. In the end the organization fails to accomplish its goal, and from the outside the result looks like a series of incomprehensible blunders.

The public diagnosis of governmental and corporate "mistakes" is often a simple one: "mistakes were made," with the implication that more intelligent or experienced managers would have been more successful. But this impression is often wrong. Intelligent people in different parts of the organization made resourceful and resilient efforts to carry out their parts of the plan, and yet the compound of these sub-actions turned out to be stunningly ineffective. Dien Bien Phu was a military disaster for the French army in Indochina, and yet there were reasons for each intermediate decision that led to the eventual debacle.

This suggests that citizens and policy makers need to think about organizational errors differently from mistakes made by individuals. Organizations need to be made more "disaster-resistant," so that the dysfunctions mentioned here are less likely to result in catastrophic failure. "Be more careful" is not useful advice. Instead, organizational designers and leaders need to take specific measures to soften the potential impact of information failures, conflicting cognitive frames, and conflicting priorities in different parts of the organization. Redundancy is one potential source of resilience. Better training in procedures and cognitive frameworks is another. (For example, accidents have occurred in nuclear fuel processing plants because workers were not taught how the geometry of a holding vessel affects the critical mass of liquids containing dissolved radioactive material; Atomic Accidents.) And we need to bear in mind always that the loose linkages and weak forms of intentionality that are unavoidable features of large organizations pose permanent risks for effective organizational action.


3 comments:

Paul D. Van Pelt said...

Well, it seems to me the fumble recovery was a momentary loss of responsive consciousness, or a dumb move in the heat of the moment. The VTOL was more complex. Large sums of money are problematic because of Interests, Motives, and Preferences... IMPs. Military industrialism entails lots of money, and successful projects beget more of the same. The technology behind the VTOL had some issues with aerodynamics and, probably, thermodynamics as well. Certain IMPish powers, no doubt, had and exerted influence over development of the aircraft. A lot was riding on the Osprey. The people riding in it were, as usual, expendables. People generally are.

Marcelo said...

I wonder if "mistakes" might just not be applicable to organizations, then. Using the top executive's goal structure as the organization's feels like an arbitrary shorthand. It's not clear that it's always, or even usually, possible to find a partial order (or quasi-metric) on the states of the world that maps more or less monotonically to the preferences of all or most members of an organization. (We can use this as a definition of organizational alignment; the fact that alignment is such a fetish in management perhaps suggests how rare it is!) And without such a global partial order, there's no way to characterize what counts as a mistake. The executive building up their project buzzword portfolio for their next job, the low-level employee trying to keep their job until the market improves, and the CEO trying to raise the stock price short-term just before their options vest might have entirely different outcome preferences for any given project or action, so it's hard to say whether any outcome was a mistake or not. At least from a mathematical point of view, I don't know how I'd model that without a frame of reference that would probably be inapplicable to most actors inside.
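To pin down what I mean (my own notation, and only a sketch):

```latex
% A minimal formalization of "alignment"; illustrative notation only.
Let $W$ be the set of outcome states, $N$ the members of the organization,
and $\succsim_i$ member $i$'s preference preorder on $W$. Say the
organization is \emph{aligned} if there exists a partial order $\succsim$
on $W$ that maps monotonically into the members' preferences:
\[
  w \succsim w' \;\Longrightarrow\; w \succsim_i w'
  \quad \text{for all (or most) } i \in N .
\]
Relative to such a $\succsim$, an organizational \emph{mistake} is an
action whose outcome $w$ is strictly dominated by the outcome $w'$ of an
available alternative: $w' \succ w$. If no global $\succsim$ exists, the
predicate "was a mistake" has no organization-level referent.
```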

> In the case of the design of the Ford Pinto, the top corporate executive did not intend to release a vehicle design that endangered passenger safety,

On the other hand, individual actions by executives in this sort of situation often don't, in retrospect, map to safety being a priority: whenever you cut costs by reducing safety processes with the goal of improving short-term financial metrics for immediate or short-term personal financial benefit, you are very clearly revealing a preference, even when you're increasing the probability of disaster rather than making a specific one unavoidable. It might not have been their global optimum across outcomes, but I find it hard to model the situation as a mistake on their part, since it's still, generally speaking, a better outcome for them than the safety-and-no-bonus outcome. One could make a partial-rationality argument, but that would be stretching charity too far, IMHO. Some forms of ignorance are willful enough not to be credible.
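Just to make the revealed-preference point concrete, with completely made-up numbers:

```latex
% Illustrative numbers only; nothing here is drawn from an actual case.
Options for the executive: cut safety ($A$) or keep it ($B$). Let $b$ be
this year's bonus for hitting the cost target, $p$ the added probability
of disaster, and $c$ the executive's \emph{personal} cost if disaster
strikes (firing, reputation), not the firm's or the victims' cost.
\[
  \mathbb{E}[u(A)] = b - p\,c , \qquad \mathbb{E}[u(B)] = 0 .
\]
With, say, $b = \$1{,}000{,}000$, $p = 0.01$, and $c = \$5{,}000{,}000$:
$\mathbb{E}[u(A)] = \$950{,}000 > 0$, so choosing $A$ is rational for the
executive, not a mistake, even though it is terrible relative to the
organization's stated goals.
```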

You're literally the expert on this, I'm just spitballing; but I'm also wondering how, or whether, this applies to issues of moral responsibility for, e.g., tobacco and fossil fuel executives. If organizations can't make mistakes, maybe they can't commit moral crimes either, and there's no such thing as an organizational responsibility veil.

Paul D. Van Pelt said...

Expert? No. Thinker, yes.