Sunday, August 19, 2018

Safety culture or safety behavior?


Andrew Hopkins is a much-published expert on industrial safety who has an important set of insights into the causes of industrial accidents. Much of his career has focused on the oil and gas industry, but he has written on other sectors as well. Particularly interesting are several books: Failure to Learn: The BP Texas City Refinery Disaster; Disastrous Decisions: The Human and Organisational Causes of the Gulf of Mexico Blowout; and Lessons from Longford: The ESSO Gas Plant Explosion. He also provides a number of interesting working papers here.

One of his interesting working papers is on the topic of safety culture in the drilling industry, "Why safety cultures don't work" (link).
Companies that set out to create a “safety culture” often expend huge amounts of resource trying to change the way operatives, foremen and supervisory staff think and feel about safety. The results are often disappointing. (1)
Changing the way people think is nigh impossible, but setting up organizational structures that monitor compliance with procedure, even if that procedure is seen as redundant or unnecessary, is doable. (3)
Hopkins' central point is that safety requires changing routine behavior, not, in the first instance, changing culture or thought. This means that management and regulatory agencies need to establish safe practices and then enforce compliance through internal and external measures. He uses the example of seat belts: campaigns encouraging their use had little effect, but behavior changed once fines were imposed on drivers who failed to wear them.

His central focus here, as in most of his books, is on the processes involved in the drilling industry. He makes the point that the incentives that are established in oil and gas drilling are almost entirely oriented towards maximizing speed and production. Exhortations towards "safe practices" are ineffectual in this context.

Much of his argument here comes down to the contrast between high-likelihood, low-harm accidents and low-likelihood, high-harm accidents. The steps required to prevent low-likelihood, high-harm accidents are generally not visible in the workplace, precisely because the sequences that lead to them are highly uncommon. Routine safety procedures will not reduce the likelihood of occurrence of the high-harm accident.

Hopkins offers the example of the air traffic control industry. The ultimate disaster in air traffic control is a mid-air collision. Very few such incidents have occurred; the one Hopkins refers to is the 2002 mid-air collision over Überlingen, Germany. But procedures in air traffic control give absolute priority to preventing such disasters, and the solution is to identify a key precursor event to a mid-air collision and to ensure that these precursor events are recorded, investigated, and reacted to whenever they occur. The required separation between aircraft is 2 miles; the relevant precursor event is a proximity of 1.5 miles or less, and air traffic control regulations and processes require a full investigation and response for every such incident. Air traffic control is a high-reliability industry precisely because it gives priority and resources to the prevention, not only of the disastrous incidents themselves, but of the precursors that may lead to them. "This is a clear example of the way a high-reliability organization operates. It works out what the most catastrophic event is likely to be, regardless of how rare such events are in recent experience, and devises good indicators of how well the prevention of that catastrophe is being managed. It is a way of thinking that is highly unusual in the oil and gas industry" (2).
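The separation-monitoring rule described here can be expressed as a simple classifier. The 2-mile requirement and the 1.5-mile investigation trigger come from Hopkins' example; the function name and the intermediate "loss of separation" category are illustrative assumptions, a minimal sketch rather than an account of actual air traffic control software.

```python
# Sketch of the precursor-event rule for aircraft separation.
# Thresholds are taken from the text: required separation is 2 miles,
# and a proximity of 1.5 miles or less is the precursor event that
# must be recorded and investigated. Names are assumptions.

REQUIRED_SEPARATION = 2.0   # miles: minimum allowed distance
PRECURSOR_THRESHOLD = 1.5   # miles: triggers a full investigation

def classify_separation(distance_miles: float) -> str:
    """Classify an observed distance between two aircraft."""
    if distance_miles <= PRECURSOR_THRESHOLD:
        return "investigate"          # precursor event: record, investigate, react
    if distance_miles < REQUIRED_SEPARATION:
        return "loss_of_separation"   # below standard but above the trigger
    return "ok"
```

The point of the sketch is the ordering of the checks: the rare, catastrophic-precursor condition is tested first and always wins, mirroring the priority a high-reliability organization gives it.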

The drilling industry does not commonly practice this kind of high-level safety management. The incident of greatest concern in drilling is a well blowout, and there are, according to Hopkins, obvious precursor events to a blowout: well kicks and cementing failures. It is Hopkins' contention that safety in the drilling industry would be greatly enhanced, with respect to catastrophes that are both low-probability and high-harm, if procedures were reoriented so that priority attention and tracking were given to these precursor events. By reducing or eliminating the occurrence of the precursors, major accidents would be prevented.
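Hopkins' proposal, treating well kicks and cementing failures as reportable precursor events, could be sketched as a minimal tracking log. The two precursor types come from the text; the class, method names, and well identifier are hypothetical illustrations of what "priority attention and tracking" might mean organizationally.

```python
from collections import Counter

# Minimal sketch of precursor-event tracking for drilling operations.
# "well_kick" and "cementing_failure" are the precursors Hopkins
# names; all other names and structure are illustrative assumptions.
PRECURSOR_EVENTS = {"well_kick", "cementing_failure"}

class PrecursorLog:
    def __init__(self):
        self.counts = Counter()    # (well_id, event) -> number of occurrences
        self.to_investigate = []   # queue of precursor events awaiting review

    def record(self, well_id: str, event: str) -> bool:
        """Record an event; return True if it is a tracked precursor."""
        if event not in PRECURSOR_EVENTS:
            return False
        self.counts[(well_id, event)] += 1
        self.to_investigate.append((well_id, event))
        return True
```

The design choice worth noting is that every precursor is queued for investigation unconditionally, just as every sub-1.5-mile separation is investigated in air traffic control, rather than being left to the discretion of managers whose incentives favor speed and production.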

Another organizational factor that Hopkins highlights is the role that safety officers play within the organization. In high-reliability organizations, safety officers have an organizationally privileged role; in low-reliability organizations their voices seem to disappear in the competition among many managerial voices with other interests (speed, production, public relations). (This point is explored in an earlier post; link.)
Prior to Macondo [the Deepwater Horizon oil spill], BP’s process safety structure was decentralized. The safety experts had very little power. They lacked strong reporting lines to the centre and answered to commercial managers who tended to put production ahead of engineering excellence. After Macondo, BP reversed this. Now, what I call the “voices of safety” are powerful and heard loud and clear in the boardroom. (3)
Ominously, Hopkins makes a prescient point about the crucial role played by regulatory agencies in enhancing safety in high-risk industries.
Many regulatory regimes, however, particularly that of the US, are not functioning as they ought to. Regulators need to be highly skilled and resourced and must be able to match the best minds in industry in order to have competent discussions about the risk-management strategies of the corporations. In the US they're not doing that yet. The best practice recognized worldwide is the safety case regime, in use in UK and Norway. (4)
Given the militantly anti-regulatory stance of the current US federal administration and the aggressive lack of attention its administrators pay to scientific and technical expertise, this is a very sobering source of worry about the future of industrial, chemical, and nuclear safety in the US.
