
Thursday, August 25, 2022

Organizational factors and nuclear power plant safety

image: Peach Bottom Nuclear Plant

The Nuclear Regulatory Commission has responsibility for ensuring the safe operations of the nuclear power reactors in the United States, of which there are approximately 100. There are significant reasons to doubt whether its regulatory regime is up to the task. Part of the challenge is the technical issue of how to evaluate and measure the risks created by complex technology systems. Part is the fact that it seems inescapable that organizational and management factors play key roles in nuclear accidents -- factors the NRC is ill-prepared to evaluate. And the third component of the challenge is the fact that the nuclear industry is a formidable adversary when it comes to "intrusive" regulation of its activities. 

Thomas Wellock is the official historian of the NRC, and his work shows an admirable degree of independence from the "company line" that the NRC wishes to present to the public. Wellock's book, Safe Enough?: A History of Nuclear Power and Accident Risk, is the closest thing we have to a detailed analysis of the workings of the commission and its relationships to the industry that it regulates. A central focus in Safe Enough is the historical development of the key tool used by the NRC in assessing nuclear safety, the methodology of "probabilistic risk assessment" (PRA). This is a method for aggregating the risks associated with multiple devices and activities involved in a complex technology system, based on failure rates and estimates of harm associated with failure. 
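
To make the logic of PRA concrete, here is a minimal sketch of the arithmetic such an assessment rests on -- not the NRC's actual models, and with entirely hypothetical sequences and numbers. Each accident sequence is assigned a frequency by multiplying an initiating-event frequency by the conditional failure probabilities of the safety systems that must fail, and the sequences are then summed, optionally weighted by an estimate of the harm each would cause.

# Minimal sketch of the arithmetic underlying probabilistic risk assessment (PRA).
# All sequences and numbers are hypothetical; real PRAs use large event trees and fault trees.

sequences = [
    {"name": "pipe break + emergency cooling fails",
     "initiator_freq": 1e-3, "system_failures": [1e-2], "consequence": 1.0},
    {"name": "loss of offsite power + diesels fail + batteries deplete",
     "initiator_freq": 5e-2, "system_failures": [5e-3, 1e-1], "consequence": 1.0},
    {"name": "feedwater loss + auxiliary feedwater fails",
     "initiator_freq": 1e-1, "system_failures": [1e-3], "consequence": 0.5},
]

def sequence_frequency(seq):
    """Initiating-event frequency times the conditional failure probabilities along the sequence."""
    freq = seq["initiator_freq"]
    for p in seq["system_failures"]:
        freq *= p
    return freq

core_damage_frequency = sum(sequence_frequency(s) for s in sequences)
weighted_risk = sum(sequence_frequency(s) * s["consequence"] for s in sequences)

for s in sequences:
    print(f"{s['name']}: {sequence_frequency(s):.2e} per reactor-year")
print(f"total core damage frequency: {core_damage_frequency:.2e} per reactor-year")
print(f"consequence-weighted risk:   {weighted_risk:.2e} per reactor-year")

The sketch also shows why the method sits uneasily with organizational factors: every contributor needs a numerical failure probability, and an unquantified factor such as a weak safety culture has no natural place in the calculation.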

This preoccupation with developing a single quantitative estimate of reactor safety reflects the engineering approach to technology failure. However, Charles Perrow, Diane Vaughan, Scott Sagan, and numerous other social scientists who have studied technology hazards and disasters have made clear that organizational and managerial failures almost always play a key role in the occurrence of a major accident such as Three Mile Island, Fukushima, or Bhopal. This is the thrust of Perrow's "normal accident" theory and Vaughan's "normalization of deviance" theory. And organizational effectiveness and organizational failures are difficult to measure and quantify. Crucially, these factors are difficult to incorporate into the methodology of probabilistic risk assessment. As a result, the NRC has almost no ability to oversee and enforce standards of safety culture and managerial effectiveness.

Wellock addresses this aspect of an incomplete regulatory system in "Social Scientists in an Adversarial Environment: The Nuclear Regulatory Commission and Organizational Factors Research" (link). The problem of assessing "human factors" has been an important element of the history of the NRC's efforts to regulate the powerful nuclear industry, and failure in this area has left the NRC handicapped in its ability to address pervasive ongoing organizational faults in the nuclear industry. Wellock's article provides a detailed history of efforts by the NRC to incorporate managerial assessment and human-factors analysis into its safety program -- to date, with very little success. And, ironically, the article demonstrates a key dysfunction in the organization and setting of the NRC itself; because of the adversarial relationship that exists with the nuclear industry, and the influence that the industry has with key legislators, the NRC is largely blocked from taking commonsense steps to include evaluation of safety culture and management competence into its regulatory regime.

Wellock makes it clear that both the NRC and the public have been aware of the importance of organizational dysfunctions in the management of nuclear plants since the Three Mile Island accident in 1979. However, the culture of the organization itself makes it difficult to address these dysfunctions. Wellock cites the experience of Valerie Barnes, a research psychologist on staff at the NRC, who championed the importance of focusing attention on organizational factors and safety culture. "She recalled her engineering colleagues did not understand that she was an industrial psychologist, not a therapist who saw patients. They dismissed her disciplinary methods and insights into human behavior and culture as 'fluffy,' unquantifiable, and of limited value in regulation compared to the hard quantification bent of engineering disciplines" (1395). 

The NRC took the position that organizational factors and safety culture could only properly be included in the regulatory regime if they could be measured, validated, and incorporated into the PRA methodology. The question of the quantifiability and statistical validity of human-factors research and safety-culture research turned out to be insuperable -- largely because these were the wrong standards for evaluating the findings of these areas of the social sciences. "In the new program [in the 1990s], the agency avoided direct evaluation of unquantifiable factors such as licensee safety culture" (1395). (It is worth noting that this presumption reflects a thoroughly positivistic and erroneous view of scientific knowledge; link, link. There are valid methods of sociological investigation that do not involve quantitative measurement.) 

After the Three Mile Island disaster, both the NRC and external experts on nuclear safety had a renewed interest in organizational effectiveness and safety culture. Analysis of the TMI disaster made organizational dysfunctions impossible to ignore. Studies by the Battelle Human Affairs Research Center were commissioned in 1982 (1397) to permit the design of a regulatory regime that would evaluate management effectiveness. Here again, however, the demand for quantification and "correlations" blocked the creation of a regulatory standard for management effectiveness and safety culture. Moreover, the nuclear industry was able to resist efforts to create "intrusive" inspection regimes involving assessment of management practices. "In the mid-1980s, the NRC deferred to self-regulating initiatives under the leadership of the Institute for Nuclear Power Operations (INPO). This was not the first time the NRC leaned on INPO to avoid friction with industry" (1397). 

A serious event at the Davis-Besse plant in Ohio in 1983 focused attention on the importance of management, organizational dysfunction, and safety culture, and a National Academy of Sciences report in 1988 once again recommended that the NRC give high priority to these factors -- quantifiable or not (Human Factors Research and Nuclear Safety; link).

The panel called on the NRC to prioritize research into organizational and management factors. “Management can make or break a plant,” Moray told the NRC’s Advisory Committee for Reactor Safeguards. Even more than the man-machine interface, he said, it was essential that the NRC identify what made for a positive organizational culture of reliability and safety and develop appropriate regulatory feedback mechanisms that would reduce accident risk. (1400)

These recommendations led the NRC to commission an extensive research consultancy with a group of behavioral scientists at Brookhaven Laboratory. The goal of this research, once again, was to identify observable and measurable factors of organizations and safety culture that would permit quantification of both of these intangible features of nuclear plants -- and ultimately to permit incorporation of these factors into PRA models. 

 Investigators identified over 20 promising organizational factors under five broad categories of control systems, communications, culture, decision making, and personnel systems. Brookhaven concluded the best measurement methodologies included research surveys, behavioral checklists, structured interview protocols, and behavioral-anchored rating scales. (1401)

However, this research foundered on three problems: the cost of evaluating a nuclear operator on this basis; the "intrusiveness" of the methods needed to evaluate these organizational systems; and the intransigent, adversarial opposition of nuclear plant operators to these kinds of assessment. It also emerged that it was difficult to establish correlations between the organizational factors identified and the safety performance of a range of plants. The NRC backed down from its effort to directly assess organizational effectiveness and safety culture, and instead opted for a new "Reactor Oversight Process" (ROP) that made use only of quantitative factors associated with safety performance (1403).
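
The statistical difficulty is easy to see in outline. With roughly a hundred plants, noisy survey-based factor scores, and rare safety-significant events, any correlation between organizational measures and safety performance is estimated very imprecisely. The following sketch uses simulated data -- not Brookhaven's, and with invented effect sizes -- simply to show the shape of the calculation and how wide the uncertainty remains at this sample size.

# Hypothetical illustration (simulated data, not Brookhaven's) of correlating
# plant-level organizational-factor scores with safety performance.
import math
import random

random.seed(0)
n_plants = 100  # roughly the size of the US reactor fleet

# Latent organizational quality, a noisy survey-based factor score, and a noisy
# event-rate measure that improves modestly with organizational quality.
latent = [random.gauss(0, 1) for _ in range(n_plants)]
factor_score = [q + random.gauss(0, 1) for q in latent]
event_rate = [1.0 - 0.3 * q + random.gauss(0, 1) for q in latent]

def pearson(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(factor_score, event_rate)
# Approximate 95% confidence interval via Fisher's z-transformation.
z = 0.5 * math.log((1 + r) / (1 - r))
se = 1 / math.sqrt(n_plants - 3)
low, high = math.tanh(z - 1.96 * se), math.tanh(z + 1.96 * se)
print(f"estimated correlation: {r:.2f}, 95% CI roughly ({low:.2f}, {high:.2f})")

In runs of this kind the estimated correlation is modest and the confidence interval wide, which suggests why a regulator demanding statistically validated correlations before acting on organizational factors may end up waiting indefinitely.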

A second and more serious incident at the Davis-Besse nuclear plant in 2002 resulted in a near-miss loss-of-coolant accident (link), and investigations by the NRC and the GAO compelled the NRC to bring safety culture back onto the regulatory agenda. Executives, managers, operators, and inspectors were all found to have behaved in ways that greatly increased the risk of a highly damaging loss-of-coolant accident (LOCA) at Davis-Besse. The NRC imposed more extensive organizational and managerial requirements on the operators of the Davis-Besse plant, but these protocols were not extended to other plants.

It is evident from Wellock's 2021 survey of the NRC history of human-factors research and organizational research that the commission is currently incapable of taking seriously the risks to reactor safety created by the kinds of organizational failures documented by Charles Perrow, Diane Vaughan, Andrew Hopkins, Scott Sagan, and many others. NRC has shown that it is aware of these social-science studies of technology system safety. But its intellectual commitment to a purely quantitative methodology for risk assessment, combined with the persistent ability of the nuclear operators to prevent forms of "intrusive" evaluation that they don't like, leads to a system in which major disasters remain a distinct possibility. And this is very bad news for anyone who lives within a hundred miles of a nuclear power plant.


Friday, December 10, 2021

China's food-safety governance system


Food safety is a very high-level concern for ordinary consumers. This is true because the food we eat can poison us or ruin our health, and yet consumers have little ability to evaluate the safety of the foods available in the marketplace. Therefore government regulation of food safety appears to be mandatory in any complex society. 

Regulation requires several things: science-based regulations on the processes and composition of the products being regulated, consistent and disinterested inspection, effective enforcement of regulations and penalties for violations, and oversight by a regulatory agency that is independent of the industry being regulated and insulated from the general political interests of the government within which it exists.

China has experienced many food-contamination scandals in the past twenty years, and food safety is ranked as a high-level concern by many Chinese citizens. In his 2012 post on food safety in China in the Council on Foreign Relations blog (link), Yanzhong Huang writes that "in the spring of 2012, a survey carried out in sixteen major Chinese cities asked urban residents to list 'the most worrisome safety concerns.' Food safety topped the list (81.8%), followed by public security (49%), medical care safety (36.4%), transportation safety (34.3%), and environmental safety (20.1%)". And the public anxiety is well justified (link). In 2012 Bi Jingquan, then the head of the China Food and Drug Administration, testified that "Chinese food safety departments conducted more than 15 million individual inspections in the first three quarters of the year and found more than 500,000 incidents of illegal behavior" (link). Especially notorious is the milk-melamine contamination scandal of 2008, resulting in the hospitalization of over 50,000 affected children and at least six deaths of children and infants.

The question of interest here has to do with China's governmental system of regulation of safety in the food system. What are the regulatory arrangements currently in place? And do these governmental systems provide a basis for a reasonable level of confidence in the quality and safety of China's food products?

Liu, Mutukumira, and Chen 2019 (link) provide a detailed and comprehensive analysis of the evolution of food-safety regulation in China since 1949. This review article is worth studying in detail for the light it sheds on the challenge of establishing an effective system of regulation in a vast population governed by a single-party state. The article is explicit about the food-safety problems that persist on a wide scale in China:

Food safety incidents still occur, including abuse of food additives, adulterated products as well as contamination by pathogenic microorganisms, pesticides, veterinary drug residues, and heavy metals, and use of substandard materials. (abstract)

The authors refer to a number of important instances of widespread food contamination and dangerous sanitary conditions, including "spicy gluten strips" consumed by teenagers.

Liu et al recommend "coregulation" for the China food system, in which government and private producers each play a crucial role in evaluating and ensuring safe food processes and products. They refer to the "Hazard Analysis Critical Control Point (HACCP) system" that should be implemented by food producers and processors (4128), and they emphasize the need in China for a system that succeeds in ensuring safe food at low regulatory cost.

Increasing number of countries uses new coregulation schemes focusing on a specific type of coregulation where regulations are developed by public authorities and then implemented by the coordinated actions of public authorities and food operators or “enforced self-regulation” (Guo, Bai, & Gong, 2019; Rouvière & Caswell, 2012).... Coregulation aims to combine the advantages of the predictability and binding nature of legislation with the flexibility of self-regulatory approaches. (4128)
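
The HACCP approach mentioned above is, at bottom, a monitoring protocol: identify critical control points, set critical limits, record measurements, and trigger corrective action when a limit is breached. Here is a minimal, generic sketch of that logic -- not Liu et al.'s scheme, and with invented control points and limits -- to show what the producer-side half of coregulation amounts to in practice.

# Generic sketch of HACCP-style critical control point (CCP) monitoring.
# The control points, limits, and readings below are invented for illustration.
from dataclasses import dataclass

@dataclass
class CriticalControlPoint:
    name: str
    limit_low: float
    limit_high: float
    unit: str

    def within_limits(self, measured: float) -> bool:
        return self.limit_low <= measured <= self.limit_high

# Hypothetical CCPs for a dairy processing line.
ccps = [
    CriticalControlPoint("pasteurization temperature", 72.0, 80.0, "C"),
    CriticalControlPoint("cold-storage temperature", 0.0, 4.0, "C"),
]

# Hypothetical monitoring records: (CCP index, measured value).
records = [(0, 73.5), (0, 69.0), (1, 3.2), (1, 6.8)]

for idx, value in records:
    ccp = ccps[idx]
    if ccp.within_limits(value):
        print(f"OK    {ccp.name}: {value} {ccp.unit}")
    else:
        # Deviation: hold the product and log a corrective action the regulator can audit.
        print(f"ALERT {ccp.name}: {value} {ccp.unit} outside "
              f"[{ccp.limit_low}, {ccp.limit_high}] -- hold product, log corrective action")

Under coregulation the producer generates and retains records of this kind and the government audits them -- an arrangement that works only if the records are honest and the audits are independent, which is exactly where the concerns discussed below arise.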

Here is their outline of the chronology of food-safety regimes in China since 1949:



Previous posts have discussed some of the organizational dysfunctions associated with "coregulation" and its cognate concepts (link). The failures of design and implementation of the Boeing 737 Max are attributed in large part to the system of delegated regulation used by the Federal Aviation Administration (link, link). And the Nuclear Regulatory Commission too appears to defer extensively to "industry expertise" in its approach to regulation (link). The problems of regulatory capture and weak, ineffective governmental regulatory institutions are well understood in the US and Europe. And this experience supports a healthy skepticism about the likely effectiveness of "coregulation" in China's food system as well. Earlier posts have emphasized the importance of independence of regulatory agencies from both the political interests of the government and the economic interests of the industry that they regulate. This independence appears to be all but impossible in China's governmental structure and Party rule.

Another weakness identified in Liu et al concerns the level and organizational home of enforcement of food-safety regulations. "The supervision of food safety is mainly dependent on law enforcement departments" (4128). This system is organizationally flawed for at least two reasons. First, it implies a lack of coordination, with different jurisdictions (cities, provinces, counties) exercising different levels and forms of enforcement. And second, it raises the prospect of corruption, both petty and grand, in which inspectors, supervisors, and enforcers are induced to look the other way at infractions. This problem was noted in a prior post on fire safety regulation in China (link). The localism inherent in the food safety system in China is evident in Figure 1:



And the authors highlight the dysfunction that is latent in this diagram:

The local government is responsible for food safety information. At the same time, the local government accepts the leadership of the central government and is responsible to the central government, which forms a principal-agent relationship under asymmetric information. Meanwhile, food producers are in a position of information superiority over local governments and are regulated by the local governments. Therefore, the relationship of the central government, local governments, and food producers is multiple principal-agent relationship. Under the standard of fiscal decentralization and political assessment, local governments are both food safety regulatory agencies and regional competitive entities, so the collusion between local governments, or different counties, and enterprises becomes a rational choice (Tirole, 1986). (4134)
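
The collusion claim can be made concrete with a toy expected-payoff comparison -- hypothetical numbers, not drawn from Liu et al. or Tirole. A local government weighs the fiscal, growth, and side-payment benefits of tolerating a violating producer against the expected penalty if the central government detects the collusion; when detection probabilities are low, tolerance dominates.

# Toy expected-payoff comparison for a local regulator deciding whether to
# enforce against a violating food producer or to collude with it.
# All numbers are hypothetical and purely illustrative.

growth_benefit = 10.0      # fiscal revenue / growth / promotion credit from keeping the firm running
side_payment = 2.0         # bribe or informal benefit from the firm
penalty_if_caught = 50.0   # sanction from the central government if collusion is exposed
enforcement_payoff = 3.0   # credit for visible enforcement, net of lost local output

def collusion_payoff(p_detect: float) -> float:
    """Expected payoff of tolerating the violation at a given detection probability."""
    return growth_benefit + side_payment - p_detect * penalty_if_caught

for p in (0.05, 0.10, 0.20, 0.30):
    choice = "collude" if collusion_payoff(p) > enforcement_payoff else "enforce"
    print(f"p(detect)={p:.2f}: expected collusion payoff={collusion_payoff(p):5.1f} -> {choice}")

Because the central government observes local conditions only through the information that local governments and producers supply, detection probabilities tend to stay low, and the quoted passage's conclusion -- that collusion becomes a rational choice -- follows directly.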

It appears incontrovertible that "publicity" is an important factor in enhancing safety in any industry. If the public is informed about incidents -- whether food-safety failures, chemical plant spills, or nuclear disasters -- its concerns can lead to full and rigorous accident investigation and process changes supporting greater safety in the future. Conversely, if the government suppresses the news media's ability to provide information about these kinds of incidents, there is much less public pressure leading to more effective safety regulation. Chinese leaders' determination to tightly control the flow of information is decidedly harmful to the goal of increasing food safety and other dimensions of environmental safety.

Liu et al describe the progression of food safety laws and policies over the decades since 1949, and they appear to believe that the situation of food safety has improved in the most recent period. They also note, however, that much remains to be done:

With the enactment of the 2015 FSL, China developed and reinforced various regulatory tools. However, there are areas of the law and regulation that need further work, such as effective coordination among government agencies, a focus on appropriate risk communication, facilitating social governance and responsibility, nurturing a food safety culture from bottom-up, and assisting farmers at the primary level (Roberts & Lin, 2016). (4131)

These areas for future improvement are fundamental for establishing a secure and effective safety regime -- whether in the area of food safety or in other areas of environmental and industrial safety. And to these we may add several more important factors that are currently missing: independence of regulatory agencies from government direction and industry capture; freedom of information that permits the public to be well informed about incidents when they occur; and an enforcement system that reliably deters and remedies bad performance and process inadequacies.


Friday, October 15, 2021

Fire safety in urban China


A rapidly rising percentage of the Chinese population is living in high-rise apartment buildings in hundreds of cities around the country. There is concern, however, about the quality and effectiveness of fire-safety regulation and enforcement for these buildings (as well as factories, warehouses, ports, and other structures). This means that high-rise fires represent a growing risk in urban China. Here is a news commentary from CGTN (link) describing a particularly tragic 2010 high-rise fire that engulfed a 28-story building in Shanghai, killing 58 people. This piece serves to identify the parameters of the problem of fire safety more generally.

It is of course true that high-rise fires have occurred in many cities around the world, including the notorious Grenfell Tower disaster in 2017. And many of those fires also reflect underlying problems of safety regulation in the jurisdictions in which they occurred. But the problems underlying infrastructure safety seem to converge with particular seriousness in urban China. And, crucially, major fire disasters in other countries are carefully scrutinized in public reports, providing accurate and detailed information about the causes of the disaster. This scrutiny creates the political incentive to improve building codes, inspection regimes, and enforcement mechanisms of safety regulations. This open and public scrutiny is not permitted in China today, leaving the public largely ignorant of the background causes of fires, railway crashes, and other large accidents.

It is axiomatic that modern buildings require effective and professionally grounded building codes and construction requirements, adequate fire safety system requirements, and rigorous inspection and enforcement regimes that ensure a high level of compliance with fire safety regulations. Regrettably, it appears that no part of this prescription for fire safety is well developed in China.

The CGTN article mentioned above refers to the "effective" high-level fire safety legislation that the central government adopted in 1998, the Fire Control Law of the People's Republic of China (link), and this legislation warrants close study. However, close examination suggests that this guiding legislation lacks crucial elements that are needed in order to ensure compliance with safety regulations -- especially when compliance is highly costly for the owners/managers of buildings and other facilities. Previous disasters in China suggest a pattern: poor inspection and enforcement prior to an accident or fire, followed in the aftermath by prosecution and punishment of the individuals involved. But this is not an effective mechanism for ensuring safety. Owners, managers, and officials are more than ready to run the small risk of future prosecution for the sake of savings in the present costs of operating their facilities.

The systemic factors that act against fire safety in China include at least these pervasive social and political conditions: ineffective and corrupt inspection offices, powerful property managers who are able to ignore safety violations, pressure from the central government to avoid interfering with rapid economic growth, government secrecy about disasters when they occur, and lack of independent journalism capable of freely gathering and publishing information about disasters.

In particular, the fact that the news media (and now social media as well) are tightly controlled in China is a very serious obstacle to improving safety when it comes to accidents, explosions, train wrecks, and fires. The Chinese news media do not publish detailed accounts of disasters as they occur, and they usually are unable to carry out the investigative journalism needed to uncover the background conditions that have created the circumstances in which these catastrophes arise (ineffective or corrupt inspection regimes; enforcement agencies that are hampered in their work by the political requirements of the state; corrupt practices by private owners/managers of high-rise properties, factories, and ports; and so on). It is only when the public can become aware of the deficiencies in government and business that have led to a disaster that reforms can be designed and implemented to make those disasters less likely in the future. But the lack of independent journalism leaves the public in the dark about these important details of their contemporary lives.

The story cited above is from CGTN, a Chinese news agency, and it is unusual for its honesty in addressing some of the deficiencies of safety management and regulation in Shanghai. CGTN is an English-language Chinese news service, owned and operated by the Chinese state-owned media organization China Central Television (CCTV). As such it is under full editorial control by offices of the Chinese central government. And the government is rarely willing to permit open and honest reporting of major disasters and the organizational, governmental, and private dysfunctions that led to them. It is noteworthy, therefore, that the story is somewhat explicit about the dysfunctions and corruption that led to the Shanghai disaster. The article quotes a piece in China Daily (owned by the publicity department of the CCP) that refers to poor enforcement and corruption:

However, a 2015 article by China Daily called for the Fire Control Law to be more strictly enforced, saying that the Chinese public now “gradually takes it for granted that when a big fire happens there must be a heavy loss of life.”

While saying “China has a good fire protection law,” the newspaper warned that it was frequently violated, with fire engine access blocked by private cars, escape routes often blocked and flammable materials still being “widely used in high buildings.”

The article also pointed at corruption within fire departments, saying inspections have “become a cash cow,” with businesses and construction companies paying bribes in return for lax safety standards being ignored.

So -- weak inspections, poor compliance with regulations, and corruption. Both the CGTN report and the China Daily story it quotes are reasonably explicit about unpalatable truths. But note -- the CGTN story was prepared for an English-speaking audience, and is not available to ordinary Chinese readers in China. And this appears to be the case for the China Daily article that was quoted as well. And most importantly -- the political climate surrounding the journalistic practices of China Daily has tightened very significantly since 2015.

Another major institutional obstacle to safety in China is the lack of genuinely independent regulatory safety agencies. The 1998 Fire Control Law of the People's Republic of China is indicative. The legislation refers to the responsibility of local authorities (provincial, municipal) to establish fire safety organizations; but it is silent about the nature, resources, and independence of inspection authorities. Here is the language of the first several articles of the Fire Control Law:

Article 2 Fire control work shall follow the policy of devoting major efforts into prevention and combining fire prevention with fire fighting, and shall adhere to the principle of combining the efforts of both specialized organizations and the masses and carry out responsibility system on fire prevention and safety.

Note that this article immediately creates a confusion of responsibility concerning the detailed tasks of establishing fire safety: both "specialized organizations" and "the masses" are assigned responsibility.

Article 3 The State Council shall lead and the people's governments at all levels be responsible for fire control work. The people's government at all levels shall bring fire control work in line with the national economy and social development plan, and ensure that fire control work fit in with the economic construction and social development.

Here too is a harmful diffusion of responsibility: "the people's governments at all levels [shall] be responsible ...". In addition, a new priority is introduced: consistency with the "national economy and social development plan". This implies that fire safety regulations and agencies at the provincial and municipal level must balance economic needs against the needs of ensuring safety -- a potentially fatal division of priorities. If substituting non-flammable cladding on an 80-story residential building will add one billion yuan to the total cost of the building -- does this requirement impede the "national economy and development plan"? Can the owner/managers resist the new regulation on the grounds that it is too costly?

Article 4 The public security department of the State Council shall monitor and administer the nationwide fire control work; the public security organs of local people's governments above county level shall monitor and administer the fire control work within their administrative region and the fire control institutions of public security organs of the people's government at the same level shall be responsible for the implementation. Fire control work for military facilities, underground parts of mines and nuclear power plant shall be monitored and administered by their competent units. For fire control work on forest and grassland, in case there are separate regulations, the separate regulations shall be followed.

Here we find specific institutional details about oversight of "nationwide fire control work": it is the public security organs that are tasked to "monitor and administer" fire control institutions. Plainly, the public security organs have no independence from the political authorities at provincial and national levels; so their conduct is suspect when it comes to the task of "independent, rigorous enforcement of safety regulations".

Article 5 Any unit and individual shall have the obligation of keeping fire control safety, protecting fire control facilities, preventing fire disaster and reporting fire alarm. Any unit and adult shall have the obligation to take part in organized fire fighting work.

Here we are back to the theme of diffusion of responsibility. "Any unit and individual shall have the obligation of keeping fire control safety" -- this statement implies that there should not be free-standing, independent, and well-resourced agencies dedicated to ensuring compliance with fire codes, conducting inspections, and enforcing compliance by reluctant owners.

It seems, then, that the 1998 Fire Control Law is largely lacking in what should have been its primary purpose: specification of the priority of fire safety; establishment of independent safety agencies at various levels of government, with independent powers of enforcement and adequate resources to carry out their fire-safety missions; and a clear statement that there should be no interference with the proper inspection and enforcement activities of these agencies -- whether by other organs of government or by large owner/operators.

The 1998 Fire Control Law was extended in 2009, and a chapter was added entitled "Supervision and Inspection". Clauses in this chapter offer somewhat greater specificity about inspections and enforcement of fire-safety regulation. Departments of local and regional government are charged to "conduct targeted fire safety inspections" and "promptly urge the rectification of hidden fire hazards" (Article 52). (Notice that the verb "urge" is used rather than "require".) Article 53 specifies that the police station (public security) is responsible for "supervising and inspecting the compliance of fire protection laws and regulations". Article 54 addresses the issue of possible discovery of "hidden fire hazards" during fire inspection; this requires notification of the responsible unit of the necessity of eliminating the hazard. Article 55 specifies that if a fire safety agency discovers that fire protection facilities do not meet safety requirements, it must report to the emergency management department of higher-level government in writing. Article 56 provides specifications aimed at preventing corrupt collaboration between fire departments and units: "Fire rescue agencies ... shall not charge fees, shall not use their positions to seek benefits". And, finally, Article 57 specifies that "all units and individuals have the right to report and sue the illegal activities of the authorities" if necessary. Notice, however, that, first, all of this inspection and enforcement activity occurs within a network of offices and departments dependent ultimately on central government; and second, the legislation remains very unspecific about how this set of expectations about regulation, inspection, and enforcement is to be implemented at the local and provincial levels. There is nothing in this chapter that gives the observer confidence that effective regulations will be written; effective inspection processes will be carried out; and failed inspections will lead to prompt remediation of hazardous conditions.

The Tianjin port explosion in 2015 is a case in point (link, link). Poor regulations, inadequate and ineffective inspections, corruption, and bad behavior by large private and governmental actors culminated in a gigantic pair of explosions of 800 tons of ammonium nitrate. This was one of the worst industrial and environmental disasters in China's recent history, and resulted in the loss of 173 lives, including 104 poorly equipped fire fighters. Prosecutions ensued after the disaster, including the conviction and suspended death sentence of Ruihai International Logistics Chairman Yu Xuewei for bribery, and the conviction of 48 other individuals for a variety of crimes (link). But punishment after the fact is no substitute for effective, prompt inspection and enforcement of safety requirements.

It is not difficult to identify the organizational dysfunctions in China that make fire safety, railway safety, food safety, and perhaps nuclear safety difficult to attain. What is genuinely difficult is to see how these dysfunctions can be corrected in a single-party state. Censorship, subordination of all agencies to central control, the omnipresence of temptations to corrupt cooperation -- all of these factors seem to be systemic within a one-party state. The party state wants to control public opinion; therefore censorship. The party state wants to control all political units; therefore a lack of independence for safety agencies. And positions of decision-making that create lucrative "rent-seeking" opportunities for office holders -- therefore corruption, from small payments to local inspectors to massive gifts of wealth to senior officials. A pluralistic, liberal society embodying multiple centers of power and freedom of press and association is almost surely a safer society. Ironically, this was essentially Amartya Sen's argument in Poverty and Famines: An Essay on Entitlement and Deprivation, his classic analysis of famine and malnutrition: a society embodying a free press and reasonably free political institutions is much more likely to respond quickly to conditions of famine. His comparison was between India in the Bengal famine (1943) and China in the Great Leap Forward famine (1959-61).

Here is a Google translation of Chapter V of the 2009 revision of the Fire Protection Law of the People's Republic of China mentioned above.

Chapter V Supervision and Inspection

Article 52 Local people's governments at all levels shall implement a fire protection responsibility system and supervise and inspect the performance of fire safety duties by relevant departments of the people's government at the same level.

The relevant departments of the local people's government at or above the county level shall, based on the characteristics of the system, conduct targeted fire safety inspections, and promptly urge the rectification of hidden fire hazards.

Article 53 Fire and rescue agencies shall supervise and inspect the compliance of fire protection laws and regulations by agencies, organizations, enterprises, institutions and other entities in accordance with the law. The police station may be responsible for daily fire control supervision and inspection, and conduct fire protection publicity and education. The specific measures shall be formulated by the public security department of the State Council.

The staff of fire rescue agencies and public security police stations shall present their certificates when conducting fire supervision and inspection.

Article 54: Fire rescue agencies that discover hidden fire hazards during fire supervision and inspection shall notify the relevant units or individuals to take immediate measures to eliminate the hidden hazards; if the hidden hazards are not eliminated in time and may seriously threaten public safety, the fire rescue agency shall, in accordance with regulations, take temporary sealing measures against the dangerous parts or places.

Article 55: If the fire rescue agency discovers during fire supervision and inspection that the urban and rural fire safety layout and public fire protection facilities do not meet fire safety requirements, or finds that there is a major fire hazard affecting public safety in the area, it shall report in writing to the emergency management department of the people's government at that level.

The people's government that receives the report shall verify the situation in a timely manner, organize or instruct relevant departments and units to take measures to make corrections.

Article 56 The competent department of housing and urban-rural construction, fire rescue agencies and their staff shall conduct fire protection design review, fire protection acceptance, random inspections and fire safety inspections in accordance with statutory powers and procedures, so as to be fair, strict, civilized and efficient.

Housing and urban-rural construction authorities, fire rescue agencies, and their staff, in conducting fire protection design review, fire inspection and acceptance, record and spot checks, and fire safety inspections, shall not charge fees and shall not use their positions to seek benefits; nor shall they use their positions to designate, directly or in disguised form, the brands or sales units of fire-fighting products, or the fire-fighting technical service organizations or construction units, for users or construction units.

Article 57 The competent housing and urban-rural construction departments, fire and rescue agencies and their staff, in performing their duties, shall consciously accept the supervision of society and citizens.

All units and individuals have the right to report and sue the illegal activities of the housing and urban-rural construction authorities, fire and rescue agencies and their staff in law enforcement. The agency that receives the report or accusation shall investigate and deal with it in a timely manner in accordance with its duties.

*    *    *    *    *

(Here is a detailed technical fire code for China from 2014 (link).)


Saturday, December 7, 2019

Why do regulatory organizations fail?


Why is Charles Perrow a pessimist about government regulation?

Perrow is a leading researcher in the sociology of organizations, and he is a singular expert on accidents and failures. Several of his books are classics in their field -- Normal Accidents: Living with High-Risk Technologies, The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters, Organizing America: Wealth, Power, and the Origins of Corporate Capitalism. So why is he so gloomy about the ability of governmental organizations to protect the public from large failures and disasters of various kinds -- hurricanes, floods, chemical plant fires, software failures, terrorism? He is not a relentless critic of organizations such as the EPA, the Department of Justice, or the Food and Drug Administration, but his assessment of their capacity for success is dismal.
We should not expect too much of organizations, but the DHS is extreme in its dysfunctions. As with all organizations, the DHS has been used by its masters and outsiders for purposes that are beyond its mandate, and the usage of the DHS has been extreme. One major user of the DHS is Congress. While Congress is the arm of the government that is closest to the people, it is also the one that is most influenced by corporations and local interest groups that do not have the interests of the larger community in mind. (The Next Catastrophe, kl 205)
I don't think that Perrow's views derive from the general skeptical view that organizations never succeed in accomplishing the functions we assign to them -- hospitals, police departments, labor unions, universities, public health departments. And in fact his important book Complex Organizations: A Critical Essay provided a constructive description of the field of organizational studies when it appeared in 1972 and was updated in 2014 (link).

Instead, there seem to be particular reasons why large governmental organizations designed to protect the public are likely to fail, in Perrow's assessment. It is organizations designed to regulate risky activities, and those charged with creating prudent long-term plans for the future, that seem particularly vulnerable in his account. So what are the reasons for failure in these kinds of organizations?

FEMA is faulted, for example, because of its failure to adequately plan for and provide emergency relief to the people of New Orleans and other parts of the Gulf region from the effects of Hurricane Katrina. Poor planning, incompetent executives at the top, politicized directions coming from the White House, poor coordination across sub-units, and poor internal controls eventually resulted in a historic failure. These are fairly routine organizational failures that could happen within the United Parcel Service corporate headquarters as easily as in Washington.

The Nuclear Regulatory Commission is faulted for its oversight of safety in nuclear plants, including Three Mile Island, Davis-Besse, and Shoreham. Key organizational faults include regulatory capture by owners and the nuclear industry, excessive dependence on specific key legislators, commissioners who are politically beholden, and insufficient personnel to carry out intensive inspection regimes.

Perrow's key ideas about failures in the industrial systems themselves seem not to be central in his negative assessment of government regulatory organizations. The features of "complex systems" and "tightly coupled processes" that are so central to his theory of normal accidents in industrial systems like nuclear power plants play only incidental roles in his analysis of regulatory failure. Agencies are neither complex nor tightly coupled in the way a petroleum processing plant is. In fact, an outside observer might hypothesize that a somewhat more tightly coupled system in the NRC or the EPA (a more direct connection among the scientists, engineering experts, inspectors, and commissioners) might actually improve performance.

Instead, his analysis of regulatory failure depends on a different set of axes: interests, influence, and power. Regulatory agencies fail, in Perrow's accounts, when their top administrators have bureaucratic interests and dependencies that diverge from the mission of safety, when powerful outsiders and owners have the capacity to influence rules, policies, and implementation, and when political and economic power is deployed to protect the interests of powerful actors. (All these defects are apparent in Trump administration appointments to federal agencies with regulatory responsibilities.)

Interestingly, these factors have also played a central role in his sociological thinking about the emergence of the twentieth-century corporation; he views corporations as vehicles for the concentration of power:
Our economic organizations -- business and industry -- concentrate wealth and power; socialize employees and customers alike to meet their needs; and pass off to the rest of society the cost of their pollution, crowding, accidents, and encouragement of destructive life styles. In the vaunted "free market" economy of the United States, regulation of business and industry to prevent or mitigate this market failure is relatively ineffective, as compared to that enacted by other industrialized countries. (Organizing America, 1-2)
So the primary foundation of Perrow's assessment of the likelihood of organizational failure when it comes to government regulation derives from the role that economic and political power plays in deforming the operations of major government organizations to serve the interests of the powerful. Regulatory agencies are "captured" by the powerful industries they are supposed to oversee, whether through influence on the executive branch or through merciless lobbying of the legislative branch. Commissioners are often very sympathetic to the business needs of the sector they regulate, and strive to avoid "undue regulatory burden".

This leads us to a fascinating question: is there a powerful constituency for safety that could be a counterweight to corporate power and a bulwark for honest, scientifically guided regulatory regimes? Is a more level playing field between economic interests and the public's interests in effective safety regulation possible?

We may want to invoke the public at large, and it is true that public opinion sometimes effectively demands government intervention for safety. But the public is generally limited in several important ways. Only a small set of issues manage to become salient for the public. Further, issues only remain salient for a limited period of time. And the salience of an issue is often geographically and demographically bounded. There was intense opposition to the Shoreham nuclear plant siting decision on Long Island, but the public in Chicago and Dallas did not mobilize around the issue. Sometimes vocal public opinion prevails, but much more common is the scenario where public interest wanes and profit-motivated corporate interests persist. (Pepper Culpepper lays out the logic of salience and unequal power between a diffuse public and a concentrated corporate interest in Quiet Politics and Business Power: Corporate Control in Europe and Japan.)

Other pertinent voices for safety are public interest organizations -- the Union of Concerned Scientists, Friends of the Earth, the Bulletin of the Atomic Scientists. Organizations like these have succeeded in creating a national base of support, they have drawn resources in support of their efforts, and they have a greater organizational capacity to persist over an extended period of time. (In another field of advocacy, organizations like the Anti-Defamation League and the Southern Poverty Law Center have succeeded in maintaining organizational focus on the dangers of hate-based movements.) So public interest organizations sometimes have the capacity and staying power to advocate for stronger regulation.

Investigative journalism and a free press are also highly relevant in exposing regulatory failures and enhancing performance of safety organizations. The New York Times and Washington Post coverage of the FAA's role in certification of the 737 Max will almost certainly lead to improvements in this area of aircraft safety. (Significantly, when I made this statement concerning the link between industrial safety in China and a free press, I was told that "this is a sensitive subject in China.")

(These examples are drawn from the national level of government. Sometimes local governments -- e.g. police departments and zoning boards -- are captured as well, when organized crime "firms" and land developers are able to distort regulations and enforcement in their favor. But it may be that organizations at this level of government are a bit more visible to their publics, and therefore somewhat less likely to bend to the dictates of powerful local interests. Jessica Trounstine addresses these kinds of issues in Political Monopolies in American Cities: The Rise and Fall of Bosses and Reformers (link).)

Monday, October 28, 2019

Regulatory delegation at the FAA


Earlier posts have focused on the role of inadequate regulatory oversight as part of the tragedy of the Boeing 737 MAX (link, link). (Also of interest is an earlier discussion of the "quiet power" through which business achieves its goals in legislation and agency rules (link).) Reporting in the New York Times this week by Natalie Kitroeff and David Gelles provides a smoking gun for the claim that the industry has captured the regulatory agency established to ensure its safe operations (link). The article quotes a former attorney in the FAA office of chief counsel:
“The reauthorization act mandated regulatory capture,” said Doug Anderson, a former attorney in the agency’s office of chief counsel who reviewed the legislation. “It set the F.A.A. up for being totally deferential to the industry.”
Based on exhaustive investigative journalism, Kitroeff and Gelles provide a detailed account of the lobbying strategy and efforts by Boeing and the aircraft manufacturing industry group that led to the incorporation of industry-favored language into the FAA Reauthorization Act of 2018, and it is a profoundly discouraging account for anyone interested in the idea that the public good should drive legislation. The new paragraphs introduced into the final legislation stipulate full implementation of the philosophy of regulatory delegation and establish an industry-centered group empowered to oversee the agency's performance and to make recommendations about FAA employees' compensation. "Now, the agency, at the outset of the development process, has to hand over responsibility for certifying almost every aspect of new planes." Under the new legislation the FAA is forbidden from taking back control of the certification process for a new aircraft without a full investigation or inspection justifying such an action.

As the article notes, the 737 MAX was certified under the old rules. The new rules give the FAA even less oversight powers and responsibilities for the certification of new aircraft and major redesigns of existing aircraft. And the fact that the MCAS system was never fully reviewed by the FAA, based on assurances of its safety from Boeing, reduces even further our confidence in the effectiveness of the FAA process. From the article:
The F.A.A. never fully analyzed the automated system known as MCAS, while Boeing played down its risks. Late in the plane’s development, Boeing made the system more aggressive, changes that were not submitted in a safety assessment to the agency.
Boeing, the Aerospace Industries Association, and the General Aviation Manufacturers Association exercised influence on the 2018 legislation through a variety of mechanisms. Legislators and lobbyists alike were guided by a report on regulation authored by Boeing itself. Executives and lobbyists exercised their ability to influence powerful senators and members of Congress through person-to-person interactions. And elected representatives from both parties favored "less regulation" as a way of supporting the economic interests of businesses in their states. For example:
They also helped persuade Senator Maria Cantwell, Democrat of Washington State, where Boeing has its manufacturing hub, to introduce language that requires the F.A.A. to relinquish control of many parts of the certification process.
And, of course, it is important not to forget about the "revolving door" from industry to government to lobbying firm. Ali Bahrami was an FAA official who subsequently became a lobbyist for the aerospace industry; Stephen Dickson is a former executive of Delta Airlines who now serves as Administrator of the FAA; and in 2007 former FAA Administrator Marion Blakey became CEO of the Aerospace Industries Association, the industry's chief advocacy and lobbying group (link). It is hard to envision neutral, objective judgment in ensuring the safety of the public from such appointments.
Boeing and its allies found a receptive audience in the head of the House transportation committee, Bill Shuster, a Pennsylvania Republican staunchly in favor of deregulation, and his aide working on the legislation, Holly Woodruff Lyons.
These kinds of influence on legislation and agency action provide crystal-clear illustrations of the mechanisms that Pepper Culpepper cites in Quiet Politics and Business Power: Corporate Control in Europe and Japan to explain the political influence of business. Here is my description of his views in an earlier post:
Culpepper unpacks the political advantage residing with business elites and managers in terms of acknowledged expertise about the intricacies of corporate organization, an ability to frame the issues for policy makers and journalists, and ready access to rule-writing committees and task forces. These factors give elite business managers positional advantage, from which they can exert a great deal of influence on how an issue is formulated when it comes into the forum of public policy formation.
It seems abundantly clear that the "regulatory delegation" movement and its underlying effort to reduce the regulatory burden on industry have gone too far in the case of aviation; and the same seems true in other industries such as the nuclear industry. The much harder question is organizational: what form of regulatory oversight would permit a regulatory agency to genuinely enhance the safety of the regulated industry and protect the public from unnecessary hazards? Even if we could take the anti-regulation ideology that has governed much public discourse since the Reagan years out of the picture, there are the continuing issues of expertise, funding, and industry power of resistance that make effective regulation a huge challenge.

Monday, August 12, 2019

Testing the NRC


Serious nuclear accidents are rare but potentially devastating to people, land, and agriculture. (It appears that minor to moderate nuclear accidents are not nearly so rare, as James Mahaffey shows in Atomic Accidents: A History of Nuclear Meltdowns and Disasters: From the Ozark Mountains to Fukushima.) Three Mile Island, Chernobyl, and Fukushima are disasters that have given the public a better idea of how nuclear power reactors can go wrong, with serious and long-lasting effects. Reactors are also among the most complex industrial systems around, and accidents are common in complex, tightly coupled industrial systems. So how can we have reasonable confidence in the safety of nuclear reactors?

One possible answer is that we cannot have reasonable confidence at all. However, there are hundreds of large nuclear reactors in the world, and 98 active nuclear reactors in the United States alone. So it is critical to have highly effective safety regulation and oversight of the nuclear power industry. In the United States that regulatory authority rests with the Nuclear Regulatory Commission. So we need to ask the question: how good is the NRC at regulating, inspecting, and overseeing the safety of nuclear reactors in our country?

One would suppose that there would be excellent and detailed studies within the public administration literature that attempt to answer this question, and we might expect that researchers within the field of science and technology studies might have addressed it as well. However, this seems not to be the case. I have yet to find a full-length study of the NRC as a regulatory agency, and the NRC is mentioned only twice in the 600-plus page Oxford Handbook of Regulation. However, we can get an oblique view of the workings of the NRC through other sources. One set of observers who are in a position to evaluate the strengths and weaknesses of the NRC are nuclear experts who are independent of the nuclear industry. For example, publications from the Bulletin of the Atomic Scientists include many detailed reports on the operations and malfunctions of nuclear power plants that permit a degree of assessment of the quality of oversight provided by the NRC (link). And a detailed (and scathing) report by the General Accounting Office on the near-disaster at the Davis-Besse nuclear power plant is another expert assessment of NRC functioning (link).

David Lochbaum, Edwin Lyman, and Susan Stranahan fit the description of highly qualified independent scientists and observers, and their detailed case history of the Fukushima disaster provides a degree of insight into the workings of the NRC as well as the Japanese nuclear safety agency. Their book, Fukushima: The Story of a Nuclear Disaster, is jointly written by the authors under the auspices of the Union of Concerned Scientists, one of the best informed networks of nuclear experts we have in the United States. Lochbaum is director of the UCS Nuclear Safety Project and author of Nuclear Waste Disposal Crisis. The book provides a careful and scientific treatment of the unfolding of the Fukushima disaster hour by hour, and highlights the background errors that were made by regulators and owners in the design and operation of the Fukushima plant as well. The book makes numerous comparisons to the current workings of the NRC which permit a degree of assessment of the US regulatory agency.

In brief, Lochbaum and his co-authors appear to have a reasonably high opinion of the technical staff, scientists, and advisors who prepare recommendations for NRC consideration, but a low opinion of the willingness of the five commissioners to adopt costly recommendations that are strongly opposed by the nuclear industry. The authors express frustration that the nuclear safety agencies in both countries appear to have failed to have learned important lessons from the Fukushima disaster:
“The [Japanese] government simply seems in denial about the very real potential for another catastrophic accident.... In the United States, the NRC has also continued operating in denial mode. It turned down a petition requesting that it expand emergency evacuation planning to twenty-five miles from nuclear reactors despite the evidence at Fukushima that dangerous levels of radiation can extend at least that far if a meltdown occurs. It decided to do nothing about the risk of fire at over-stuffed spent fuel pools. And it rejected the main recommendation of its own Near-Term Task Force to revise its regulatory framework. The NRC and the industry instead are relying on the flawed FLEX program as a panacea for any and all safety vulnerabilities that go beyond the “design basis.” (kl 117)
They believe that the NRC is excessively vulnerable to influence by the nuclear power industry and to elected officials who favor economic growth over hypothetical safety concerns, with the result that it tends to err in favor of the economic interests of the industry.
Like many regulatory agencies, the NRC occupies uneasy ground between the need to guard public safety and the pressure from the industry it regulates to get off its back. When push comes to shove in that balancing act, the nuclear industry knows it can count on a sympathetic hearing in Congress; with millions of customers, the nation’s nuclear utilities are an influential lobbying group. (36)
They note that the NRC has consistently declined to undertake more substantial reform of its approach to safety, as recommended by its own panel of experts. The key recommendation of the Near-Term Task Force (NTTF) was that the regulatory framework should be anchored in a more stringent standard of accident prevention, requiring plant owners to address "beyond-design-basis accidents". The Fukushima earthquake and tsunami were "beyond-design-basis" events; nonetheless, they occurred, and the NTTF recommended that safety planning should incorporate consideration of such unlikely but possible events.
The task force members believed that once the first proposal was implemented, establishing a well-defined framework for decision making, their other recommendations would fall neatly into place. Absent that implementation, each recommendation would become bogged down as equipment quality specifications, maintenance requirements, and training protocols got hashed out on a case-by-case basis. But when the majority of the commissioners directed the staff in 2011 to postpone addressing the first recommendation and focus on the remaining recommendations, the game was lost even before the opening kickoff. The NTTF’s Recommendation 1 was akin to the severe accident rulemaking effort scuttled nearly three decades earlier, when the NRC considered expanding the scope of its regulations to address beyond-design accidents. Then, as now, the perceived need for regulatory “discipline,” as well as industry opposition to an expansion of the NRC’s enforcement powers, limited the scope of reform. The commission seemed to be ignoring a major lesson of Fukushima Daiichi: namely, that the “fighting the last war” approach taken after Three Mile Island was simply not good enough. (kl 253)
As a result, "regulatory discipline" (essentially the pro-business ideology that holds that regulation should be kept to a minimum) prevailed, and the primary recommendation was tabled. The issue was of great importance, in that it involved setting the standard of risk and accident severity for which the owner needed to plan. By staying with the lower standard, the NRC left the door open to the most severe kinds of accidents.

The NTTF also addressed the issue of "delegated regulation," in which the agency defers to the industry on many issues of certification and risk assessment. (Here is the FAA's definition of delegated regulation; link.)
The task force also wanted the NRC to reduce its reliance on industry voluntary initiatives, which were largely outside of regulatory control, and instead develop its own “strong program for dealing with the unexpected, including severe accidents.” (252)
Other, more specific recommendations were rejected as well -- for example, a requirement that boiling water reactors be equipped with reliable hardened containment vents incorporating filters to remove radioactive gas before venting.
But what might seem a simple, logical decision—install a $15 million filter to reduce the chance of tens of billions of dollars’ worth of land contamination as well as harm to the public—got complicated. The nuclear industry launched a campaign to persuade the NRC commissioners that filters weren’t necessary. A key part of the industry’s argument was that plant owners could reduce radioactive releases more effectively by using FLEX equipment.... In March 2013, they voted 3–2 to delay a requirement that filters be installed, and recommended that the staff consider other alternatives to prevent the release of radiation during an accident. (254)
The NRC voted against requiring filters on containment vents, a decision based on industry arguments that the filters were unnecessary and their cost excessive.
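The logic the authors invoke here is a simple expected-value comparison, which can be made explicit. The sketch below is a back-of-the-envelope illustration only: the $15 million filter cost and the "tens of billions" damage scale come from the passage above, while the accident probability and plant lifetime are hypothetical numbers chosen for illustration.

```python
# Back-of-the-envelope expected-value comparison for the containment-vent filter.
# Only the $15 million filter cost and the "tens of billions" damage scale come
# from the passage above; the accident probability and plant lifetime are
# hypothetical numbers chosen purely for illustration.

filter_cost = 15e6             # installed filter cost per reactor (from the text)
damage_if_release = 30e9       # assumed midpoint of "tens of billions" in damages
p_release_per_year = 1e-4      # hypothetical annual probability of an unfiltered severe release
remaining_plant_life = 40      # assumed remaining operating years

expected_avoided_damage = p_release_per_year * remaining_plant_life * damage_if_release

print(f"Expected avoided damage over plant life: ${expected_avoided_damage / 1e6:.0f} million")
print(f"Filter cost:                             ${filter_cost / 1e6:.0f} million")
# With these (illustrative) assumptions the expected benefit (~$120 million) far
# exceeds the $15 million cost -- the intuition behind calling the filter a
# "simple, logical decision."
```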

The authors argue that the NRC needs to significantly rethink its standards of safety and foreseeable risk.
What is needed is a new, commonsense approach to safety, one that realistically weighs risks and counterbalances them with proven, not theoretical, safety requirements. The NRC must protect against severe accidents, not merely pretend they cannot occur. (257)
Their recommendation is to make use of an existing and rigorous plan for reactor safety incorporating the results of "severe accident mitigation alternatives" (SAMA) analysis already performed -- but largely disregarded.

However, they are not optimistic that the NRC will be willing to undertake these substantial changes that would significantly enhance safety and make a Fukushima-scale disaster less likely. Reporting on a post-Fukushima conference sponsored by the NRC, they write:
But by now it was apparent that little sentiment existed within the NRC for major changes, including those urged by the commission’s own Near-Term Task Force to expand the realm of “adequate protection.”
Lochbaum and his co-authors also make an intriguing series of points about the use of modeling and simulation in the effort to evaluate safety in nuclear plants. They agree that simulation methods are an essential part of the toolkit for nuclear engineers seeking to evaluate accident scenarios; but they argue that the simulation tools currently available (or perhaps ever available) fall far short of the precision often attributed to them. As a result, simulation tools can give a false sense of confidence in the existing safety arrangements in a particular setting.
Even so, the computer simulations could not reproduce numerous important aspects of the accidents. And in many cases, different computer codes gave different results. Sometimes the same code gave different results depending on who was using it. The inability of these state-of-the-art modeling codes to explain even some of the basic elements of the accident revealed their inherent weaknesses—and the hazards of putting too much faith in them. (263)
In addition to specific observations about the functioning of the NRC, the authors identify chronic failures in the Japanese nuclear power system that should be of concern in the United States as well. Conflict of interest, falsification of records, and punishment of whistleblowers were part of the culture of nuclear power and nuclear regulation in Japan, and these problems can arise in the United States too. Here are examples of the problems they identify in the Japanese nuclear power system; it is a valuable exercise to ask whether the same issues arise in the US regulatory environment.

Non-compliance and falsification of records in Japan
Headlines scattered over the decades built a disturbing picture. Reactor owners falsified reports. Regulators failed to scrutinize safety claims. Nuclear boosters dominated safety panels. Rules were buried for years in endless committee reviews. “Independent” experts were financially beholden to the nuclear industry for jobs or research funding. “Public” meetings were padded with industry shills posing as ordinary citizens. Between 2005 and 2009, as local officials sponsored a series of meetings to gauge constituents’ views on nuclear power development in their communities, NISA encouraged the operators of five nuclear plants to send employees to the sessions, posing as members of the public, to sing the praises of nuclear technology. (46)
The authors do not provide evidence about similar practices in the United States, though the history of the Davis-Besse nuclear plant in Ohio suggests that similar things happen in the US industry. Charles Perrow treats the Davis-Besse near-disaster in a fair amount of detail; link. Descriptions of the Davis-Besse incident can be found here, here, here, and here.
Conflict of interest
Shortly after the Fukushima accident, Japan’s Yomiuri Shimbun reported that thirteen former officials of government agencies that regulate energy companies were currently working for TEPCO or other power firms. Another practice, known as amaagari, “ascent to heaven,” spins the revolving door in the opposite direction. Here, the nuclear industry sends retired nuclear utility officials to government agencies overseeing the nuclear industry. Again, ferreting out safety problems is not a high priority.
Punishment of whistle-blowers
In 2000, Kei Sugaoka, a nuclear inspector working for GE at Fukushima Daiichi, noticed a crack in a reactor’s steam dryer, which extracts excess moisture to prevent harm to the turbine. TEPCO directed Sugaoka to cover up the evidence. Eventually, Sugaoka notified government regulators of the problem. They ordered TEPCO to handle the matter on its own. Sugaoka was fired. (47)
There is a similar story in the Davis-Besse plant history.

Factors that interfere with effective regulation

In summary: there appear to be several structural factors that make nuclear regulation less effective than it needs to be.

First is the political power and influence of the nuclear industry itself. This was a major factor in the background of the Chernobyl disaster as well, where generals and party officials pushed incessantly for rapid completion of reactors (Serhii Plokhy, Chernobyl: The History of a Nuclear Catastrophe). Lochbaum and his collaborators demonstrate the power that TEPCO had in shaping the regulations under which it built the Fukushima complex, including the assumptions that were incorporated about earthquake and tsunami risk. Charles Perrow documents a comparable ability of the US nuclear industry to shape the rules and procedures that govern its use of nuclear power (link). This influence extends to the content of regulation as well as to the systems of inspection and oversight that the agency adopts.

A related factor is the set of pressures that come from the needs of the economy and the production demands of the energy industry. (Interestingly enough, this was also a major influence on Soviet decision-making in choosing the graphite-moderated, light-water-cooled reactor design used at Chernobyl and numerous other plants in the 1960s; Serhii Plokhy, Chernobyl: The History of a Nuclear Catastrophe.)

Third is the fact emphasized by Charles Perrow that the NRC is primarily governed by Congress, and legislators are themselves vulnerable to the pressures and blandishments of the industry and demands for a low-regulation business environment. This makes it difficult for the NRC to carry out its role as independent guarantor of the health and safety of the public. Here is Perrow's description of the problem in The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters (quoting Lochbaum from a 2004 Union of Concerned Scientists report):
With utilities profits falling when the NRC got tough after the Time story, the industry not only argued that excessive regulation was the problem, it did something about what it perceived as harassment. The industry used the Senate subcommittee that controls the agency’s budget, headed by a pro-nuclear Republican senator from New Mexico, Pete Domenici. Using the committee’s funds, he commissioned a special study by a consulting group that was used by the nuclear industry. It recommended cutting back on the agency’s budget and size. Using the consultant’s report, Domenici “declared that the NRC could get by just fine with a $90 million budget cut, 700 fewer employees, and a greatly reduced inspection effort.” (italics supplied) The beefed-up inspections ended soon after the threat of budget cuts for the agency. (Mangels 2003) And the possibility for public comment was also curtailed, just for good measure. Public participation in safety issues once was responsible for several important changes in NRC regulations, says David Lochbaum, a nuclear safety engineer with the Union of Concerned Scientists, but in 2004, the NRC, bowed to industry pressure and virtually eliminated public participation. (Lochbaum 2004) As Lochbaum told reporter Mangels, “The NRC is as good a regulator as Congress permits it to be. Right now, Congress doesn’t want a good regulator.”  (The Next Catastrophe, kl 2799)
A fourth important factor is a pervasive complacency within the professional nuclear community about the inherent safety of nuclear power, a point Lochbaum emphasizes:
Although the accident involved a failure of technology, even more worrisome was the role of the worldwide nuclear establishment: the close-knit culture that has championed nuclear energy—politically, economically, socially—while refusing to acknowledge and reduce the risks that accompany its operation. Time and again, warning signs were ignored and near misses with calamity written off. (kl 87)
This is what we might call an ideological or cultural factor, in that it describes a mental framework for thinking about the technology and the public. It is a very real factor in decision-making, both within the industry and in the regulatory world. Senior nuclear engineering experts at major research universities seem to share the view that the public "fear" of nuclear power is entirely misplaced, given the safety record of the industry. They believe the technical problems of nuclear power generation have been solved, and that a rational society would embrace nuclear power without anxiety. For a rebuttal of this complacency, see Rose and Sweeting's report in the Bulletin of the Atomic Scientists, "How safe is nuclear power? A statistical study suggests less than expected" (link). Here is the abstract of their paper:
After the Fukushima disaster, the authors analyzed all past core-melt accidents and estimated a failure rate of 1 per 3704 reactor years. This rate indicates that more than one such accident could occur somewhere in the world within the next decade. The authors also analyzed the role that learning from past accidents can play over time. This analysis showed few or no learning effects occurring, depending on the database used. Because the International Atomic Energy Agency (IAEA) has no publicly available list of nuclear accidents, the authors used data compiled by the Guardian newspaper and the energy researcher Benjamin Sovacool. The results suggest that there are likely to be more severe nuclear accidents than have been expected and support Charles Perrow’s “normal accidents” theory that nuclear power reactors cannot be operated without major accidents. However, a more detailed analysis of nuclear accident probabilities needs more transparency from the IAEA. Public support for nuclear power cannot currently be based on full knowledge simply because important information is not available.
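The claim that more than one such accident could occur somewhere in the world within the next decade follows from straightforward arithmetic on the estimated failure rate. Here is a minimal sketch of that reasoning: the core-melt rate of 1 per 3704 reactor-years comes from the abstract, while the worldwide reactor count is an assumption of mine (roughly 440 operating power reactors), not a figure from their paper.

```python
import math

# Sketch of the reasoning behind the abstract's claim. The core-melt rate of
# 1 per 3704 reactor-years is taken from Rose and Sweeting; the worldwide
# reactor count (~440) is my assumption, not a figure from their paper.

rate_per_reactor_year = 1 / 3704
reactors_worldwide = 440       # assumed number of operating power reactors
years = 10

expected_accidents = rate_per_reactor_year * reactors_worldwide * years
# Treating core-melt events as a Poisson process, the chance of at least one:
p_at_least_one = 1 - math.exp(-expected_accidents)

print(f"Expected core-melt accidents over {years} years: {expected_accidents:.2f}")
print(f"Probability of at least one such accident:      {p_at_least_one:.0%}")
```

With these assumptions the expected number of core-melt accidents over a decade is a little above one, which is consistent with the authors' statement.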
Lee Clarke's book on planning for disaster on the basis of unrealistic models and simulations is relevant here. In Mission Improbable: Using Fantasy Documents to Tame Disaster, Clarke argues that much of the planning currently in place for large-scale disasters depends upon models, simulations, and scenario-building tools in which we should have very little confidence.

The complacency about nuclear safety mentioned here makes safety regulation more difficult and, paradoxically, makes the safe use of nuclear power less likely. Only when the risks are confronted with complete transparency and honesty will it be possible to design regulatory systems that do an acceptable job of ensuring the safety and health of the public.

In short, Lochbaum and his co-authors seem to provide evidence for the conclusion that the NRC is not in a position to perform its primary function: to establish a rational and scientifically well-grounded set of standards for safe reactor design and operation. Its ability to enforce those standards through inspection is also impaired by the power and influence the nuclear industry can deploy through Congress to resist its regulatory efforts. Good expert knowledge is canvassed through the NRC's processes; but the policy recommendations that flow from this scientific analysis are all too often short-circuited by the industry's ability to fend off new regulatory requirements. Lochbaum's comment quoted by Perrow above seems all too true: “The NRC is as good a regulator as Congress permits it to be. Right now, Congress doesn’t want a good regulator.”

It is very interesting to read the transcript of a 2014 hearing of the Senate Committee on Environment and Public Works titled "NRC'S IMPLEMENTATION OF THE FUKUSHIMA NEAR-TERM TASK FORCE RECOMMENDATIONS AND OTHER ACTIONS TO ENHANCE AND MAINTAIN NUCLEAR SAFETY" (link). Senator Barbara Boxer, California Democrat and chair of the committee, opened the meeting with these words:
Although Chairman Macfarlane said, when she announced her resignation, she had assured that “the agency implemented lessons learned from the tragic accident at Fukushima.” She said, “the American people can be confident that such an accident will never take place here.”

I say the reality is not a single one of the 12 key safety recommendations made by the Fukushima Near-Term Task Force has been implemented. Some reactor operators are still not in compliance with the safety requirements that were in place before the Fukushima disaster. The NRC has only completed its own action [on] 4 of the 12 task force recommendations.
This is an alarming assessment, and one that is entirely in accord with the observations made by Lochbaum above.

Saturday, July 27, 2019

Soviet nuclear disasters: Kyshtym


The 1986 meltdown of reactor number 4 at the Chernobyl Nuclear Power Plant was the greatest nuclear disaster the world has yet seen. Less well known is the Kyshtym disaster of 1957: a catastrophic explosion in an underground nuclear waste storage facility at the Mayak plutonium production complex in the Eastern Ural region of the USSR, which released a massive amount of radioactive material. Information about the disaster was tightly restricted by Soviet authorities, with predictably bad consequences.

Zhores Medvedev was one of the first qualified scientists to provide information and hypotheses about the Kyshtym disaster. His book Nuclear Disaster in the Urals was written while he was in exile in Great Britain and appeared in 1980. It is fascinating to learn that his reasoning was based on his study of ecological, biological, and environmental research done by Soviet scientists between 1957 and 1980. Medvedev was able to piece together the extent of contamination and the general nature of the cause of the event from information about radioactive contamination of lakes and streams in the region that appeared incidentally in scientific reports of the period.

It is striking to find that scientists in the United States were skeptical about Medvedev's assertions. W. Stratton et al published a review analysis in Science in 1979 (link) that found Medvedev's reasoning unpersuasive.
A steam explosion of one tank is not inconceivable but is most improbable, because the heat generation rate from a given amount of fission products is known precisely and is predictable. Means to dissipate this heat would be a part of the design and could be made highly reliable. (423)
They offer an alternative hypothesis about any possible radioactive contamination in the Kyshtym region -- the handful of multimegaton nuclear weapons tests conducted by the USSR in the Novaya Zemlya area.
We suggest that the observed data can be satisfied by postulating localized fallout (perhaps with precipitation) from explosion of a large nuclear weapon, or even from more than one explosion, because we have no limits on the length of time that fallout continued. (425)
And they consider weather patterns during the relevant time period to argue that these tests could have been the source of the radiation contamination identified by Medvedev. But Novaya Zemlya is over 1,000 miles north of Kyshtym (roughly 20 degrees of latitude), so fallout from the nuclear tests is a possible alternative hypothesis, but a farfetched one. They conclude:
We can only conclude that, though a radiation release incident may well be supported by the available evidence, the magnitude of the incident may have been grossly exaggerated, the source chosen uncritically, and the dispersal mechanism ignored. Even so we find it hard to believe that an area of this magnitude could become contaminated and the event not discussed in detail or by more than one individual for more than 20 years. (425)
The heart of their skepticism rests on an indefensible assumption: that Soviet science, engineering, and management were entirely capable of designing and implementing a safe system for nuclear waste storage. They were perhaps right about the scientific and engineering capabilities of the Soviet system; but the management systems in place were woefully inadequate. Their account assumed a straightforward application of engineering knowledge to the problem, and it failed to take into account the defects of organization and oversight that were rampant within Soviet industrial systems. In the end, the core of Medvedev's claims has been validated.

Another report, compiled by Los Alamos scientists and released in 1982, concluded unambiguously that Medvedev was mistaken, and that the widespread ecological devastation in the region resulted from small and gradual processes of contamination rather than a massive explosion of waste materials (link). Here is the conclusion put forward by the study's authors:
What then did happen at Kyshtym? A disastrous nuclear accident that killed hundreds, injured thousands, and contaminated thousands of square miles of land? Or, a series of relatively minor incidents, embellished by rumor, and severely compounded by a history of sloppy practices associated with the complex? The latter seems more highly probable.
So Medvedev is dismissed.

After the collapse of the USSR, voluminous records about the Kyshtym disaster became available from secret Soviet files, and those records make it plain that the US scientists badly misjudged the nature of the event. Medvedev was much closer to the truth than were Stratton and his colleagues or the authors of the Los Alamos report.

A scientific report based on Soviet-era documents that were released after the fall of the Soviet Union appeared in the Journal of Radiological Protection in 2017 (A V Akleyev et al 2017; link). Here is their brief description of the accident:
Starting in the earliest period of Mayak PA activities, large amounts of liquid high-level radioactive waste from the radiochemical facility were placed into long-term controlled storage in metal tanks installed in concrete vaults. Each full tank contained 70–80 tons of radioactive wastes, mainly in the form of nitrate compounds. The tanks were water-cooled and equipped with temperature and liquid-level measurement devices. In September 1957, as a result of a failure of the temperature-control system of tank #14, cooling-water delivery became insufficient and radioactive decay caused an increase in temperature followed by complete evaporation of the water, and the nitrate salt deposits were heated to 330 °C–350 °C. The thermal explosion of tank #14 occurred on 29 September 1957 at 4:20 pm local time. At the time of the explosion the activity of the wastes contained in the tank was about 740 PBq [5, 6]. About 90% of the total activity settled in the immediate vicinity of the explosion site (within distances less than 5 km), primarily in the form of coarse particles. The explosion gave rise to a radioactive plume which dispersed into the atmosphere. About 2 × 106 Ci (74PBq) was dispersed by the wind (north-northeast direction with wind velocity of 5–10 m s−1) and caused the radioactive trace along the path of the plume [5]. Table 1 presents the latest estimates of radionuclide composition of the release used for reconstruction of doses in the EURT area. The mixture corresponded to uranium fission products formed in a nuclear reactor after a decay time of about 1 year, with depletion in 137Cs due to a special treatment of the radioactive waste involving the extraction of 137Cs [6]. (R20-21)
Here is the region of radiation contamination (EURT) that Akleyev et al identify:

This region encompasses some 23,000 square kilometers (8,880 square miles). Plainly Akleyev et al describe a massive disaster: a very large explosion in an underground nuclear waste storage facility, large-scale dispersal of radioactive materials, and the evacuation of population throughout an extensive region. This is very close to the description provided by Medvedev.

A somewhat surprising finding of the Akleyev study is that the exposed population did not show dramatically worse health outcomes and mortality relative to unexposed populations. For example, "Leukemia mortality rates over a 30-year period after the accident did not differ from those in the group of unexposed people" (R30). Their epidemiological study for cancers overall likewise indicates only a small effect of accidental radiation exposure on cancer incidence:
The attributable risk (AR) of solid cancer incidence in the EURTC, which gives the proportion of excess cancer cases out of the sum of excess and baseline cases, calculated according to the linear model, made up 1.9% over the whole follow-up period. Therefore, only 27 cancer cases out of 1426 could be associated with accidental radiation exposure of the EURT population. AR is highest in the highest dose groups (250–500 mGy and >500 mGy) and exceeds 17%.
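As a quick check of the arithmetic in this passage, the 1.9% attributable risk is simply the ratio of excess cases to total (excess plus baseline) cases reported by Akleyev et al:

```python
# Check of the attributable-risk figures quoted from Akleyev et al above.
excess_cases = 27       # cancer cases associated with accidental radiation exposure
total_cases = 1426      # excess plus baseline solid cancer cases in the EURT cohort

attributable_risk = excess_cases / total_cases
print(f"Attributable risk: {attributable_risk:.1%}")   # ~1.9%, matching the quoted figure
```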
So why did the explosion occur? James Mahaffey examines the case in detail in Atomic Accidents: A History of Nuclear Meltdowns and Disasters: From the Ozark Mountains to Fukushima. Here is his account:
In the crash program to produce fissile bomb material, a great deal of plutonium was wasted in the crude separation process. Production officials decided that instead of being dumped irretrievably into the river, the plutonium that had failed to precipitate out, remaining in the extraction solution, should be saved for future processing. A big underground tank farm was built in 1953 to hold processed fission waste. Round steel tanks were installed in banks of 20, sitting on one large concrete slab poured at the bottom of an excavation, 27 feet deep. Each bank was equipped with a heat exchanger, removing the heat buildup from fission-product decay using water pipes wrapped around the tanks. The tanks were then buried under a backfill of dirt. The tanks began immediately to fill with various waste solutions from the extraction plant, with no particular distinction among the vessels. The tanks contained all the undesirable fission products, including cobalt-60, strontium-90, and cesium-137, along with unseparated plutonium and uranium, with both acetate and nitrate solutions pumped into the same volume. One tank could hold probably 100 tons of waste product. 
In 1956, a cooling-water pipe broke leading to one of the tanks. It would be a lot of work to dig up the tank, find the leak, and replace the pipe, so instead of going to all that trouble, the engineers in charge just turned off the water and forgot about it. 
A year passed. Not having any coolant flow and being insulated from the harsh Siberian winter by the fill dirt, the tank retained heat from the fission-product decay. Temperature inside reached 660° Fahrenheit, hot enough to melt lead and cast bullets. Under this condition, the nitrate solutions degraded into ammonium nitrate, or fertilizer, mixed with acetates. The water all boiled away, and what was left was enough solidified ANFO explosive to blow up Sterling Hall several times, being heated to the detonation point and laced with dangerous nuclides. [189]
Sometime before 11:00 P.M. on Sunday, September 29, 1957, the bomb went off, throwing a column of black smoke and debris reaching a kilometer into the sky, accented with larger fragments burning orange-red. The 160-ton concrete lid on the tank tumbled upward into the night like a badly thrown discus, and the ground thump was felt many miles away. Residents of Chelyabinsk rushed outside and looked at the lighted display to the northwest, as 20 million curies of radioactive dust spread out over everything sticking above ground. The high-level wind that night was blowing northeast, and a radioactive plume dusted the Earth in a tight line, about 300 kilometers long. This accident had not been a runaway explosion in an overworked Soviet production reactor. It was the world’s first “dirty bomb,” a powerful chemical explosive spreading radioactive nuclides having unusually high body burdens and guaranteed to cause havoc in the biosphere. The accidentally derived explosive in the tank was the equivalent of up to 100 tons of TNT, and there were probably 70 to 80 tons of radioactive waste thrown skyward. (kl 5295)
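It is worth noting that Mahaffey's activity figures line up with those in the Akleyev et al report quoted earlier, once units are converted (1 curie = 3.7 × 10^10 becquerel). The cross-check below is my own reading of the two passages, not a claim made by either source.

```python
# Cross-check of the activity figures in the two accounts quoted above.
# The conversion 1 curie = 3.7e10 becquerel is standard; the comparison is my
# own reading of the two passages, not a claim made by either author.

CI_TO_BQ = 3.7e10

mahaffey_total_ci = 20e6       # "20 million curies of radioactive dust" (Mahaffey)
akleyev_dispersed_ci = 2e6     # "about 2 x 10^6 Ci (74 PBq) was dispersed by the wind" (Akleyev et al)

print(f"20 million Ci = {mahaffey_total_ci * CI_TO_BQ / 1e15:.0f} PBq")    # 740 PBq -- Akleyev's total tank activity
print(f" 2 million Ci = {akleyev_dispersed_ci * CI_TO_BQ / 1e15:.0f} PBq") # 74 PBq -- the wind-dispersed fraction
```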
So what were the primary organizational and social causes of this disaster? One is the haste in nuclear design and construction that followed from Stalin's insistence on moving the Soviet nuclear weapons program forward as rapidly as possible. As is evident in the Chernobyl case as well, the political pressures placed on engineers and managers by these priorities often led to disastrous decisions and actions. A second is the institutionalized system of secrecy that surrounded industry generally, the military specifically, and the nuclear industry most especially. A third is the casual attitude taken by Soviet officials toward the health and wellbeing of the population. And a final cause highlighted by Mahaffey's account is the low level of attention given at the plant level to safety and maintenance of highly risky facilities. Stratton et al based their analysis on the fact that the heat-generating characteristics of nuclear waste were well understood and that effective means existed for controlling those risks. That may be true, but what they failed to anticipate was that these risks would be fundamentally disregarded both on the ground and in the supervisory system above the Mayak complex.

(It is interesting to note that Mahaffey himself underestimates the amount of information that is now available about the effects of the disaster. He writes that "studies of the effects of this disaster are extremely difficult, as records do not exist, and previous residents are hard to track down" (kl 5330). But the Akleyev study mentioned above provides extensive health data on the affected population, made possible by records that were collected during the Soviet era and kept secret.)