Improvement of the Quality of Risk Information for Regulatory Decisionmaking

May 16, 2003

The Honorable Nils J. Diaz
Chairman
U.S. Nuclear Regulatory Commission
Washington, DC 20555-0001

 
SUBJECT: IMPROVEMENT OF THE QUALITY OF RISK INFORMATION FOR REGULATORY DECISIONMAKING

Dear Chairman Diaz:

In a March 31, 2003, Staff Requirements Memorandum (SRM) on risk-informed changes to 10 CFR 50.46, the Commission stated that "the PRA should be a level 2 internal- and external-initiating event all mode PRA, which has been subjected to a peer review process and submitted to and endorsed by the NRC." Similarly, in an SRM dated March 28, 2003, the Commission directed the staff to "ask for specific comment in the Statements of Consideration on whether NRC should amend 50.69(c)(1)(i) to require a comprehensive high quality PRA. For example, this PRA should be a level 2 internal- and external-initiating event all mode PRA, which has been subjected to a peer review process and submitted to and endorsed by the NRC."

In this report, we focus on several aspects of Probabilistic Risk Assessment (PRA) methodology and practice that need to be addressed to achieve such comprehensive high-quality PRAs. We limit our discussion to the PRA methodology needed for the calculation of core damage frequency (CDF) and the estimation of large early release frequency (LERF) consistent with Regulatory Guide (RG) 1.174 and do not address issues unique to Level 2 PRA. We have had the benefit of the results of a study performed for us by K.N. Fleming of Technology Insights (Reference 1), as well as of the documents referenced.

CONCLUSIONS AND RECOMMENDATIONS

  1. Completeness of risk information requires that PRAs address low-power and shutdown (LPSD) modes and "external" events, such as fires and earthquakes, in addition to power operations.
  2. Guidance should be developed on how licensees and peer-review teams should consider operating experience in order to improve PRA completeness.
  3. The assessment of uncertainties should address model uncertainties. Guidance for the quantitative evaluation of model uncertainties should be developed.

DISCUSSION

Reference 1 presents the results of about 20 interviews with members of the NRC staff and selected representatives of the nuclear industry. The NRC staff members included senior management and staff from the Office of Nuclear Regulatory Research (RES) and the Office of Nuclear Reactor Regulation (NRR). The subject of the interviews was risk-informed decisionmaking.

The study found that most staff interviewees believe that the reluctance of the industry to improve the scope and quality of the PRAs is a major impediment to the advancement of risk-informed regulation. The areas of difficulty include both the use of limited-scope PRAs and the lack of completeness within a specified scope. Even for risk contributors that were treated, incompleteness of treatment was cited as an issue.

A further observation of Reference 1 is that, while valid technical arguments can be made to justify a limited-scope PRA model for some applications, resources must be expended by both the licensee and the NRC to determine the validity of decisions that are based on an incomplete model. It is reasonable to ask whether these burdens are comparable to the effort needed to develop a full-scope PRA.

Our review of safety evaluations of licensee risk-informed submittals has revealed that the staff does include consideration of all modes of operation as well as "external" events. When the licensees submit incomplete PRAs (e.g., missing the LPSD part) or use bounding analyses, typically for some external events, the staff has to account for the missing PRA elements subjectively, as allowed by the "integrated decisionmaking process" of RG 1.174 (Reference 2).

These subjective evaluations do not necessarily lead to conservative decisions. Reference 1 points out that, when bounding analyses are used for external events, some risk contributors may not be identified. For example, there are some risk-significant sequences that involve combinations of failures from fires and other events independent of the fire, i.e., a fire may disable one train of a safety system and another train may be unavailable due to other causes. It is unlikely that a bounding analysis for fires would identify such sequences.

Certain risk-informed applications, e.g., risk-informing the special treatment requirements, require the use of importance measures (e.g., Fussell-Vesely and Risk Achievement Worth). These are global measures of risk that are strongly affected by the scope and quality of the PRA. As stated in our report dated February 11, 2000 (Reference 3), incomplete assessments of risk contributions from LPSD operations, fires, and human performance distort the importance measures, undermining confidence in the risk categorization of structures, systems, and components (SSCs).
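To illustrate why these measures are global, the following sketch computes Fussell-Vesely and Risk Achievement Worth for one component of a toy three-cut-set model. All component names and probabilities are invented for illustration and are not drawn from any plant PRA.

```python
# Hypothetical CDF model built from three minimal cut sets.
# All numbers are illustrative only.

def cdf(p):
    """CDF (per year) for a toy model with three minimal cut sets."""
    return (p["dg_a"] * p["dg_b"]    # both diesel generators fail
            + p["dg_a"] * p["pump"]  # DG train A fails along with a pump
            + p["other"])            # contributor not involving the DGs

base = {"dg_a": 1e-2, "dg_b": 1e-2, "pump": 5e-3, "other": 1e-6}
cdf0 = cdf(base)

# Fussell-Vesely: fraction of baseline CDF removed when the component
# is assumed perfect (failure probability set to 0).
fv_dg_a = (cdf0 - cdf(dict(base, dg_a=0.0))) / cdf0

# Risk Achievement Worth: ratio of CDF with the component assumed
# failed (failure probability set to 1) to the baseline CDF.
raw_dg_a = cdf(dict(base, dg_a=1.0)) / cdf0

print(f"baseline CDF = {cdf0:.2e}/yr, FV = {fv_dg_a:.2f}, RAW = {raw_dg_a:.0f}")
```

Because both measures are normalized by the total CDF, adding a previously missing contributor (for example, a fire-induced cut set) would change `cdf0` and hence shift every component's FV and RAW, which is why incomplete scope distorts the categorization.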

All-mode PRAs permit the risk characterization of SSCs that are used only in shutdown or low-power modes, such as components of residual heat removal systems. In addition, all-mode PRAs facilitate cycle risk optimization. For example, by comparing the risk contributions of diesel generator maintenance during shutdown and during operation, plants with internal events PRAs and LPSD PRAs have shown that on-line diesel generator maintenance reduces overall cycle risk, even though it may slightly increase risk during power operation.
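The cycle-risk comparison described above can be sketched with hypothetical numbers. The conditional CDF increases and the maintenance duration below are assumptions chosen only to show the form of the calculation, not results from any plant study.

```python
# Hypothetical cycle-risk comparison for diesel generator maintenance.
# All frequencies and durations are illustrative only.

HOURS_PER_YEAR = 8760.0

# Assumed conditional CDF increase (per year) while one DG is out of
# service; reliance on the DG is taken to be higher during shutdown.
delta_cdf = {"power": 2e-5, "shutdown": 3e-4}

maintenance_hours = 72.0  # assumed length of one maintenance window

def incremental_risk(mode):
    """Incremental core damage probability for one maintenance window."""
    return delta_cdf[mode] * maintenance_hours / HOURS_PER_YEAR

online = incremental_risk("power")
outage = incremental_risk("shutdown")

# With these assumed numbers, performing the maintenance on line
# contributes less to total cycle risk than deferring it to shutdown.
print(f"on-line: {online:.2e}, during shutdown: {outage:.2e}")
```

An internal-events PRA alone cannot support this comparison; only when an LPSD model supplies the shutdown term can the two maintenance strategies be weighed against each other.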

In addition to the PRA scope, completeness also refers to the set of accident sequences within scope. Reference 1 notes that, in general, PRAs do not make use of the experience gained over the years in identifying sequences that should be analyzed. Operating experience, in particular, should be reviewed for initiating events and sequences that existing PRAs do not model.

As noted in our report dated October 11, 2000 (Reference 4), RES has been issuing reports that contain evaluations of actual plant performance in terms of initiating-event frequencies and reliabilities of critical plant systems, as well as comparisons with corresponding data used in PRAs. Augmented Inspection Team reports provide detailed evaluations of major incidents. The Accident Sequence Precursor (ASP) program identifies significant accident sequences that actually have occurred and draws relevant conclusions. Generic Safety Issues (GSIs) are an additional source of information that should be considered in upgrading PRAs.

Unfortunately, this wealth of useful information does not appear to be widely used by PRA practitioners. Reference 1 suggests that as many as 20% of events evaluated by the ASP program involve initiating events and accident sequences not modeled in existing PRAs. Although PRAs use the statistical information from past experience in the estimation of failure rates, the sequences of events that actually have occurred are not generally utilized. The reasonableness of PRA results is often judged by comparing them with the results of other PRAs for similar plants. Although such comparisons are useful, we believe that analyses of operating experience such as the RES reports should be utilized to a greater extent. The staff should prepare guidance to the licensees and peer-review teams to make sure that PRAs benefit from this experience.

The Reactor Safety Study (Reference 5) developed probability distributions for parameters such as failure rates and initiating-event frequencies. This precedent, combined with the fact that parameter uncertainties are easier to deal with than model uncertainties, has led to the unfortunate, yet widely held, belief that uncertainty analysis is synonymous with parameter uncertainty evaluation. In addition, it has been found that the principal PRA results are fairly insensitive to parameter uncertainties, thus leading to the belief that quantifying such uncertainties is an unnecessary burden.
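The parameter-uncertainty evaluation referred to above is, by itself, mechanically straightforward. The following is a minimal Monte Carlo sketch for a single accident sequence, assuming hypothetical lognormal distributions for an initiating-event frequency and a system failure probability; the medians and error factors are invented for illustration.

```python
import math
import random

random.seed(1)  # fixed seed so the sketch is reproducible

def lognormal(median, error_factor):
    """Sample a lognormal given its median and 95th/50th error factor."""
    sigma = math.log(error_factor) / 1.645
    return median * math.exp(random.gauss(0.0, sigma))

samples = []
for _ in range(10_000):
    ie_freq = lognormal(1e-2, 3.0)   # initiating-event frequency, per year
    fail_pr = lognormal(1e-3, 10.0)  # system failure probability on demand
    samples.append(ie_freq * fail_pr)  # sequence CDF for this trial

samples.sort()
mean = sum(samples) / len(samples)
p5 = samples[len(samples) // 20]    # ~5th percentile
p95 = samples[-len(samples) // 20]  # ~95th percentile
print(f"mean CDF {mean:.1e}/yr, 90% interval [{p5:.1e}, {p95:.1e}]")
```

Note that this propagation captures only the stated parameter distributions; it says nothing about whether the sequence model itself is right, which is precisely the model uncertainty discussed next.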

However, models that are included in the PRAs can be important sources of uncertainty. For example, there are several models for human performance during accidents that are based on different assumptions and analytical approaches. Human reliability experts have not yet reached consensus on what assumptions are appropriate. Using only one of these models yields results whose uncertainties are unknown, since the use of another model could yield different results. Yet this model uncertainty is rarely considered.

The Ispra Research Center of the European Union organized a benchmark exercise in which 15 teams from 11 countries used a number of human reliability analysis (HRA) models available at the time to estimate the probability of the crew not responding correctly to a transient (Reference 8). The results produced by the teams using the same HRA model differed by orders of magnitude. The results produced by a single team using a number of HRA models also differed by orders of magnitude. Although these results are fairly old now, we believe that they are still representative of the model uncertainties present in HRA.

Several other examples of the impact of model uncertainties are presented in Reference 9. In one PRA, the dominant model uncertainties resulted from the reactor coolant pump (RCP) seal loss-of-coolant accident (LOCA) timing and operator recovery possibilities. In another, they were due to the RCP seal LOCA timing again and the heating, ventilation, and air conditioning (HVAC) success criteria. The authors stated that, in all cases, the CDF was affected significantly by these uncertainties.

The staff has recognized that model uncertainty must be addressed by decisionmakers. Draft Regulatory Guide DG-1122 (Reference 10) includes the following statement in its description of the technical elements of a PRA: "The sensitivity of the model results to model boundary conditions and other key assumptions is evaluated using sensitivity analyses to look at key assumptions both individually and in logical combinations." RG 1.174 states that uncertainties due to incompleteness and model assumptions should be evaluated.

Most licensees have not included a systematic treatment of uncertainties in their PRAs. A systematic treatment would include analyses of parametric uncertainties, sensitivity studies to identify the important model uncertainties, and quantification of the latter.

Tools for performing analyses of parametric uncertainties are readily available and are included in most of the widely used PRA software. The disciplined use of sensitivity studies to address model uncertainties is not as well understood. Developing guidance for quantifying model uncertainty is not infeasible. Such an effort would build on past practice and the literature. For example, NUREG-1150 (Reference 11) quantified the probabilities of alternative assumptions in severe accident assessments by eliciting expert opinions. Since NUREG-1150, other methods have been developed that are not as resource intensive (References 9 and 12). Furthermore, RES has sponsored a workshop in which a number of ideas and methods for handling model uncertainties have been proposed and debated (Reference 13).
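One way such quantification can proceed, in the spirit of the NUREG-1150 elicitation, is to treat the alternative models as hypotheses, assign them subjective probabilities, and report both the weighted result and the model-to-model spread. The three HRA model outputs and the weights below are purely hypothetical.

```python
# Sketch of quantifying model uncertainty by probability-weighting
# alternative models. All values are hypothetical illustrations.

# Human error probability predicted by each alternative model
hep = {"model_A": 1e-3, "model_B": 5e-3, "model_C": 3e-2}

# Elicited subjective probability that each model best describes
# crew performance (e.g., from an expert elicitation)
weight = {"model_A": 0.5, "model_B": 0.3, "model_C": 0.2}
assert abs(sum(weight.values()) - 1.0) < 1e-9

# The weighted mean and the model-to-model range together convey the
# model uncertainty, instead of silently adopting a single model.
mean_hep = sum(weight[m] * hep[m] for m in hep)
spread = (min(hep.values()), max(hep.values()))
print(f"weighted HEP = {mean_hep:.1e}, model-to-model range = {spread}")
```

The orders-of-magnitude spread in this toy case mirrors the dispersion seen in the Ispra benchmark: reporting only one model's estimate would hide it from the decisionmaker.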

More guidance regarding sensitivity and uncertainty analyses would contribute greatly to confidence in risk-informed regulatory decisionmaking. Such guidance should include a clear discussion of the respective roles of sensitivity and uncertainty analyses, as well as practical procedures for performing them. It should address not only how uncertainties should be treated in the PRA, but also how they impact decisionmaking, with examples showing the pitfalls of inadequately addressed uncertainties.

Sincerely,

/RA/

Mario V. Bonaca
Chairman

References:

  1. U.S. Nuclear Regulatory Commission, NUREG/CR-6813, "Issues and Recommendations for Advancement of PRA Technology in Risk Informed Decision Making," Technology Insights, April 2003.
  2. U.S. Nuclear Regulatory Commission, Regulatory Guide 1.174, "An Approach for Using Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes to the Current Licensing Basis," June 1998.
  3. Report dated February 11, 2000, from Dana A. Powers, Chairman, ACRS, to Richard A. Meserve, Chairman, U.S. Nuclear Regulatory Commission, Subject: Importance Measures Derived from Probabilistic Risk Assessments.
  4. Report dated October 11, 2000, from Dana A. Powers, Chairman, ACRS, to Richard A. Meserve, Chairman, U.S. Nuclear Regulatory Commission, Subject: Union of Concerned Scientists Report, "Nuclear Plant Risk Studies: Failing the Grade."
  5. U.S. Nuclear Regulatory Commission, WASH-1400, "Reactor Safety Study, An Assessment of Accident Risks in U.S. Nuclear Power Plants," (NUREG-75/014), October 1975.
  6. G. Apostolakis and S. Kaplan, "Pitfalls in Risk Calculations," Reliability Engineering, Vol. 2, pp. 135-145, 1981.
  7. Pickard, Lowe, and Garrick, Inc., Westinghouse Electric Corporation, Fauske & Associates, Inc., "Zion Probabilistic Safety Study," prepared for Commonwealth Edison Company, Chicago, 1981.
  8. A. Poucet, "The European Benchmark Exercise on Human Reliability Analysis," Presented at the American Nuclear Society International Topical Meeting on Probability, Reliability, and Safety Assessment, PSA '89, Pittsburgh, PA, April 2-7, 1989.
  9. D. Bley, S. Kaplan, and D. Johnson, "The Strengths and Limitations of PSA: Where We Stand," Reliability Engineering and System Safety, Vol. 38, pp. 3-26, 1992.
  10. U.S. Nuclear Regulatory Commission, Draft Regulatory Guide DG-1122, "An Approach for Determining the Technical Adequacy of Probabilistic Risk Assessment Results for Risk-Informed Activities," November 2002.
  11. U.S. Nuclear Regulatory Commission, NUREG-1150, "Severe Accident Risks: An Assessment for Five US Nuclear Power Plants," December 1990.
  12. U.S. Nuclear Regulatory Commission, NUREG/CR-6372, Volumes 1 and 2, "Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts," April 1997.
  13. U.S. Nuclear Regulatory Commission, NUREG/CP-0138, "Proceedings of Workshop on Model Uncertainty: Its Characterization and Quantification," Annapolis, MD, October 20-22, 1993, Editors: A. Mosleh, N. Siu, C. Smidts, and C. Lui, 1994.
