Proposed ASME Standard for Probabilistic Risk Assessment for Nuclear Power Plant Applications (Phase 1)
March 25, 1999
Dr. William D. Travers
Executive Director for Operations
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555-0001
Dear Dr. Travers:
SUBJECT: PROPOSED ASME STANDARD FOR PROBABILISTIC RISK ASSESSMENT FOR NUCLEAR POWER PLANT APPLICATIONS (PHASE 1)
During the 460th meeting of the Advisory Committee on Reactor Safeguards, March 10-13, 1999, we met with representatives of the American Society of Mechanical Engineers (ASME) Committee on Nuclear Risk Management (CNRM) to discuss the proposed Standard for Probabilistic Risk Assessment (PRA) for Nuclear Power Plant Applications (Phase 1). The purpose of this Standard is to provide a means to ensure that the technical quality of PRAs is sufficient to support the regulatory review and approval of licensee risk-informed applications. We also had the benefit of the documents referenced.
Conclusions and Recommendations
- The proposed Standard has the potential of being very useful to both the industry and the NRC. Although additional work remains, the overall approach to defining necessary PRA requirements is good.
- Subsection 3.5 on the use of expert judgment and the associated nonmandatory guidance in Appendix A are inconsistent with other parts of the Standard and should be revised. Subsection 3.5 should identify the major issues involving the use of expert opinion in a PRA and not focus on a particular approach.
- We agree with the CNRM decision to move Section 7 to the beginning of the Standard to present the risk assessment application process early in the document.
- Consideration should be given in the Standard to recommending participatory peer review throughout the development or application of the PRA in preference to a posteriori review.
The move toward a risk-informed regulatory system has increased awareness of the need to examine the quality of PRA methodologies. Risk information used for regulatory decisions must be based on credible models and methods.
The lack of confidence in the quality of PRAs will impede their use in the regulatory process. For example, the Individual Plant Examination (IPE) Insights Report (NUREG-1560) showed that there is variability in PRA results that can be attributed to different analytical tools used by licensees. On the basis of its review of licensee IPEs, the staff determined that assumptions used by some licensees were unacceptable and requested those licensees to improve their analyses. The development of a Standard that defines the necessary and minimum requirements for acceptable PRA quality is, therefore, essential.
Developing this Standard is not a straightforward process. If the Standard is too prescriptive, it could impede the further development and refinement of PRA models. On the other hand, simply listing all the methods and models that analysts have used or proposed in the past is not helpful because it presents all such tools as being equally credible or useful when, in fact, experience has shown that they are not.
We believe that the CNRM, which developed the proposed Standard, has established an appropriate balance between specificity and flexibility. The proposed Standard provides requirements that the CNRM believes are necessary for a quality PRA. Although there are references to methods for which there is broad consensus on their appropriateness, the CNRM has wisely refrained from being overly prescriptive in areas where the choice of methods is less clear. Because the actual methods for satisfying the requirements are not prescribed, merely meeting the requirements does not guarantee that a PRA will be of acceptable quality. Thus, the Standard also requires a peer review process to ensure acceptable quality. We agree with the CNRM that a robust peer review process is at present the best way to assess quality. Consideration should be given in the Standard to recommending participatory peer review throughout the development or application of the PRA in preference to a review conducted only after completion of the work.
An exception to the CNRM decision not to specify methods is the treatment of expert judgment. Expert judgment has proven to be a ubiquitous element of modern PRAs for nuclear power plants. Overall, the proposed treatment of expert judgment in the Standard and in the nonmandatory Appendix A touches on nearly all the points that are needed. However, it puts an unwarranted emphasis on a particular approach to expert judgment. Subsection 3.5 should be revised to be consistent with the remainder of the Standard. Also, since it is not common practice to employ formal expert judgment methods in Level 1 PRAs, a discussion of the conditions requiring such treatment, with examples, would be very useful.
Subsection 7.5 requires that the users determine whether the scope and level of detail of the Standard are sufficient for an application and provide a technical basis for this determination. Additional guidance should be provided in the Standard to clarify what is expected of the users.
To date, the work done to develop the proposed Standard and associated guidance is commendable. The Standard, when integrated with other industry and NRC initiatives, should greatly enhance progress toward risk-informed nuclear operations and regulatory decisionmaking. We applaud the staff for initiating this effort and for actively participating in the working committees.
We offer detailed comments in the attachment to this letter for the benefit of the CNRM in developing the proposed final version of the Standard and the NRC staff in considering possible endorsement. We look forward to reviewing the proposed final Standard following the reconciliation of public comments.
Dana A. Powers
References:

- American Society of Mechanical Engineers, ASME RA-S-1999 Edition Draft #10, "Standard for Probabilistic Risk Assessment for Nuclear Power Plant Applications," draft released for public comment, dated February 1, 1999.
- American Society of Mechanical Engineers, "White Paper and Guidance for Reviewers of the Draft ASME Standard for Probabilistic Risk Assessment for Nuclear Power Plant Applications," received February 8, 1999.
- U.S. Nuclear Regulatory Commission, NUREG-1560, "Individual Plant Examination Program: Perspectives on Reactor Safety and Plant Performance," Vols. 1-3, December 1997.
- U.S. Nuclear Regulatory Commission, NUREG-1150, "Severe Accident Risks - An Assessment for Five U.S. Nuclear Power Plants," December 1990.
- U.S. Nuclear Regulatory Commission, Regulatory Guide 1.174, "An Approach for Using Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes to the Licensing Basis," July 1998.
Attachment: As Stated
ATTACHMENT

Detailed Comments on Proposed ASME Standard for PRA for Nuclear Power Plant Applications (Phase 1)
Subsection 1.1 states that the Standard sets forth criteria and methods for developing and applying PRA. It should be made clear that the emphasis is on criteria and that particular methods are not prescribed.
Section 2 requires a thorough review. Considering the broad range of potential applications for this Standard, close scrutiny should be given to ensuring that the definitions are consistent with generally accepted reactor and risk terminology and that terminology used in each section of the Standard is appropriately addressed.
Many of the listed definitions are not needed. For example, there is no need to describe a mathematical method such as Monte Carlo simulation. Similarly, there is no need to define a "severe accident." The inclusion of the words "beyond design basis" in the definition is not appropriate.
Some of the listed definitions are not useful. For example, an "importance measure" is defined merely as a mathematical expression that defines a quantity of interest.
Several of the listed definitions are inaccurate or incorrect. Examples of the former are the definitions of "station blackout," "core damage frequency," "unavailability," and "cut sets." An example of the latter is the definition of the "failure rate."
Many terms used in the text should be included in the definitions but are not defined in Section 2. Examples are: EOPs, I&C, ECCS, safety-related SSCs, aleatory and epistemic uncertainties, and single-failure criterion.
"Internal Flooding Analysis" is located in the wrong place in Fig. 3.1-1, "Technical Elements of a PRA Model."
3.2 Plant Familiarization
Page 18: An important example of the plant familiarization that should be made explicit is crew performance on simulators during known, generic, time-critical sequences. This provides an appropriate understanding of man-machine interaction.
3.3.1 Initiating Event Analysis
A list of the initiating events that have been used in PRAs should be included with appropriate guidance.
3.3.2 Sequence Development
The explicit description of conditional split fractions and of fault tree linking is appropriate because they are established and accepted approaches. Similarly, a portion of the discussion on event sequence diagrams and system dependency matrices should be removed from the nonmandatory Appendix A and relocated into the main body of the Standard.
3.3.3 Success Criteria
Page 23: The list of high-level functions should also include neutronic shutdown.
Page 23: Criteria resulting from neutronic analyses should be added to the list of requirements.
Page 23: The statement that bounding analyses can be used conflicts with Subparagraph 220.127.116.11, "Use of Realistic Success Criteria."
Page 23, second column: The Standard specifies that "Bounding thermal-hydraulic analyses from the plant's SAR ... may be used when detailed analyses are not practical." This statement conflicts with the word "shall" used in Subparagraph 18.104.22.168 to ensure that realistic criteria are used.
3.3.4 Systems Analysis
The Standard should caution users that the calculation of the average unavailability of systems with redundant trains is not the product of the average unavailabilities of the individual trains. The time-averaging process introduces dependencies among train unavailabilities.
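This pitfall can be shown with a small numerical sketch. The failure rate and test interval below are illustrative assumptions, not values from the Standard: for two identical standby trains whose instantaneous unavailability grows linearly between synchronized periodic tests, the time average of the product (both trains unavailable) exceeds the product of the individual time averages by a factor of about 4/3.

```python
import numpy as np

# Illustrative sketch (all values assumed, not from the Standard): two
# identical standby trains with synchronized periodic tests. The per-train
# instantaneous unavailability between tests is q(t) = lam * t.
lam = 1.0e-4      # assumed standby failure rate per hour
T = 720.0         # assumed test interval, hours

t = np.linspace(0.0, T, 100_001)
q = lam * t                      # per-train instantaneous unavailability

q_train_avg = q.mean()           # time-average train unavailability ~ lam*T/2
q_sys_avg = (q * q).mean()       # time average of the product (both trains down)
naive = q_train_avg ** 2         # product of the time averages

# The shared time dependence couples the trains:
print(q_sys_avg / naive)         # ~ 4/3
```

The 4/3 factor is exact for this linear, synchronized case; staggered testing or different test intervals change the factor but not the underlying point.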
Page 32: The definition of the term "common-cause equipment failure" is not consistent with the definition provided in Section 2.
3.3.5 Data Analysis
Page 35: Although it is stated that the subjectivist approach to probability ought to be adopted, the Standard proceeds to discuss frequentist methods (Subparagraphs 22.214.171.124.4 and 126.96.36.199.5) that are inconsistent with this recommendation.
Page 35: The Standard should be clarified to state when frequentist methods can be used and for what purpose. It should note that no PRA that includes an uncertainty analysis has found these methods useful.
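The distinction between the two schools can be made concrete with a minimal sketch. The prior and the evidence below are assumed for illustration only; the Standard prescribes neither. A conjugate gamma-Poisson update of a failure rate is contrasted with the frequentist point estimate:

```python
# Illustrative conjugate (gamma-Poisson) update of a failure rate lambda,
# contrasted with the frequentist point estimate. All numbers are assumed
# for illustration; they do not come from the Standard.
a, b = 0.5, 1000.0       # assumed prior: Gamma(shape=a, rate=b) on lambda
x, T = 2, 50_000.0       # assumed evidence: x failures in T component-hours

posterior_mean = (a + x) / (b + T)   # subjectivist (Bayesian) estimate
mle = x / T                          # frequentist maximum-likelihood estimate

print(posterior_mean, mle)
```

The subjectivist estimate carries the full posterior distribution (here a Gamma), which is what an uncertainty analysis propagates; the frequentist point estimate carries no such distribution.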
Page 40: The Standard should be clarified to state that the analysis of common-cause failures will require the use of generic data that are applicable to the specific plant under analysis.
3.3.6 Human Reliability Analysis
Page 45: The statement in Subparagraph 188.8.131.52.1 that recovery actions shall be limited to those actions for which some procedural guidance is provided or for which operators receive frequent training is inconsistent with the statement in 184.108.40.206 that extraordinary recovery actions that are not proceduralized shall be justified in the analysis.
3.3.8 Level 1 Quantification and Review of Results
Page 51: It is not clear what the CNRM means in Paragraph 220.127.116.11.2 by the exception stating, "If only point estimate quantification is completed, that point estimate shall be the mean." Does this mean that the "mean value" should be calculated using rigorous methods? What does the CNRM mean by "point estimates"?
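The distinction matters numerically. As an illustration under assumed values (a lognormal with a median of 1e-3 and an error factor of 10, chosen only for this sketch), the mean lies well above the median that is often quoted as a "best estimate":

```python
import math

# Illustrative only: lognormal with an assumed median of 1e-3 and an
# assumed error factor of 10 (95th percentile divided by the median).
median = 1.0e-3
error_factor = 10.0
sigma = math.log(error_factor) / 1.645   # 1.645 ~ standard normal 95th pct
mean = median * math.exp(sigma ** 2 / 2.0)

print(mean / median)   # the mean is ~2.7x the median for these values
```

So requiring the point estimate to be the mean is not a formality; a rigorous propagation of the distributions, not a quantification of medians or modes, is needed to obtain it.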
Page 51: The requirement in Subparagraph 18.104.22.168.3 that model uncertainty be evaluated needs additional discussion. This evaluation can range from a quick estimate of uncertainty to the use of formal methods for expert opinion elicitation, as was done in NUREG-1150. Furthermore, additional guidance should be provided to clarify how the sensitivity studies should be done and how the results may be used.
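One simple form such a sensitivity study could take is sketched below; the model, the parameter, and every number in it are assumed for illustration and are not drawn from the Standard or NUREG-1150. The point-estimate core damage frequency is recomputed under alternative values of an uncertain model parameter:

```python
# Toy point-estimate CDF model; all frequencies and probabilities are
# assumed for illustration only.
def cdf(p_seal_loca):
    """CDF per reactor-year from a hypothetical two-branch model."""
    f_loop = 1.0e-1          # assumed loss-of-offsite-power frequency
    p_no_cooling = 1.0e-3    # assumed cooling failure, no seal LOCA branch
    p_no_makeup = 1.0e-2     # assumed makeup failure, seal LOCA branch
    return f_loop * (p_seal_loca * p_no_makeup
                     + (1.0 - p_seal_loca) * p_no_cooling)

# Sensitivity of the result to the uncertain seal-LOCA model assumption:
for p in (0.0, 0.2, 0.8):
    print(p, cdf(p))
```

Reporting the spread of such results against the alternative modeling assumptions, rather than a single number, is the kind of clarified expectation the Standard should state.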
3.3.9 Level 1 and Level 2 Interface
The determination of uncertainty should be given more discussion and a more prominent position in the Standard.
Page 55: The second example of accident sequence characteristics that should be considered refers to the "RCS pressure at core damage." This should be replaced with the "RCS pressure at the time of vessel penetration."
There should be a brief discussion on how to extract the Regulatory Guide 1.174 equivalent [large, early release frequency (LERF)] from the results of the detailed Level 2 PRA analysis.
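One common rollup is sketched below; the release categories, their frequencies, and their classification as large or early are assumed for illustration and are not prescribed by the Standard. LERF is obtained by summing the frequencies of the Level 2 release categories that are both large and early:

```python
# Illustrative Level 2 results: (name, frequency per reactor-year,
# large release?, early release?). All entries are assumed.
release_categories = [
    ("early containment failure", 2.0e-7, True,  True),
    ("containment bypass",        1.0e-7, True,  True),
    ("late containment failure",  5.0e-6, True,  False),
    ("intact containment",        1.0e-5, False, False),
]

# RG 1.174 figure of merit: sum over categories that are large AND early.
lerf = sum(f for _, f, large, early in release_categories if large and early)
print(lerf)
```

The substantive guidance the Standard needs is the classification step, i.e., which magnitudes and timings qualify a category as "large" and "early," since the summation itself is trivial.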
3.4.2 Mapping of Level 1 Sequences
These risk assessments depend on the adequacy of the user's modeling of the physical response of the entire system to accident conditions. For example, whether a fan cooler fails due to waterhammer, either internally or in a piece of pipe to which it is connected, depends on many details: the piping geometry and its rises and falls, water-storage tanks, the starting transients of pumps when connected to the entire system of pipes, valves, tees, and components, the rate of rise of containment temperature and humidity, and so on. A technical analysis, including an evaluation of modeling uncertainties, plays a larger role in assessing the failure probability than do the characteristics of the device itself. A PRA is fragile if it is not based on a comprehensive analysis of system response. The Standard should reflect this dependence.
3.4.4 Radionuclide Release
Page 62: The last bullet calls for "the size distribution of radioactive material released in the form of an aerosol." Isn't this a time-dependent parameter? Is it to be specified as a function of time or an average?
Table 3.4.4-1 may be overkill with respect to the needs for determining LERF. Not all of the fission products are significant for LERF although they can be for a full Level 2 PRA analysis.
Page 64: The Standard calls for including the release energy in the radionuclide source term. Is this the temperature, the enthalpy, or the internal energy? Does it include the energy of radioactive decay?
Table 3.4.4-2 does not contain all of the key uncertainties. It should be expanded.
Page 65: Under the first example, the comment is made that "higher retention efficiencies were attributed to sequences involving low coolant system pressure than those involving high pressure." Is this correct? Was it not the inverse?
There is a need to discuss the release and effects of non-radioactive aerosols from the core.
3.5 Expert Judgment
What are the criteria for deciding when expert judgment must not be used in order to have a PRA of acceptable quality?
When are higher-level treatments of expert judgment necessary to ensure that a PRA of acceptable quality is produced? If there are no definable occasions when higher-order treatment is needed to ensure adequate quality, why does the Standard not specify the minimum acceptable level of treatment and leave the discussion of higher levels of treatment, which are unlikely ever to be used, to guidance (i.e., in the Appendix)?
The Standard requires that the problem to be addressed by the experts be specified in advance. Why is it not required that the experts be allowed to modify the problem? This is allowed in the nonmandatory guidance in Appendix A and would seem to be wise since the experts are very likely to know more about the issue than the PRA team.
The Standard requires that the degree of importance of the issue be determined, but provides no quantitative indication of the measure of importance. How can this be omitted if the goal is to have a PRA of adequate quality? The nonmandatory guidance provides some qualitative indications of importance that are sufficiently vague to ensure that all issues can be relegated either to the lowest or to the highest category of importance. Is it not possible to provide a specification of the measure of importance of an issue?
The Standard requires also that the complexity of the issue be determined. Here even the nonmandatory guidance is of no help. In the nonmandatory guidance, levels of complexity are described. In some cases these levels are described as "... levels of complexity of the issue under consideration..." (p.103-A-3.5.1[2.2]). But elsewhere these are described as "... levels of complexity in the use of experts..." (p.101-A) and it is apparent that this is the real meaning of the terms. What is the meaning of the "level of complexity of the issue" as specified in Paragraph 3.5.1(b)? What is the measure of complexity to be used?
Paragraph 3.5.3: The decision to use outside experts rather than relying on the collective wisdom of the PRA analysis team would seem to be a step in the direction of the quality of the PRA that may not be needed. The decision to do this is left completely to the judgment of the team. Surely, it must be known that there are issues that can be resolved properly for the purposes of producing a PRA of adequate quality only by using outside experts. Why are the characteristics of these issues not described?
Paragraph 3.5.4: A crucial step in the formulation of the expert judgment for the PRA is the aggregation of the various expert judgments. No requirements for this step are provided. How is this absence of any specification for such a crucial step consistent with the goal of having a PRA that has adequate quality?
Subparagraphs 22.214.171.124 and 126.96.36.199: Regarding Levels A, B, C, and D, there is no indication in the Standard of what these Levels are. The nonmandatory guidance provides some idea of what they are for those who choose to follow this guidance. What are the meanings of Levels A, B, C, and D for those who elect not to follow the nonmandatory guidance? People familiar with the formulation of standards should be added to the group preparing this Standard. Similar flaws arise throughout the discussion in these Subparagraphs. What are four levels of consensus? If the guidance in Appendix A is to be followed, the Standard should require it. Otherwise, revise the Standard so that it stands alone.
Why are requirements for documentation of the expert judgment process not mentioned by reference in Subsection 3.5?
The CNRM provides a listing of specific documentation requirements for a PRA that reflects, one-for-one, the listing of Risk Assessment Technical Requirements provided in Section 3. Although this listing is redundant, a concise listing of these documentation requirements would be helpful in avoiding diverse assessments of the Section 3 requirements. A careful review of Section 4 should follow the rewrite of Section 3. Also, where documentation requirements are stated in Section 4, a more specific statement of the kind of assessments necessary to satisfy these requirements should be useful, e.g., in the evaluation of the consequences of a residual heat removal system train failure, an adequate thermal-hydraulics analysis of system response is needed.
6.2 Review Team Personnel Qualifications
Define or describe the requirements for "indoctrination on the PRA process."
How were the various experience requirements established, e.g., "The team, collectively, shall have 15 years of experience in performing the activities related to the technical elements of the nuclear power plant PRA identified in Section 3 of this Standard"?
The last paragraph is a documentation requirement, which may not belong in Subsection 6.2.
6.5 Review of Technical Elements
Consider a generic approach to defining when detailed or limited review is required. Consider reducing the redundancy of review guidance.
7.6 Determination that the Scope and Level of Detail of the Standard Are Sufficient for the Application
We are perplexed by the suggestion in Subsection 7.5 that the users determine whether the Standard is sufficient. Subsection 7.5 should be expanded to provide detailed guidance on determining that the Standard is not sufficient to support a particular application and on why alternative methods are needed. Also, a new section should be added to provide guidance on how users may recommend improvements to the Standard and on how ASME will maintain and update the Standard.