OIG/98E-12 - REVIEW OF NRC'S DEVELOPMENT OF CORE RESEARCH CAPABILITIES
August 5, 1998
MEMORANDUM TO: L. Joseph Callan
               Executive Director for Operations
FROM: Thomas J. Barchi
      Assistant Inspector General for Audits
SUBJECT: REVIEW OF NRC'S DEVELOPMENT OF CORE RESEARCH CAPABILITIES
Attached is the Office of the Inspector General's special evaluation report entitled "Review of NRC's Development of Core Research Capabilities." On July 30, 1998, we briefed senior officials in the Office of Nuclear Regulatory Research on the results of our special evaluation. This report does not contain recommendations; therefore, we did not solicit agency comments and will not track staff actions to address our observations.
Please contact me at 415-5915 if we can assist you further in this matter.
Attachment: As stated
- Report Synopsis
- Evaluation Results
- Identification of Core Areas Not Based on Pre-Approved Criteria
- Value of Weighted Criteria and Metrics Not Used
- Use of Criteria for Sunsetting Unknown
- Use and Usefulness of Core Research Capability Efforts Not Identified
As a result of the U.S. Nuclear Regulatory Commission's (NRC) rebaselining and direction-setting efforts in 1995 and 1996, the Commission tasked the Office of Nuclear Regulatory Research (RES) with developing a set of core research capabilities. It was the Commission's belief that NRC's research program should focus on areas with the highest safety and regulatory significance while maintaining the necessary technical capability. NRC's research efforts were envisioned to reflect a reduced program augmented by other resources that would be available as needed. The objectives of our evaluation were to assess RES' progress in developing core research capabilities and determine what uses are being made of this effort.
Based on our evaluation, we found that the agency preselected its core research areas and did not use Commission-approved criteria as intended. As a result, objective evaluations were not used to select core areas, the selection was so broad that it included all research areas, and this process, left unchecked, could continue in subsequent core reviews.
Our evaluation also found there was limited value in core capabilities criteria and associated metrics unless they were properly weighted. In a related matter, it was unclear to us how or if RES would use the criteria to determine when sunsetting conditions are reached and projects closed out.
Finally, we found that the NRC staff did not know how the core research capability efforts will be used by the Commission. The value of further core research capability analysis, as well as a broader core capability study for other offices, may be less than expected unless specific objectives as to the ultimate use are established.
Interest in core research capabilities began as part of the U.S. Nuclear Regulatory Commission's (NRC) Strategic Assessment and Rebaselining Project under Direction Setting Issue (DSI) 22, "Research," dated September 16, 1996. A subsumed issue was how, where, and in what areas of research should NRC maintain core capability. At that time, "capability" included both technical expertise and associated experimental facilities. "Core" meant a maintenance program consisting of the most critical expertise that NRC needed to have available to support its licensing and regulatory functions.
The Office of the Inspector General (OIG) performed this special evaluation to draw management's attention to current NRC efforts to develop core research capabilities. The objectives of our evaluation were to assess NRC's progress in identifying core research capabilities and determine what use will be made of this effort. (See Appendix I for more details on our objectives, scope, and methodology.) In light of a Commission meeting on core research capabilities scheduled for this month, we believe it is important to share our observations now on ways to strengthen the process.
In March 1997, the Commission tasked the Office of Nuclear Regulatory Research (RES) with developing a set of core research capabilities, starting with criteria to be approved in advance by the Commission. At that time, it was the Commission's belief that NRC's research program should focus on areas with the highest safety and regulatory significance.
In June 1997, the Commission approved RES' methodology and 14 criteria for evaluating core research capabilities as contained in SECY-97-075.(1) RES provided the final results of its core research capabilities analysis in April 1998, under SECY-98-076.(2) Since that time, RES has held several meetings with the Advisory Committee on Reactor Safeguards (ACRS). Also, ACRS provided results of its review of SECY-98-076 to the Commission in June 1998.
Despite substantial RES efforts to develop core research capabilities, we found several shortcomings in the process. In evaluating the need for core research capabilities, RES chose core areas judgmentally rather than by applying the 14 criteria approved by the Commission. Moreover, the criteria and their related metrics, although not used to identify core areas, were not properly weighted to be of potential future benefit. Also, it was unclear how, or even whether, the criteria would be used to determine when sunsetting conditions are reached and projects closed out.
Finally, we found that RES staff did not know how the core research capability efforts would be used by the Commission. The usefulness of further core research capability analysis, as well as a broader core capability study for other offices, may be less than expected unless specific objectives as to the ultimate use are established. Issues resulting from our work are discussed in detail below.
Identification of Core Areas Not Based on Pre-Approved Criteria
Based on our review of SECY-98-076 and interviews with RES managers, we found that core research areas were not chosen by using the 14 criteria previously approved by the Commission. Rather, the core areas were chosen based on various input from outside sources, RES, and other NRC offices. Assessments using the 14 criteria were used to describe and confirm the core areas after the identification.
In SECY-97-075, NRC staff sought the Commission's approval of a methodology and criteria for evaluating the need for core research capabilities. SECY-97-075 contained 14 criteria for assessing the need for core capabilities as well as metrics within each criterion for rating the responses (high, medium, and low). In the staff requirements memorandum (SRM) that followed, the Commission approved the methodology and criteria and added the following: "The staff should ensure that the areas of research identified for assessment include those areas that are essential for the support of current and foreseeable future regulatory activities."
NRC staff completed its core capabilities efforts as reported in SECY-98-076 by including assessments (using the 14 criteria) on 29 areas. All research areas were considered "core" in the SECY. When asked at what point the assessments and supporting ratings would indicate that an area would not qualify as "core," former RES deputy division directors (who were heavily involved in the process) indicated that the assessments were not used for that purpose. One former deputy division director said that the assessments were to inform, assess work that is ongoing, and answer the question of why RES is conducting that research. SECY-98-076 itself states that the "29 areas represent the current collective views of RES and NRC user offices."
During a later meeting with RES' acting director, his staff noted that the assessments were used to confirm the core areas that were previously identified based on a variety of stakeholder and RES input. The staff also indicated that whether RES assessed all research areas using the criteria or (as in this case) core areas were first agreed upon and then confirmed using the criteria, the result would be the same.
Value of Weighted Criteria and Metrics Not Used
Although RES did not use the 14 criteria and the related "high, medium, and low" metrics to identify core areas, the criteria and ratings could still be of value in the future if properly weighted. An example shows that some of the 14 criteria are more important than others and perhaps should have been weighted for core identification purposes. As shown in Appendix II, support areas #1 and #2 address the technical bases for agency decisions on regulatory or safety issues. The four criteria under these two support areas would seem to be more in line with the Commission's guidance to focus on programs with the highest safety and regulatory significance. As such, weighting these criteria should be of benefit for core identification purposes.
Another example illustrates the relative importance of criteria and the need for weighting. Five criteria fall under two support areas: "improve the technical bases of regulation through involvement in research with domestic and foreign organizations" (#5) and "respond to oversight groups" (#6).(3) In response to the Commission's SRM on SECY-97-075, NRC staff disclosed in SECY-98-076 that they had not weighted these areas in terms of importance, but that if they had, and only these two areas were affected by contributions, then "there likely would not be a need for a core capability in that area of research."
On the importance of the metrics or ratings, RES had no scale to indicate the significance of high, medium, low, or none ratings or the various combinations within the 29 areas that were assessed. In SECY-97-075, NRC staff indicated that "low and none ratings [using the 14 criteria in the assessment form] would indicate the need for either a small core or perhaps that no core capability is needed." Under those guidelines, at least 3 of the 29 research areas may not have qualified as core.(4) Our analysis also showed that criteria in nine "core" areas were rated relatively high yet those core areas were designated as "sunsetted areas"(5) in relation to the fiscal year (FY) 1998 budget. High ratings in a core area would appear to contradict the designation of that area as a sunsetted area.
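The weighting and rating concerns above can be made concrete with a small sketch. The following is purely a hypothetical illustration of how weighted criteria and high/medium/low/none metrics could yield an objective core/non-core determination; the criterion names, weights, ratings, and threshold are our own illustrative assumptions and are not drawn from SECY-97-075 or SECY-98-076.

```python
# Hypothetical sketch of a weighted criteria scoring scheme.
# All weights, ratings, and the cutoff below are illustrative only.

RATING_VALUES = {"high": 3, "medium": 2, "low": 1, "none": 0}

def core_score(ratings, weights):
    """Weighted average of criterion ratings, on a 0-3 scale."""
    total_weight = sum(weights[c] for c in ratings)
    weighted = sum(RATING_VALUES[r] * weights[c] for c, r in ratings.items())
    return weighted / total_weight

# Example: safety-significance criteria (support areas 1 and 2) weighted
# more heavily than leverage/oversight criteria (support areas 5 and 6).
weights = {"safety_significance": 3.0, "likelihood_of_change": 3.0,
           "leverage_factor": 1.0, "oversight_response": 1.0}
ratings = {"safety_significance": "high", "likelihood_of_change": "medium",
           "leverage_factor": "low", "oversight_response": "none"}

CORE_THRESHOLD = 1.5  # illustrative cutoff between "core" and "not core"
score = core_score(ratings, weights)
print(f"score={score:.2f}, core={score >= CORE_THRESHOLD}")  # score=2.00, core=True
```

Under a scheme of this kind, a research area rated "low" and "none" on heavily weighted criteria would fall below the cutoff and drop out of the core set, which is the kind of objective screen the Commission's approved criteria appeared to contemplate.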
Use of Criteria for Sunsetting Unknown
In SECY-97-075, RES indicated that the approved criteria would not only be used to evaluate the need for core research capabilities, but they would also be used to determine when sunsetting (closure) would be reached. In SECY-98-076, however, RES did not equate sunsetting with closure, making it unclear how or even if the criteria would be used to determine when sunsetting conditions are reached and projects closed out.
Although the term "sunsetting" was not used in DSI 22, it was used in DSI 20, "International Activities." In the "Commission's Preliminary Views" section of DSI 20, there was guidance that "we need to examine individual international activities with respect to budget and priority to provide the basis for an orderly reduction and/or sunsetting of certain activities to meet expected future constraints on the program." The Commission also thought NRC staff should "identify areas where efficiencies can be considered and develop criteria for sunsetting certain activities." Our interpretation that sunsetting meant closure was confirmed in discussions with staff from two Commissioner offices.
In SECY-97-075, RES used closure interchangeably with sunsetting. As an example, RES referred to other taskings from the Commission/Chairman to include "sunsetting criteria that can be applied to determine when a research program should be brought to closure." In addition to situations in which NRC decides to not fund any more work, however, RES believed that sunsetting could describe areas in which NRC may be "conducting a small amount of research to sustain that expertise."
In SECY-98-076, RES defined sunsetting as "any area of research with a current budget that is at or below the expertise driven resource level." RES identified 10 (of 29) core areas that met this definition using the FY 1998 budget as a baseline. RES explained that "the level of resources in these [sunset] areas are driven by the need to maintain expertise, not the need to satisfy regulatory requirements." But RES did not describe sunsetting as closing out work. And in only one area (Hydrogen Distribution and Combustion) did it appear that NRC was conducting a small amount of research to fit RES' earlier definition of a sunset state.
RES' definition of sunsetting in SECY-98-076 prompted one ACRS member to ask about it in a June 1, 1998, subcommittee meeting with RES staff on SECY-98-076. The member suggested that RES redefine "sunsetting" to be more in line with the Commission's intent of the word.
Further complicating this issue is the fact that sunsetted areas for one budget year may not be sunsetted for the purpose of a future budget year. For example, while RES identified 10 research areas where the FY 1998 budget levels were at or below the core levels, our analysis containing the FY 1999 budget levels showed nine research areas meeting the definition of sunset. Of the 10 areas sunset in the FY 1998 schedule, five areas were no longer sunset in the FY 1999 schedule.
In its March 8, 1993, audit report on "Performance Criteria and Better Management Oversight Needed to Enhance NRC's Research Program Contributions" (OIG/92A-11), OIG found that neither NRC, RES, nor NRC's external advisory groups had developed criteria to determine when sufficient research has been conducted. While one recommendation in the report included establishing criteria, a related recommendation included terminating research when it does not meet the criteria. SECY-97-075 indicated that some progress was being made in establishing criteria for sunsetting. It contained the statement that "future assessments of core capabilities will provide a best judgment as to when the sunsetting point will be reached. This will be accomplished by annually applying the fourteen criteria ...." SECY-98-076, however, included a definition of sunsetting that did not address closure. Therefore, it is unclear how or if the criteria would actually be used to determine when real sunsetting points are reached.
Use and Usefulness of Core Research Capability Efforts Not Identified
NRC invested significant time and resources in the pursuit of core research capabilities; however, RES staff did not know how the core research efforts will eventually be used by the Commission. Clarifying the intended use and usefulness of these efforts could help guide the staff in future analyses as well as in the broader core capability study for other offices.
The stated purposes of SECY-98-076 are to: (1) provide the Commission with the results of the expertise-driven part of the core capability assessments performed by RES; (2) obtain the Commission's approval of these assessments; and (3) seek Commission endorsement of staff's plan for related follow-on activities.(6) Additionally, the recommendation of SECY-98-076 is for the Commission to note that the expertise-driven core research capabilities provided in attached assessment forms would be considered as part of the Agency-wide effort to perform core capability assessments as outlined in SECY-98-037.(7) Based on our review of these SECY sections, the ultimate use of RES' core research capability efforts was not obvious to us. Nor was it identified during OIG discussions with the Deputy Executive Director for Regulatory Effectiveness, two former RES deputy division directors, one RES deputy division director, and the acting RES director.
Several of the staff we spoke with thought the exercise had budget implications. But none knew how the information would ultimately be used. ACRS, in its June 16, 1998, letter on "Review of SECY-98-076, Core Research Capabilities," referred to an "exercise that could have the ancillary benefit of providing the rationale and justification for maintaining a viable and robust research component within NRC." ACRS later referred to the budgetary process and stated that differentiation among the selected core research capabilities is essential when prioritizing resources. But follow-up with ACRS staff indicated that the Committee was unaware of the ultimate purpose of SECY-98-076.
While cost estimates of pursuing core research capabilities were not available, we believe they were substantial. RES managers began developing criteria for core research capabilities in early calendar year 1997. At least seven RES branch chiefs, three division directors, three deputy division directors, the office director, and administrative support staff were likely involved with developing SECY-98-076 from June 1997 to April 1998. This does not include staff from six NRC offices who were requested to provide comments on SECY-97-075 and SECY-98-076 and three meetings involving RES and ACRS members.
We do not agree with RES' approach or rationale for identifying core research areas. It preselected its core areas and did not use Commission-approved criteria as intended. As a result, core area selection was not based on objective evaluations. Furthermore, the selection was so broad that it included all research areas, and, more importantly, this process could continue during future annual reviews.(8)
Our evaluation also found limited value in core research capabilities criteria and associated metrics without proper weighting. In a related matter, it was unclear to us how or if RES would use the criteria to determine when sunsetting conditions are reached and projects closed out. We believe this is a very important by-product of the core capabilities efforts.
Finally, given the agency's investment in determining core research capabilities, it would seem to us that practical applications of this exercise would be readily known. However, we found this not to be the case. As a consequence, the usefulness of further core research capability analysis may be less than expected unless specific objectives as to the ultimate use are established.
If RES staff pursue core research capabilities in 1998 and thereafter, improvements are needed in the process. We believe core capability assessments should be applied to all possible research areas and should use properly weighted criteria and rating factors to determine what is core and what is not core. "Sunsetting" should be clearly defined consistent with previous Commission intent, and the criteria developed for core research capability purposes should be used to identify areas for closure. Last, practical uses of the core research capability efforts should be sought to help guide the process and determine if they are worthwhile.
As referenced earlier, there is a broader core capability effort planned involving the Office of Nuclear Reactor Regulation, the Office of Nuclear Material Safety and Safeguards, and the Office for Analysis and Evaluation of Operational Data. If other NRC offices follow an approach similar to RES', clear objectives and goals are needed that support the use and usefulness of this exercise, to make the most efficient use of resources. Furthermore, if a comparable process will be employed, it should have clear, properly weighted criteria to ensure that core capabilities are objectively identified.
I. Objectives, Scope, and Methodology
The objectives of our evaluation were to assess the U.S. Nuclear Regulatory Commission's progress in identifying core research capabilities and determine what use will be made of this effort. We reviewed various documents associated with the core research capabilities effort and interviewed cognizant personnel from the Office of Nuclear Regulatory Research, the Office of the Chief Financial Officer, the Office of Human Resources, the Office of the Executive Director for Operations, the Office of the Commissioners, and the Advisory Committee on Reactor Safeguards.
To gain the nuclear industry's perspective on core research capabilities, we held a meeting with a representative from the Electric Power Research Institute. Our evaluation was conducted from April through July 1998.
REGULATORY FUNCTIONS THAT MAY NEED LONG TERM CORE SUPPORT AND CRITERIA THAT INFLUENCE THE CORE MAKEUP
(Criteria in SECY-97-075 Approved by the Commission)
SUPPORT AREAS AND CRITERIA FOR ASSESSMENT

1. Provide the technical bases for agency decisions on regulatory or safety issues stemming from power plant operations, events, materials uses, or license amendment requests
   - 1. Frequency of occurrence
   - 2. Safety or regulatory significance, if they occur

2. Provide the technical bases for agency decisions on regulatory or safety issues (including the resolution of generic safety issues) stemming from new or evolving technologies and/or research results
   - 3. Likelihood of change
   - 4. Safety or regulatory significance, if they occur

3. Develop, maintain, and apply analytical tools/databases--maintain institutional technical knowledge base
   - 5. Breadth and frequency of application of tools/data
   - 6. Degree of improvement needed in tools/data
   - 7. Value of tools/data/knowledge base to the regulatory process

4. Provide the technical bases for improvements to the regulatory framework (i.e., regulations, regulatory guides, codes and standards, new initiatives)
   - 8. Need to improve requirements and/or guidance
   - 9. Need to support new NRC regulatory initiative and/or approach

5. Improve the technical bases of regulation through involvement in research with domestic and foreign organizations
   - 10. NRC's commitment
   - 11. Value of contribution to regulatory program
   - 12. Leverage factor for NRC resources

6. Respond to oversight groups (Commission, Congress, public, Advisory Committee on Reactor Safeguards, Advisory Committee on Nuclear Waste, Nuclear Safety Research Review Committee)
   - 13. Likelihood of occurrence
   - 14. Complexity and significance of subject matter
II. Major Contributors to this Report
William D. McDowell
George T. Pourchot
David K. Horn
III. Glossary: Office of the Inspector General Products
1. INVESTIGATIVE REPORT - WHITE COVER
An Investigative Report documents pertinent facts of a case and describes available evidence relevant to allegations against individuals, including aspects of an allegation not substantiated. Investigative reports do not recommend disciplinary action against individual employees. Investigative reports are sensitive documents and contain information subject to the Privacy Act restrictions. Reports are given to officials and managers who have a need to know in order to properly determine whether administrative action is warranted. The agency is expected to advise the OIG within 90 days of receiving the investigative report as to what disciplinary or other action has been taken in response to investigative report findings.
2. EVENT INQUIRY - GREEN COVER
The Event Inquiry is an investigative product that documents the examination of events or agency actions that do not focus specifically on individual misconduct. These reports identify institutional weaknesses that led to or allowed a problem to occur. The agency is requested to advise the OIG of managerial initiatives taken in response to issues identified in these reports, but tracking of recommendations is not required.
3. MANAGEMENT IMPLICATIONS REPORT (MIR) - MEMORANDUM
MIRs provide a "ROOT CAUSE" analysis sufficient for managers to facilitate correction of problems and to avoid similar issues in the future. Agency tracking of recommendations is not required.
4. AUDIT REPORT - BLUE COVER
An Audit Report is the documentation of the review, recommendations, and findings resulting from an objective assessment of a program, function, or activity. Audits follow a defined procedure that allows for agency review and comment on draft audit reports. The audit results are also reported in the OIG's "Semiannual Report" to the Congress. Tracking of audit report recommendations and agency response is required.
5. SPECIAL EVALUATION REPORT - BURGUNDY COVER
A Special Evaluation Report documents the results of short-term, limited assessments. It provides an initial, quick response to a question or issue, and data to determine whether an in-depth independent audit should be planned. Agency tracking of recommendations is not required.
6. REGULATORY COMMENTARY - BROWN COVER
Regulatory Commentary is the review of existing and proposed legislation, regulations, and policies to assist the agency in preventing and detecting fraud, waste, and abuse in programs and operations. Commentaries cite the IG Act as authority for the review; state the specific law, regulation, or policy examined; note pertinent background information considered; and identify OIG concerns, observations, and objections. Significant observations regarding action or inaction by the agency are reported in the OIG Semiannual Report to Congress. Each report indicates whether a response is required.
1. SECY-97-075, Methodology and Criteria for Evaluating Core Research Capabilities, dated April 2, 1997.
2. SECY-98-076, Core Research Capabilities, dated April 9, 1998.
3. See Appendix II for these details.
4. Radiation Dosimetry, Radiation Health Effects, and Hydrogen Distribution & Combustion Research.
5. Sunsetting will be discussed in the next section of this report.
6. These follow-on activities included an internal budget review for FY 2000, prioritization of research program components, and exploration of potential future research programs.
7. SECY-98-037, Core Capabilities, dated March 4, 1998.
8. The plan to annually "revisit" core capabilities is discussed in SECY-97-075.