Sensitivity and Uncertainty Analyses Applied to Criticality Safety Validation: Methods Development (NUREG/CR-6655, Volume 1)

Publication Information

Manuscript Completed: October 1999
Date Published: November 1999

Prepared by:
B.L. Broadhead, C.M. Hopper, R.L. Childs, C.V. Parks

Oak Ridge National Laboratory
Managed by Lockheed Martin Energy Research Corporation
Oak Ridge, TN 37831-6370

C.W. Nilsen, NRC Project Manager

Prepared for:
Division of Systems Analysis and Regulatory Effectiveness
Office of Nuclear Regulatory Research
U.S. Nuclear Regulatory Commission
Washington, DC 20555-0001

NRC Job Code W6479

Abstract

This report presents the application of sensitivity and uncertainty (S/U) analysis methodologies to the code/data validation tasks of a criticality safety computational study. Sensitivity and uncertainty analysis methods were first developed for application to fast reactor studies in the 1970s. This work has revitalized and updated the available S/U computational capabilities so that they can be used as prototypic modules of the SCALE code system, which contains criticality analysis tools currently used by criticality safety practitioners. Once development is complete, simplified versions of these tools are expected to be released for general use.

The S/U methods presented in this volume are designed to provide a formal means of establishing the range (or area) of applicability for criticality safety data validation studies. The key to the technique is the development of parameters that are analogous to the standard trending parameters. These are the D parameters, which represent the group-wise differences between sensitivity profiles, and the ck parameters, which are the correlation coefficients for the calculational uncertainties between systems. Each set of parameters quantifies the similarity between pairs of selected systems, e.g., a critical experiment and a specific real-world system (the application).
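
The sketch below is a minimal illustration, in Python, of how parameters of this kind can be formed from group-wise sensitivity profiles and a cross-section covariance matrix. The specific formulas (a summed absolute difference for D, and a covariance-weighted correlation for ck) are plausible realizations inferred from the description above, not the report's authoritative definitions, which are given in the body of this volume.

```python
import numpy as np

def d_parameter(s_app, s_exp):
    """Group-wise difference of two sensitivity profiles.

    s_app, s_exp: 1-D arrays of sensitivity coefficients (dk/k per dsigma/sigma)
    by energy group for the application and the experiment.
    Returns the summed absolute group-wise difference (one plausible
    realization of a D parameter).
    """
    return np.sum(np.abs(np.asarray(s_app) - np.asarray(s_exp)))

def c_k(s_app, s_exp, cov):
    """Correlation coefficient of the data-induced k-eff uncertainties.

    s_app, s_exp: full sensitivity vectors (all nuclide/reaction/group
    entries) for the two systems; cov: the corresponding cross-section
    covariance matrix. Returns shared uncertainty divided by the product
    of the two standard deviations.
    """
    var_a = s_app @ cov @ s_app   # uncertainty in k-eff of the application
    var_e = s_exp @ cov @ s_exp   # uncertainty in k-eff of the experiment
    shared = s_app @ cov @ s_exp  # shared (covariant) uncertainty
    return shared / np.sqrt(var_a * var_e)
```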

The use of a generalized linear-least-squares methodology (GLLSM) tool is also described in this report. The GLLSM tool is applied in this work largely to provide a preliminary understanding of the magnitude of the D and ck parameters and of the number of experiments needed to rigorously define applicability and properly estimate the bias and uncertainty due to data. This work has determined that ck values of 0.80 and higher, or D values of 0.40 and lower, identify systems that are similar enough to be useful in determining the bias and associated uncertainty for interpolation and extrapolation scenarios. Initial analyses also indicate that, for the bias and associated uncertainty estimates to be meaningful, about five or more very highly correlated systems (ck of 0.90 or higher), or about 10 or more moderately correlated systems (ck of 0.80 or higher), should be included in the validation exercise.
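
As a purely hypothetical illustration of applying the similarity guideline quoted above, a benchmark suite could be screened against the ck thresholds as follows; the experiment names and ck values are invented for the example.

```python
# Illustrative ck values for a hypothetical benchmark suite.
benchmarks = {"exp-1": 0.93, "exp-2": 0.91, "exp-3": 0.86, "exp-4": 0.78}

# Guideline thresholds from the abstract: ck >= 0.90 (very highly
# correlated) and ck >= 0.80 (moderately correlated).
highly = [name for name, ck in benchmarks.items() if ck >= 0.90]
moderately = [name for name, ck in benchmarks.items() if ck >= 0.80]

print(f"very highly correlated (ck >= 0.90): {len(highly)} -> {highly}")
print(f"moderately correlated (ck >= 0.80): {len(moderately)} -> {moderately}")
```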

These methods and guidelines will be applied in Volume 2 of this document to a sample validation of uranium systems with enrichments greater than 5 wt %. The sample validation will compare the newly proposed methods with more traditional procedures, and a side-by-side comparison of the results and procedures, along with guidance on their use, is also presented.
