ACRS/ACNW Joint Subcommittee - May 4, 2000
UNITED STATES OF AMERICA
NUCLEAR REGULATORY COMMISSION

MEETING: ACRS/ACNW JOINT SUBCOMMITTEE

U.S. NRC, Two White Flint North, Room T2-B3, 11545 Rockville Pike, Rockville, MD

Thursday, May 4, 2000

The committee met, pursuant to notice, at 8:30 a.m.

MEMBERS PRESENT:
JOHN GARRICK, ACNW, Co-Chairman
THOMAS KRESS, ACRS, Co-Chairman
GEORGE APOSTOLAKIS, ACRS, Member
GEORGE HORNBERGER, ACNW, Member

C O N T E N T S
Introduction
Overview: Risk-Informing NMSS Activities
Status of SECY-99-100 Efforts: Training, Workshop, Criteria, Safety Goals
A Process for Risk-Informed Regulation of Activities
PRA for Dry Cask Storage
Nuclear Byproduct Materials Risk Review
Matrix Summary of Risk Assessment Results for Byproduct Materials Activities
Proposed Rule for Domestic Licensing of Special Nuclear Material

P R O C E E D I N G S

[8:30 a.m.]

DR. GARRICK: Good morning. Our meeting will now come to order. This is a meeting of the joint subcommittee of the Advisory Committee on Reactor Safeguards and the Advisory Committee on Nuclear Waste. I'm John Garrick, co-chairman of the joint subcommittee. On my right is Tom Kress, also co-chairman of the committee. Joint subcommittee members in attendance are George Apostolakis of the Advisory Committee on Reactor Safeguards and George Hornberger of the Advisory Committee on Nuclear Waste.

The purpose of this meeting is for the joint subcommittee to discuss the development of risk-informed regulation in the Office of Nuclear Material Safety and Safeguards, including risk-informing fuel cycle programs, integrated safety assessments, byproduct material risk analysis, dry cask storage risk analysis, the results of a public workshop on the use of risk information in regulating the use of nuclear materials, and related matters. Sounds like a busy day. The subcommittee will gather information; analyze relevant issues and facts; and formulate some positions and actions as appropriate for deliberation by the full committees. Richard Major is the designated Federal official for the initial portion of this meeting.

The rules for participation in today's meeting have been announced as part of the notice previously published in the Federal Register. A transcript of the meeting is being kept, and it's requested that speakers identify themselves and speak with clarity and volume so that they can be heard. We've received no written comments from members of the public, but we have received one request from Robert Bernero for time to make an oral statement, and right now, it's hopeful that we can fit that in right after our second presentation and perhaps just before or just after the break.

Our first speaker will be Marty Virgilio, deputy director of the Office of Nuclear Materials Safety and Safeguards, and unless there are some comments or questions from the members, I think we'll proceed, Marty, and let you take the floor.

DR. VIRGILIO: Thank you; good morning. I am Marty Virgilio, for the record, deputy director of our Office of Nuclear Materials Safety and Safeguards, and I'd like to start with a thank you to the committee members for taking the time to meet with us today. I look forward to a productive exchange of ideas. With me today, I have John Flack, who will be speaking to you next. You may know John from his responsibilities in the Office of Research, but John has been working with us for the last several months heading up this risk group that we have formed in NMSS.
Also today, you'll have a chance to meet some of the members of the risk group. You may have met some of them before in their other responsibilities. Stacy Rosenberg will be with us today. Stacy has been a risk expert in the Office of NRR, and I believe that at one point in time she may have briefed you on a risk assessment that she was responsible for on Keowee Dam, where we were looking at the safety requirements necessary to ensure protection for the Oconee Nuclear Power Station. Stacy was responsible for that effort. We also have Dennis Damon with us today. Dennis is going to be talking to us a little bit later about fuel cycle facility safety and the ISAs that are being performed or being required there as a part of our new Part 70. Jim Smith is here with us today. Jim is also a member of the group. He's our medical and industrial link. And Christiana Lew, whom I know you've met and had interactions with over the high-level waste program. We also have Alan Rubin with us today. Alan is from the Office of Research. He will talk to us today about the PRA that Research is doing to help us in understanding some of the issues surrounding high-level waste storage. And Betsy Ulrich will be coming down from Region I. Betsy is not here yet but will be making a presentation to you later this morning or early this afternoon on a material risk study that we had done.

What we have in store for you all today is an exchange on the overview of our program and the status of our current activities, and then we'll have the individual presentations on the cask storage, on Part 70, the fuel cycle facility requirements and the ISAs, and also on the byproduct materials risk review study.

If I can have the first slide, please. Next slide, please. Just by way of background, and I know you're all familiar with this, but I thought, just again for the record and maybe the audience, it was worth going through the chronology. We started, I think, in earnest looking at risk-informing the waste and material activities in 1997 with a Commission paper where we laid out some ideas as to direction and received some feedback from the Commission on our approach. And in March of 1999, we put forward a Commission paper, SECY 99-100, that provided a lot more detail on our proposed approach, areas where we were considering risk-informing our programs and activities. The Commission responded to that SECY paper in June of 1999 with an SRM and provided more direction to us, additional ideas and thoughts about which directions we ought to be proceeding in, and in July of 1999, we established a task force to move this effort forward.

On the next page, if I could have the next slide, please, just going back to SECY 99-100, we had a number of recommendations in that, the first being that we proceed to implement a five-step process, and I've got another slide in a few seconds I'll show you on that five-step process; that we continue to implement our approaches for addressing risk management issues, our ongoing activities; and that the Commission approve the formation of a joint subcommittee that would help us with constructive criticism and peer review of our ongoing activities and ideas for directions in the future.

On the next slide, you can see the SRM, and basically, the Commission accepted the proposal that we laid out, those three ideas I put on the last slide, and in addition provided some additional guidance to us.
They asked us to develop materials safety goals; they asked us to make sure that we were using an enhanced participatory process to develop the goals and include within the goals the avoidance of property damage. They asked us to consider critical groups and whether critical groups could be defined, like we have done in the high-level waste arena, for other activities in assessing risk and managing risk.

DR. APOSTOLAKIS: What is the logic behind avoiding property damage here and not including land contamination for reactors?

DR. VIRGILIO: None that I -- you know, and I -- it's a good question as to where we --

DR. KRESS: I would guess, George, that they're focusing on the Yucca Mountain type issue or -- you'd probably only get property damage as a real consequence in the accidents. That would be my guess.

DR. VIRGILIO: Tom, as we go through and look at some of the material activities that we have and some of the issues that we're dealing with today, we're not only dealing with radiological contamination, but if you think about issues that we're dealing with like Atlas, Moab, we're also dealing now with more environmental issues in the waste and material arena than on the reactor side, and I think this was on the Commission's mind --

DR. KRESS: Yes, I think --

DR. VIRGILIO: -- at the time that they were generating this SRM. It was not only the repository, but I think it was some of these other issues.

DR. KRESS: Right; it's environmental contamination in general.

DR. VIRGILIO: The SDMP sites that we're involved in; there are so many different issues that we're involved in today that include environmental and property considerations that I think it was only logical that they went that way. But then again, you say, well, why aren't we going in that direction for reactors, which I think, you know, is a little outside my scope but might be something to consider as we move forward on that front.

DR. KRESS: I think we ought to do it.

DR. APOSTOLAKIS: Yes, but, I mean, the staff has come back and said that they will not include it, right?

DR. KRESS: They only include it in regulatory analysis.

DR. APOSTOLAKIS: Yes, but not part of the quantitative --

DR. KRESS: They don't have quantitative safety goals.

DR. APOSTOLAKIS: Yes.

DR. KRESS: There are some people I've heard say that it's subsumed within the two goals that they have to some extent.

DR. APOSTOLAKIS: Yes.

DR. KRESS: But I don't believe that.

DR. GARRICK: Okay.

DR. VIRGILIO: Okay; and last, they asked us to ensure that we include the agreement state component in our thought processes; today, we have 31 agreement states, and there are four more eager to join the program and so --

DR. APOSTOLAKIS: On the previous slide, again, I have another question.

DR. VIRGILIO: Sure.

DR. APOSTOLAKIS: What's a critical group?

DR. VIRGILIO: In the context of Yucca Mountain is the best way that I can describe it --

DR. APOSTOLAKIS: Yes.

DR. VIRGILIO: -- by example, is that we are looking at the effects of the plume that could be predicted to leave the repository on a group -- a hypothetical group, a farming community living within the vicinity of the repository, and we're looking at the effects on that group. We're looking at how much will they receive in terms of dose as a result of hypothetical accidents that could occur at Yucca Mountain. You could think about critical groups in terms of transportation, another example that we haven't gone down.
This is my example, but you could think about transporting a cask down the road and possibly having a critical group, or a target for assessing risk to the public, as a family in a car driving alongside that cask or, you know, somebody -- or the folks that work at the truck stop where the truck might stop on its route.

DR. APOSTOLAKIS: Yes.

DR. VIRGILIO: So it's hypothetical groups of members of the public and how they might be exposed to radiation as a result of accidents, upsets or normal activities associated with the program.

DR. APOSTOLAKIS: Now, why don't we use the term in reactors?

DR. KRESS: Critical groups?

DR. APOSTOLAKIS: Yes; I mean, we use the idea.

DR. KRESS: Yes, we use the idea.

DR. APOSTOLAKIS: But not -- is part of this joint subcommittee trying -- part of the purpose of existing here is to harmonize the terminology, perhaps? I mean, what's so different here? I mean, you have to try a little harder, I think, to define the critical group. There was a controversy in that Academy report regarding what the group is because of the huge time scales. But it would be helpful, I think, to start using the same terms.

DR. HORNBERGER: What term is used in reactors?

DR. APOSTOLAKIS: Nothing; we just say individual risk, societal risk.

DR. HORNBERGER: But you have to use an individual.

DR. APOSTOLAKIS: Yes.

DR. HORNBERGER: So it's an individual and not a group.

DR. APOSTOLAKIS: It's not a group at this time.

DR. KRESS: It's a group averaged into --

DR. HORNBERGER: Okay; so, it's the same idea, then.

DR. APOSTOLAKIS: Well, the way it's calculated --

DR. HORNBERGER: It's the individual; it's the sort of average individual in a critical group.

DR. APOSTOLAKIS: Exactly.

DR. HORNBERGER: Yes.

DR. APOSTOLAKIS: Exactly; but it's supposed to be, you know, the community surrounding the reactor, I suppose.

DR. HORNBERGER: Yes.

DR. APOSTOLAKIS: You don't have to make any hypothesis, because you know who they are.

DR. KRESS: Yes; it's the ones living around the --

DR. APOSTOLAKIS: Yes.

DR. HORNBERGER: Right.

DR. APOSTOLAKIS: So either there is no --

DR. HORNBERGER: Well, you still have to make a hypothesis, because the community can grow --

DR. KRESS: Yes.

DR. APOSTOLAKIS: That's right.

DR. HORNBERGER: -- over the lifetime of the reactor.

DR. APOSTOLAKIS: That's right, so --

DR. KRESS: That's supposed to be part of the analysis, protecting that. But you don't have to worry about 10,000 years either.

DR. HORNBERGER: It's a better guess.

DR. APOSTOLAKIS: Yes.

DR. VIRGILIO: Next slide, please. I think we've covered that one. Just to go back to SECY 99-100 for a moment, the Commission endorsed the staff's approach to a five-step process. Those steps really boil down to identifying the candidate applications, where we would want to move forward and risk-inform our programs; deciding how we would then modify our regulatory approaches; changing the approaches; and then implementing the program; and in parallel with that, we would be developing or refining the risk tools that we have available to us through the reactor program and through our own program activities.
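To make the earlier exchange concrete: the "average member of the critical group" measure can be sketched in a few lines. Everything in the sketch below -- the cohort, the per-member doses, and the 25 mrem/yr comparison value -- is hypothetical, chosen only to illustrate the construct, not taken from the meeting.

```python
# Minimal sketch of the "average member of the critical group" measure
# discussed above: the figure of merit is the MEAN annual dose over a small,
# hypothetical, maximally exposed group, not the single worst-off person.
# All numbers are hypothetical, for illustration only.

# Hypothetical annual doses (mrem/yr) for members of a postulated farming
# community downgradient of a facility, summed over all exposure pathways
# (drinking water, crop ingestion, external exposure, inhalation).
member_doses_mrem = [14.2, 9.8, 17.5, 11.0, 12.7, 8.9, 15.3, 10.4]

mean_individual_dose = sum(member_doses_mrem) / len(member_doses_mrem)

# Assumed comparison value: a 25 mrem/yr all-pathways criterion is used here
# purely as an illustrative yardstick.
CRITERION_MREM_PER_YR = 25.0

print(f"Mean dose to average critical-group member: {mean_individual_dose:.1f} mrem/yr")
print(f"Fraction of assumed criterion: {mean_individual_dose / CRITERION_MREM_PER_YR:.2f}")
```

The same averaging idea underlies the "average individual" language used on the reactor side; what differs, as the discussion above notes, is that the materials and waste cases must postulate who the group is.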
If we go to the next slide, after our last interaction, sometime in the November time frame, you wrote a letter to Chairman Meserve and recommended that we do a number of things. We saw the two key things within your recommendations as being that you asked us to develop a set of principles and a safety goal approach for each of the NMSS-regulated activities, and you asked us to identify analytical methods to be applied to implement these approaches on an application-specific basis.

If you go to the next slide, Pat, we wrote you back in January, and we told you that we would develop screening criteria, and here we were being, or trying to be, as responsive as we could to your recommendation on a set of principles; so this is the way we were approaching it, by developing screening criteria for determining what we would move forward with and risk-inform in specific applications. We also said that we would examine experience with risk assessment methods, measures and metrics currently being applied. Here again, we were being responsive to your recommendation that we look at our analytical methods. In addition to responding to your recommendations, we told you that we were going to move forward using the enhanced participatory process, scheduling meetings and workshops with interested parties, and that we would begin to develop our training program.

If you go to the next slide, please, what I want to do is give you a little bit of status on where we are on what we told you we were going to do. What we did is we developed and published in the Federal Register some draft screening criteria for identifying those areas where we should apply risk-informed approaches. We solicited public participation in the development of the screening criteria and safety goals in a workshop we had just a couple weeks ago. We are currently in the process of examining our methods, measures and metrics that we would apply in NMSS to confirm that we are on the appropriate approaches, and today, we'll share with you some of the specific applications and some of the methods that we're using and hopefully get some feedback from you on whether we're on track with regard to those programs. And lastly, I think we've made significant progress in developing our training program.

On the next slide, I just highlight some current activities we have ongoing a little bit outside of the scope of what we've been corresponding with you on. We've made some organizational changes. We've brought the risk group that we had formulated and residing in one of our technical divisions up to the front office, so now, the risk group reports directly to Bill Kane and me, and we've established a steering group, and I think you'll recognize some of the members of the steering group. They include, within NMSS, Don Kuhl and John Grieves and Mike Weber, and then, from outside NMSS, Gary Hollohan; Tom King from Research; Bruce Malik from Region II; and Joe Gray from OGC. In addition to supporting Tom, we also have Joe Murphy, who has been in and out of this process, but I think we've got a very strong steering group now to help ensure that we stay on track.

And today, we'll talk a little bit about some of the activities that we have ongoing, including the ISAs for the fuel cycles; the byproduct material risk analysis; and the PRA for dry cask storage; and I understand that the full committee got a recent briefing on transportation and where we're going on our risk studies there. So we didn't put it on the agenda today, but we could come back to you again in the very near future and give you an update on where we're going on our transportation risk studies.
On the next slide, just highlighting that we're increasing our interactions with the stakeholders. We had a Commission meeting in March on the risk-informed regulatory implementation plan. A subset of that is where we are going in the waste and materials arenas, and we had the public workshop in April that I mentioned earlier. And John Flack is going to go into a lot more detail about what we heard from the stakeholders at the public workshop and some of our analysis of those thoughts.

John will also talk about the three-tiered approach to training that we have, and just so you understand the background, we thought about it in terms of tiers: the first tier being the managers, making sure they had a fundamental understanding of what we were doing; the second tier being all the staff, all the technical staff that we have in the waste and material arenas, including the staff working in the regional offices, making sure they had a basic understanding of what we were doing in these programs; and then the third tier being more advanced training for the people who would actually be employing these risk assessment methods and using risk management techniques, as appropriate, that come out of the analysis. So it's a three-tiered approach, and again, John Flack will explain that in a lot more detail.

Just to highlight and introduce what John will talk about on the next slide is the April workshop. We had participants from other Government agencies. We had representatives from all of the regulated industries, public groups and other interested parties participating, and transcripts are now available. We're having copies of the transcript made so you can see that, and I think it was a very good workshop. Everybody was well-engaged, and we got a lot of good feedback, and John is going to share with you specifically some of the ideas in detail.

The focus of the workshop was basically two-fold. The first part of it was looking at the screening criteria, and we introduced that screening criteria and took comments on it; looked at examples. We actually asked the -- challenged the group and said not only give us comments on the screening criteria but also examples of areas where we should move forward independent of the screening criteria and also pilot applications, and we'll talk a little bit more in detail about that. And we solicited input on development of the safety goals. We had this laid out for a day and a half, and quite truthfully, I thought we were going to spend most of the time talking about the screening criteria, but when I go back and look at the transcript, we spent most of the time talking about the safety goals, which was very productive. I think it was a really good meeting and a lot of good ideas on how to proceed with a process for developing safety goals.

DR. KRESS: When you talk about safety goals in this arena, are you talking about some sort of risk acceptance criteria for individual facilities?

DR. VIRGILIO: At this point, what we're doing is trying to decide how best to attack this, and we had thought about maybe going down seven paths consistent with some of your guidance. We have seven programs within the NMSS waste and material program, and I think we're refining our thoughts on that.
There may be a better approach, maybe bringing that down to five and individually working forward in some way to define safety goals in those five specific areas, and John will get into a lot more detail about this; but not necessarily on a facility level -- starting maybe in one area, on medical, in another area, maybe on the facilities, on the industrial facilities, and trying to define goals in each of those areas and then seeing if we could step back and say is there an overarching goal, you know, that would cover the five or the seven areas; but working from the bottom up, working in areas and building to see if we can get some overarching safety goals.

DR. KRESS: On a more general level, when you say goals, is that something to be strived for or something that has to be met?

DR. VIRGILIO: No, we're thinking in terms of a hierarchy of overarching goals that would then be supported by regulations that would have to be met.

DR. KRESS: Okay.

DR. VIRGILIO: And we're also trying to make sure we have a clear idea in our minds of how these overarching goals fit within the context of the Commission's strategic goals. We've got a -- I don't know if you've had a chance to see the latest strategic goals and performance goals that the Commission is now finalizing, but there has to be a hierarchy, I think, between these goals, the Commission's safety goals and performance goals, and then the regulatory requirements.

DR. KRESS: That disturbs me a little, because I don't see the connection, frankly.

DR. VIRGILIO: And we have to make that connection. We have to do that.

DR. APOSTOLAKIS: But if you define goals, and then, you have regulatory criteria based on those goals that must be met, aren't you implying that the goals are in fact defining adequate protection?

DR. VIRGILIO: I would rather stay with the regulations defining the adequate protection and the goals being an overarching framework.

DR. APOSTOLAKIS: So the regulations, then, will not be derived from the goals.

DR. VIRGILIO: The regulations have to be derived from the goals and consistent with the goals.

DR. APOSTOLAKIS: Yes; consistent, I understand.

DR. VIRGILIO: And any new regulations --

DR. APOSTOLAKIS: You have to be careful here.

DR. VIRGILIO: Yes.

DR. APOSTOLAKIS: You know, the distinction between goals and adequate protection.

DR. VIRGILIO: We have to be abundantly clear in defining that, and right now, I think we've got, you know -- we've got goals; we've got strategic goals; and we've got regulations. And as we move forward in the waste and material arena, it's critical to us that we understand the linkages and relationship between those three components of our framework.

DR. KRESS: Normally, the goals put forth in the strategic plan are sort of a measure of how well NRC does its job of overseeing. Now, we're talking about regulations that deal with the actual design and implementation. It seems to me like those are two separate things, and they don't necessarily have to be related to each other at all.

DR. VIRGILIO: I would say that the goals, you know, are the outcomes that you're trying to achieve: no deaths; you know, no destruction of property; no loss of property. I mean, those are the goals you're trying to achieve, and the way you achieve --

DR. KRESS: When is that applied? Over the next year? The next 5 years, or --

DR. VIRGILIO: Well, the strategic goals are meant to be enduring.
If you think about the Commission's strategic goals, they're meant to be long-lasting, enduring goals.

DR. KRESS: No deaths forever.

DR. VIRGILIO: Right.

DR. KRESS: Well --

DR. VIRGILIO: No deaths. The performance goals that they have in that same book are meant to be 5-year goals. They're meant to be -- let's see how we're going to do for the next 5 years in meeting those overarching, enduring requirements. And then, I see the regulations as being the mechanisms, you know, the requirements that we're going to impose on the regulated communities, to help ensure that we achieve those outcomes.

DR. KRESS: Yes; well, that's what bothers me, because your regulations are sort of one-time things. These goals are going to be re-established year after year after year. Are you going to change the regulations to meet the new goals, or are you going to set up your regulations based on another set of criteria and then worry about the goals when you talk about inspection, operations and other things? You see, that's the connection that bothers me.

DR. APOSTOLAKIS: It's not clear to me at all what the role of the goals ought to be in the regulation, because the regulations, really, are dealing with adequate protection.

DR. KRESS: That's the other thing that bothers me, absolutely, George.

DR. APOSTOLAKIS: I mean, if you go to reactors and take even the 10^-4, which is a subsidiary goal for core damage frequency --

DR. VIRGILIO: Right.

DR. APOSTOLAKIS: -- I don't think you will find any regulation that is derived from that.

DR. KRESS: Except the backfit rule.

DR. APOSTOLAKIS: Well, then, there's one.

DR. KRESS: But, you know, that's a special.

DR. APOSTOLAKIS: And we have plants right now that have CDF above the goal, and they're allowed to operate. So I think that there is a real issue here, the distinction between goals and adequate protection, and there has been reluctance to define adequate protection quantitatively, not only from you -- or I don't even know whether you are objecting to it -- but in the reactor arena, we were told that they would rather stay away from it, because adequate protection is not just a number. It's the result of a whole process, where the numbers are only part of the process. But I think there is a real issue there: how do you interpret these defined criteria and quantities.

DR. GARRICK: But this is an old issue, George.

DR. APOSTOLAKIS: Yes.

DR. GARRICK: I'm looking at SECY 89-102, written in 1990, and it's pretty clear on its distinction between adequate protection and goals. It says the Commission believes that adequate protection is a case-by-case finding --

DR. APOSTOLAKIS: Yes.

DR. GARRICK: -- based on evaluating a plant and site combination and considering the body of our regulations.

DR. APOSTOLAKIS: Right.

DR. GARRICK: Safety goals are to be used in a more generic sense and not to make specific licensing decisions.

DR. APOSTOLAKIS: Yes; but that was in 1989.

DR. GARRICK: Yes.

DR. APOSTOLAKIS: Now, we want to use the goals on a plant-specific basis.

DR. KRESS: To risk-inform the --

DR. APOSTOLAKIS: Which would upset this; I mean, it would change that.

DR. KRESS: Yes; I think that would upset that concept.

DR. APOSTOLAKIS: But the question is how do you use it? If you adjust the goal rather than the definition of adequate protection -- I mean, they're two different things. So you're going to have the same problem here, I think.
DR. VIRGILIO: And this was discussed; you can see in the transcript a number of the stakeholders raised this issue of the relationship between the goals and the regulations, and I think, yes, we have a challenge.

On the next slide, John will go into a lot more detail, but I just wanted to sort of ground you at a fairly high level as to what were some of the recommendations we got from the participants at the workshop, and basically, they had a number of comments with respect to the screening criteria. There was, I think, a very strong consensus to pursue safety goals and to do it as a series, not to try to start with one single goal but to work down parallel paths looking at the groups of activities that we do and see if we could establish goals for individual groups first.

If you go to the next slide, the participants also recommended that we summarize the results --

DR. APOSTOLAKIS: Excuse me again. Let me understand the criteria; I'm sorry.

DR. VIRGILIO: Okay.

DR. APOSTOLAKIS: The criteria will be used how?

DR. VIRGILIO: The criteria that we put out at the workshop --

DR. APOSTOLAKIS: Yes.

DR. VIRGILIO: -- actually, we put a Federal Register notice out first and then discussed it at the workshop -- would be used to identify new areas where we would move forward to risk-inform. That was the purpose of the criteria.

DR. APOSTOLAKIS: Oh, oh, oh, oh, oh. That's different.

DR. VIRGILIO: Why would you go about risk-informing a new activity or an existing activity? And so, we laid out a number of criteria that one would have to meet in order to decide. And we took this from your recommendations on principles.

DR. APOSTOLAKIS: Yes.

DR. VIRGILIO: For how do you go about approaching risk-informing your program? You suggested that we define some principles, and so, instead of calling them principles, we called them screening criteria.

DR. APOSTOLAKIS: So the criteria are not to be used to identify unacceptable risks.

DR. VIRGILIO: No, sir; it was strictly to say that this was an area that was ripe for a risk-informed approach, and so, we would look at it using this screening criteria.

DR. APOSTOLAKIS: Another dream crushed by reality. I thought you were going to define that, which would have been a definition, a semi-definition of adequate protection.

DR. VIRGILIO: Not yet.

DR. APOSTOLAKIS: Not yet; right.

DR. KRESS: They do have what they call quantitative acceptance criteria in this document here, which is a good step in the right direction.

DR. APOSTOLAKIS: Okay.

DR. VIRGILIO: Other participant recommendations were to summarize the results and inform the Commission and hold more workshops. There was a desire on the part of a number of the stakeholders to get out and talk -- and we're talking about regional areas, not necessarily NRC regional areas -- to get some more local input on the development of the goals, local values, local desires. There was a desire that we continue to work with the group that we had, or similar groups, to develop the safety goals through a consensus process, and there was also, I think, a general agreement that we can develop safety goals in parallel with continuing to risk-inform our processes, not that we have to have the goals first.

That pretty much summarizes what I wanted to tell you. If you have any additional questions on my presentation, I'd be happy to take them now.
John Flack is going to provide a lot more detail on the training program and the workshop recommendations, and John will actually talk as well about some of our next steps: where do we go from here? Any questions for me before I turn it over to John?

DR. KRESS: Yes; on your concept of having several groups of activities, each of which would have its own safety goal, do you have some overriding principle that would integrate those and make them all consistent? And what I have in mind there is some sort of cost-benefit. Here's this activity. It has some assessed benefit to society. Therefore, we're willing to accept some cost of having an accident due to that as a result, and maybe that cost versus benefit could be the same number for each of the activities, and since you would have different benefits, you would have different costs, which would lead to different acceptance criteria. Do you have some sort of overriding principle like that that you're trying to use?

DR. VIRGILIO: Not yet, but I think, you know, we want to go to some overarching principle that would, if not provide consistency, provide harmony or at least a logical approach to looking at each of the five or seven areas that we proceed down. I think we have to do that, but we haven't thought it through to the point of on what basis would you establish that overarching view.

DR. KRESS: Well, I think you have to have something, because you can't -- I mean, it's --

DR. VIRGILIO: Yes.

DR. KRESS: -- it's going to be incoherent if you just pick each one out of the air. There needs to be something to tie them together.

DR. VIRGILIO: And we also -- and I know the Commission has given us some guidance in this area, but we also want to look across at the reactor population, too, and make sure that there is some logic to what we're doing vis-a-vis what they're doing in terms of safety goals as well.

DR. KRESS: Well, they don't have this overriding principle there either, unfortunately. It's something we called for, but it doesn't exist. I mean, if you looked at the prompt fatality and the latent fatality and the non-existent land contamination and other societal risks, there's no overriding integrating factor for those.

DR. VIRGILIO: While we believe at this point that starting from the bottom up, working in these areas, would be helpful, we all, I think, agree that that's what we need: an overarching principle that would tie this together, or a framework that would tie this together, so that logically, you could look across the entire scope of our activities and say --

DR. KRESS: But each one of them has a --

DR. VIRGILIO: It fits. It fits within a framework.

DR. GARRICK: Tom, you don't think the qualitative statements that preamble the safety goals are in the category that you're talking about?

DR. KRESS: In a sense, when you say small increase over --

DR. GARRICK: Yes.

DR. KRESS: -- existing risk, that is a type of --

DR. GARRICK: Right.

DR. KRESS: -- of thing, but those two are inconsistent, because what happens is only one of the goals, almost 90 percent of the time, controls, because they're on an inconsistent basis. They're not tied together in a sense.

DR. VIRGILIO: Right.

DR. KRESS: For example, why should the safety of nuclear power be tied to automobile accident deaths? I mean, that doesn't make any sense at all to me, and that's basically what it is: it's tied to the automobile accident deaths. But why? There's no reason for that. So there's no real --

DR. APOSTOLAKIS: These are policy issues, Tom. Why should there be a reason?
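Dr. Kress's harmonization argument, which he elaborates in his reply below, reduces to a simple ratio. A minimal sketch, with entirely hypothetical benefit and fatality figures (none of them from the meeting):

```python
# Dr. Kress's harmonization idea, reduced to arithmetic: society reveals an
# accepted cost-per-benefit ratio in one activity (automobiles), and a
# harmonized goal would allow the same ratio for another (nuclear power).
# All numbers are hypothetical placeholders, not data from the meeting.

auto_deaths_per_year = 40_000      # hypothetical accepted annual death toll
auto_benefit_dollars = 1.0e12      # hypothetical societal benefit ($/yr)

accepted_deaths_per_dollar = auto_deaths_per_year / auto_benefit_dollars

nuclear_benefit_dollars = 5.0e10   # hypothetical societal benefit ($/yr)

# Harmonized acceptance criterion: same deaths-per-dollar-of-benefit ratio.
acceptable_nuclear_deaths = accepted_deaths_per_dollar * nuclear_benefit_dollars

print(f"Accepted ratio: {accepted_deaths_per_dollar:.2e} deaths per $ of benefit")
print(f"Implied acceptable nuclear fatalities: {acceptable_nuclear_deaths:.0f} per year")
```

Whether any such deaths-per-dollar-of-benefit ratio is defensible -- and whether dread and catastrophic potential should weight it -- is the policy question the members turn to next.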
DR. KRESS: Because you have to have these acceptance criteria before you can risk-inform, and you've got to start somewhere.

DR. APOSTOLAKIS: Yes.

DR. KRESS: And if you just pick values out of the air, you're going to end up with an incoherence in the system. You know, it may not be a bad incoherence, because you could look at each group separately, but --

DR. APOSTOLAKIS: But that will be a societal incoherence.

DR. KRESS: It's a societal issue.

DR. APOSTOLAKIS: But not a Nuclear Regulatory Commission incoherence.

DR. KRESS: Oh, yes, but society is not going to come knocking on your door and say here's what we want. The Nuclear Regulatory Commission has to develop these itself somehow. Nobody is going to come and hand them to you.

DR. APOSTOLAKIS: No, but the one-tenth of 1 percent is clearly a policy issue.

DR. KRESS: Sure.

DR. APOSTOLAKIS: And it can be a logical one.

DR. KRESS: These are all policy issues.

DR. APOSTOLAKIS: Yes.

DR. KRESS: But what I'm saying is, there's a need for a coherent basis for such policy issues.

DR. GARRICK: But the one-tenth of 1 percent could be viewed as an interpretation of what is meant by the qualitative overarching --

DR. APOSTOLAKIS: Yes, that's what I'm saying.

DR. GARRICK: The goal that -- yes.

DR. APOSTOLAKIS: I mean, Tom, you asked what is the reason behind it? Well, there is no reason. There is no logic. I mean, this is it. All we want is the risks from these technologies to be lower.

DR. KRESS: But there could be some logic is what I'm saying. That's my problem. And the logic could be: we're willing to accept so many deaths, which means so much money, multiplied by money per death, for automobiles, just because it has certain benefits to society. You can quantify that benefit to some extent. We ought to be able to say what's the benefit of nuclear power to society, and we ought to be able to accept the same costs, that is, the same number of deaths at the same frequency. And that's an overriding principle that just doesn't seem to be evident anywhere in these regulations.

DR. APOSTOLAKIS: Well, but then, you would run into other problems.

DR. KRESS: Of course you've got the problems, but you --

DR. APOSTOLAKIS: The dread element of risk; the catastrophic potential.

DR. KRESS: Of course, you've got to -- we've got to think about those things and factor them into your decision in some way.

DR. HORNBERGER: Your argument is basically one of risk harmonization, in the jargon?

DR. KRESS: Yes; it could be called that, yes.

DR. HORNBERGER: I mean, we would prefer to see NRC regulations risk-informed the same way that EPA regulations or Department of Transportation regulations are, and they'd all be based on so many thousands of dollars per statistical death.

DR. KRESS: Something like that, yes. You know, you need some overriding principle that --

DR. APOSTOLAKIS: That's difficult to defend, yes.

DR. KRESS: Anyway, it's just a thought.

DR. VIRGILIO: John?

DR. FLACK: Okay.

DR. APOSTOLAKIS: No high-tech for you, huh?

[Laughter.]

DR. FLACK: Good morning; I am John Flack, the risk task force leader. The risk group in NMSS is a group that acts as a focal point for all risk-informing activities that are going on in the office, and so, I have headed that group up now for the past 3 months, and today, as Marty mentioned, I plan to cover two areas.
One is the training that is being developed for the office to bring them up to speed in using risk concepts, and the second is to provide you with an overview of the feedback from the public workshop in the two areas: the screening criteria that were sent out for distribution as part of the Federal Register notice and the consideration of potential safety goals for nuclear materials and disposal of nuclear waste.

So moving on to the first topic, which is training: as was mentioned earlier by Marty, we plan to use a three-tier approach to train the office on risk concepts. The first tier is a management and supervisory level training program that is really a roll-up from tier two, which is being developed as a pilot, and we expect to exercise that pilot this fiscal year, and I'll go into the outline of what we're following on that pilot program. And then, there's tier three, which is really targeting those that are specialists in risk, who use risk in their day-to-day activities, and some of these courses would be given in-house, and some would be taken through other agencies, and we're developing a list of those courses.

There are two things that need to be considered while we're developing the training program. One is the fact that NMSS also has regional staff that we need to bring in for training to make sure that risk concepts are properly implemented, and secondly, we need to think about the agreement states, and one of the issues that you'll see came out of the workshop is that if we're moving in a risk-informed direction, how do we train those that are out there in the agreement states as well, with their limited amount of resources? So these are two things we need to consider during development of the program.

Now, with respect to laying out the program for training, I kind of followed the Feynman approach. Richard Feynman used a concept when he taught students about physics; he said first, you tell them why they need to know it, and then, you tell them what they need to know, and then, you let them determine for themselves how to do it. Well, we do all three here, though. The approach that was laid out was we would go through first explaining why we need to use risk up front, why it's important to use it and why the Commission has considered it to be important, and then go into the methodologies: what are the methods that we would use to carry out this concept of using risk? And then, finally, how it's applied, through specific applications as examples. So that's the kind of thinking that went into laying out the training program.

And as you can see, in the why of why we are using risk, it's interesting to note that the first topic of discussion in the introduction of that area is adequate protection and safety goals and why they are different in that regard, based on an historical perspective, as well as how we're trying to blend the two today; it was interesting to think about in that context. Basically, it runs from where we have been -- safety goals being something we aspire to, which involves performance as well as the defense-in-depth that has already been instituted at the sites -- to adequate protection as generally defined in the strategic plan as meeting the bulk of the regulations. So just getting out the differences in these concepts to the students, and how we are trying to blend the two together, is something that really should be discussed up front.
DR. KRESS: When you define adequate protection as meeting the bulk of the regulations, and then you set out on an activity where you're making a wholesale modification of the regulations -- how useful is that definition to you in doing that activity?

DR. FLACK: Well, I think -- yes, I think you're touching upon a point of public confidence. I mean, when we have established adequate protection, it's something that we believe that we have done with our regulations, and that plants do adequately protect the public. I mean, that's the position that you're coming from up front, and the regulations are trying to establish what that is. I mean, they do establish what that is. But there are also goals that you want to aspire to, and I think this is where you stand back, and you look at the broad set of the regulations and how they're being implemented and what they're achieving, and then what are we trying to achieve -- which steps us to the safety goal. Well, the safety goal is a way of articulating what we're trying to achieve. It's something that you try to aspire to. And I think in that context, one can, you know, sort of understand what role each of these plays.

Now, ultimately, they should come together. When we have all the answers, I think it will come together. But as this continues to evolve, I think we're coming closer and closer together. I don't know if we have all the answers yet. I mean, we always have to question that, you know. And so, I think that goals aren't something that must be met; they should be something where we're saying here's what I'm trying to aspire to. I mean, in some cases, you might meet them, and that's fine, but it's not something that should be in the sense of a requirement, you know, that you have to meet these goals.

So again, why? Why do we need risk? We have the PRA policy statement that goes through a number of reasons why we think it's important, the Commission thinks it's important, to use risk concepts, and those would be explained, of course, as part of that up front discussion in class. And then, we talk about the strategic plan and what's the relationship between the strategic plan and the risk-informed regulation implementation plan, which you'll soon hear about; this is the one that the staff has been working on and presented to the Commission, I guess it was about a month ago.

Now, I visualize the strategic plan again as those strategic goals on top, and then, we have performance goals, which we expect to meet, and then, between the two, you need some implementation plan. How do you go from your strategic goals to how you're measuring, or what's causing you to measure what you're measuring? And that, to me -- I envision that as the piece, the dovetailing of the risk-informed regulation implementation plan. So with that fitting together like that, risk-informing the plan itself would explain how the policy itself, the Commission's PRA policy statement, is being implemented. So I see those two as dovetailing, anyway, in that sense of the word, but we're still working on developing the risk-informed regulation implementation plan and laying that out, and how that will -- then, we will get down to the regulatory activities and how those are being risk-informed. So I see that as an introduction to the students as to why we're using risk.
And then, we move on to the principles of risk-informed integrated decision making, which is Reg Guide 1.174, where there's a lot of information that is generic, which is the way we do business using risk, and that carries over certainly to the other fields. And then, as we move through this course, we begin at a higher level, moving down now to actual applications to materials and waste disposal, and right now, the document that really outlines that is SECY 99-100, which you're familiar with, and how we're implementing that process. And then, of course, the connection, with respect to nuclear materials and waste, of how we use risk in rulemaking, licensing, and inspection and assessment, those being the three key regulatory areas by which we operate, and drawing that connection between risk and the regulations. So again --

DR. GARRICK: John, I hope that the emphasis in this training is not so much on the why being answered in the context of because we have rules and regulations and what have you, but rather on what's behind the reason we have the rules and regulations. I don't think the American public is all that impressed with the why being answered in the context of because it's required by the regulations. I'm just suggesting -- and I hope the emphasis is on the merits of a risk-informed approach.

DR. FLACK: Yes.

DR. GARRICK: The merits of a risk perspective, because this is not very impressive to me, just to see this list, because, you know -- I'm a member of the public; I don't trust the Government. I want something more basic than safety goals and PRA policy statements and strategic plans and regulations and what have you. I want somebody that really understands what we're going to benefit from in taking a risk view here and that understands what risk is. So I hope the emphasis is on that.

DR. FLACK: Yes; well, I guess there are a couple of things. I understand the emphasis that you're making as certainly important to make. This gets back to the objectives of the course, and what I'm laying out is more of an outline, with the objective in mind that we want to bring the staff up to speed in risk, and we want the staff to understand why risk is important to use, why it's important to know, and how it fits in. So the emphasis was more on the staff itself rather than the public domain. There is a part that I'll get to at the end about risk communication, and that is how you communicate to the public.

DR. GARRICK: Yes.

DR. FLACK: But that comes at the very end of the list, and this is only meant to be the establishment of an outline. I mean, there is a lot of meat that needs to be put on the bones.

DR. GARRICK: Yes.

DR. FLACK: But it's just the way we're thinking through the process of training the staff in the Office of NMSS --

DR. GARRICK: Yes.

DR. FLACK: -- to appreciate the views on it.

DR. GARRICK: But my point is that the real reason we're doing risk assessment is, A, we want to know what the risk is in a realistic fashion; that it's something more valuable to us than a bounding analysis. That's the real merit of a risk assessment. And number two, we want to know what's contributing to the risk.

DR. FLACK: Yes, absolutely.

DR. GARRICK: And so that we can do something about it.

DR. FLACK: Yes.

DR. GARRICK: So that we can manage it. And when I see a list like that, I don't see --

DR. FLACK: Well, I get into the other pieces.

DR. GARRICK: Yes.

DR. FLACK: This is still at a very high level.
DR. GARRICK: Right; I understand.

DR. FLACK: And now, we get into the next piece. Let me move ahead to the methodologies and then what comes out of those methodologies and then how do we use those findings.

DR. GARRICK: Yes.

DR. FLACK: So this is still establishing up front why we're using risk, and certainly, the outputs and the outcomes of using risk are something that's important.

DR. GARRICK: Yes.

DR. FLACK: You know, that adds to what our knowledge base is, and really, that's the main thrust.

DR. APOSTOLAKIS: But why we're using risk -- I think part of what John said answers that question, the motivation behind it, so it belongs here, really, doesn't it? That you really want to avoid -- to get away from conservative bounding analysis, to a more realistic view of what's going on. Maybe that should be the very first bullet, even above the safety goals.

DR. FLACK: Well, yes, I mean, you discussed that in the context of goals and adequate protection and --

DR. APOSTOLAKIS: You can even bring it up there, yes.

DR. FLACK: I mean, that's really -- you know --

DR. APOSTOLAKIS: Oh, so you were planning to do that --

DR. FLACK: Yes.

DR. APOSTOLAKIS: Yes.

DR. FLACK: That would be part of what we mean by adequate protection versus --

DR. APOSTOLAKIS: I see.

DR. FLACK: -- a safety goal.

DR. GARRICK: Well, yes; if you do a job of marrying this list and its language to the background, like you said when you were talking about safety goals versus adequate protection, there's language in the rules and regulations of a background of the type that we're talking about here. But I just wanted to make the point that that's quite important, that background information.

DR. FLACK: Yes; and I think when you flesh out the why, you know, you have a PRA policy statement to begin with, and the reasons that led you there --

DR. GARRICK: Right.

DR. FLACK: -- were some of the reasons that you're describing now. So, I mean, there needs to be a lot more, again, meat on the bones here, but just as a structure, you're quite right. I mean, these things have to be brought up. I think that one of the shortcomings that I see is that we tend to jump into teaching students the tools and the methods, and we say okay, now, go forth and use them, but there's not enough up front discussion of why do you use risk? What is risk, you know, doing for you? And I think these are the kinds of things that need to be discussed; right.

Okay; which gets me to the second part, and that's the what; and that is, you know, as part of the course, to go through also the methodologies and what they are, the concepts, the methods. And through that discussion, of course, of what risk is, and then the general methodology that's used in finding out what that risk is, and then the key modeling areas. With reactors, of course, it's the as-built, as-operated plant or operating condition and what we mean by success and how that's modeled. Of course, human reliability is an important part of that methodology, and common cause failure, accident progression and consequence analysis and external events. So these are all the key modeling areas that one would discuss in a course like this, to give students ideas on what modeling areas are out there that need to be done.
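The modeling areas just listed can be tied together in a compact sketch. The two-train system, the beta-factor common-cause treatment, the human-error term, and every number below are hypothetical, intended only to show how fault-tree and event-tree logic combine into scenario frequencies:

```python
# Minimal sketch tying together the modeling areas listed above: a fault-tree
# style calculation for one mitigating system (two redundant trains plus a
# beta-factor common-cause term), fed into a small event tree to produce
# scenario frequencies. All numbers are hypothetical.

INIT_FREQ = 1.0e-2       # initiating event frequency (per year), hypothetical

P_TRAIN = 1.0e-3         # independent failure probability of one train
BETA = 0.1               # assumed common-cause beta factor
P_HUMAN_ERROR = 3.0e-3   # operator fails to actuate backup; hypothetical HRA value

# Fault tree top event: the system fails if both trains fail independently
# OR a common-cause failure takes out both trains at once.
p_system_fails = ((1 - BETA) * P_TRAIN) ** 2 + BETA * P_TRAIN

# Event tree on the initiating event: three scenario end states.
scenarios = {
    "mitigated":      INIT_FREQ * (1 - p_system_fails) * (1 - P_HUMAN_ERROR),
    "system_failure": INIT_FREQ * p_system_fails,
    "human_error":    INIT_FREQ * (1 - p_system_fails) * P_HUMAN_ERROR,
}

for name, freq in scenarios.items():
    print(f"{name:15s} {freq:.3e} per year")
```

The resulting scenario list -- each end state with a frequency and, in a full model, a consequence -- is exactly the "structured set of scenarios" that comes up again later in the session.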
And then, following that, of course, we get over to the data, statistical analysis and treatment of data, and there will be some of that as part of the course; basic treatment of uncertainties, and this, of course, gets into some limitations of the methods. So this would be in that context of getting out the methods, explaining to the students what methods are out there, and how they're going to be applied will then follow.

DR. KRESS: Go ahead, George.

DR. APOSTOLAKIS: On the uncertainty treatment in the statistical analysis: the biggest uncertainties, at least in reactors, are usually associated with the models themselves, not the data, failure-rate kind of thing. Is that part of what you're going to discuss here? Even though there are no methods really for handling those, the model uncertainties really drive the whole thing. And I think a lot of the staff's regulations come from that.

DR. FLACK: Well, clearly, I mean, this is -- the assumptions that go into even the model need to be articulated well.

DR. APOSTOLAKIS: Right.

DR. FLACK: How you deal with that -- it helps to understand that in the context of the limitations of the process, that one is dealing with what we believe the model is expected to cover based on the assumptions that go into it, and certainly, that would need to be discussed in the context of uncertainties. So once that groundwork is established, then, it becomes more the data uncertainties that we're talking about, but both uncertainties play an important role. You know, I tend to agree with you. I think that's one of the reasons why, you know, we're more in a risk-informed arena --

DR. APOSTOLAKIS: Yes.

DR. FLACK: -- because of that, you know.

DR. APOSTOLAKIS: That's right.

DR. FLACK: And it's really to open your eyes to look for these things and not to say you've got them all, I think, is -- you know -- the answer to that. But -- okay; so, moving off from the methods, we get to the applications. And here's where I relied on my team members to actually provide us with a lot of input, so we've met with the training instructors, and we have identified some specific applications, and now, we're into how do you use these methods? And the four that we have established are: one for fuel fabrication, one for transportation, one for nuclear materials and byproducts, which you'll hear something about today, and one for radiological waste disposal. So each of these, essentially, has its own methodology. So we've developed sample applications and turned those over to the contractors, and now, each team member has a domain that they'll be interacting with the instructors on, so at least the students who take this course see the practical applications of these methods as we're using them today in the NRC.

Each of the examples that have been developed for those particular areas, of course, needs to address certain elements, and of course, one is the differences in methodology; each has its own methodology, as mentioned; the key assumptions, which gets back to the modeling question; data analysis; the results and findings, what comes out of that analysis; and then, how do you use those findings or insights in decision making? So getting out what's going on more globally within the office is really the intent of those applications. Okay; that's sort of the how.
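The exchange above about model versus parameter uncertainty can be illustrated with a small Monte Carlo sketch; the lognormal failure-rate distribution, the two competing models, and their weights are all hypothetical stand-ins:

```python
# Small Monte Carlo sketch of the two kinds of uncertainty discussed above.
# Parameter (data) uncertainty: the failure rate is drawn from a lognormal
# distribution instead of being fixed. Model uncertainty: two competing,
# judgment-weighted models map the same parameter to a risk estimate.
# Every number here is a hypothetical placeholder.
import math
import random
import statistics

random.seed(1)
N = 100_000

MEDIAN_RATE = 1.0e-4      # hypothetical median failure rate (per year)
ERROR_FACTOR = 3.0        # hypothetical lognormal error factor (95th/50th)
SIGMA = math.log(ERROR_FACTOR) / 1.645

def model_a(rate):
    # hypothetical model: risk scales linearly with the failure rate
    return 2.0 * rate

def model_b(rate):
    # competing hypothetical model with a stronger rate dependence
    return 50.0 * rate ** 1.2

MODELS = [model_a, model_b]
MODEL_WEIGHTS = [0.7, 0.3]    # judgmental weights on the alternative models

results = []
for _ in range(N):
    rate = MEDIAN_RATE * math.exp(random.gauss(0.0, SIGMA))   # parameter draw
    model = random.choices(MODELS, weights=MODEL_WEIGHTS)[0]  # model draw
    results.append(model(rate))

results.sort()
print(f"mean risk      : {statistics.fmean(results):.2e} per year")
print(f"5th percentile : {results[int(0.05 * N)]:.2e}")
print(f"95th percentile: {results[int(0.95 * N)]:.2e}")
```

Sampling over discrete alternative models, as here, is one common way of displaying model uncertainty, though, as the exchange notes, there is no settled method for handling it.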
DR. MARKLEY: John, just one question. In developing an appreciation in the staff for the use of risk information and the analysis methods, are you going to be developing something comparable to the senior reactor analyst, where you have people in the various areas that will be more or less responsible for doing some of the analysis associated with it, to support those activities in the field?

DR. FLACK: Well, we haven't gotten -- we really haven't gone that far yet. If we're talking about significance determination processes and specific findings and how they're assessed, I believe that's an area that we still need to pursue. So we don't have, per se, SRA types within NMSS. We have a group, though, and certainly, as things get more interesting and get brought in -- what is the risk perspective of this -- then, the group would take it on, and the expertise within the group would deal with the issue. But there is not an assignment of a specific individual, like a similar --

DR. MARKLEY: But the group is a group of risk analysts, per se, or are they people with experience in those technical areas?

DR. FLACK: Both.

DR. MARKLEY: Both?

DR. FLACK: Yes.

DR. GARRICK: John, there is one thing that always bothers me about training syllabi that I see in the risk field: they don't somehow match up very well with the real world in terms of activities. I look at your examples on your previous slides of methodology and application, key assumptions, data analysis, results and findings, insights and use in decision making. The most important activity in a risk assessment is what I would call building the logic models, answering the question what can go wrong? It's probably 70 percent of the risk assessment, and yet, there is a kind of idea conveyed out here that a risk assessment is primarily analysis of data. That may be 5 percent of a risk assessment. The thing that's really the tough, hard-nosed stuff in doing a comprehensive risk assessment is not clearly identified in that kind of a list. Now, it could be buried in the notion of methodology and application, but where the man-hours are really spent, and I speak to you from having directed over 40 nuclear power plant PRAs, is in developing the logic models, the event sequence diagrams and the fault trees that answer the question what can go wrong? And that's where the real value of the risk assessment comes from. You know, people say that you can't do a risk assessment if you don't have data, and of course, that's complete nonsense.

DR. FLACK: Right.

DR. GARRICK: Because that's not where the energy is consumed. The energy is consumed in understanding how the plant works.

DR. FLACK: Yes, right.

DR. GARRICK: And you understand how the plant works when you start answering the question what can go wrong, and the way you answer that question is through a structured set of scenarios.

DR. FLACK: Yes.

DR. GARRICK: And somehow, that just doesn't come through.

DR. FLACK: Yes; no, I --

DR. GARRICK: And that's where the whole brilliance of a good risk assessment is.

DR. FLACK: Right, exactly, right, right.

DR. GARRICK: If you don't have people who understand how the plant works but might be the world's expert on data analysis, might be the world's expert on methodology and so forth, you're going to get a lousy PRA.

DR. FLACK: I couldn't agree with you more. There are always limitations in a course like this.

DR. GARRICK: Yes; I know, but I just couldn't resist it because --
DR. FLACK: Yes; I -- DR. GARRICK: Because there is -- DR. APOSTOLAKIS: I count this as something else. DR. GARRICK: Yes. DR. APOSTOLAKIS: If you have people who know the plants very well, and they don't understand data analysis and methodology -- DR. GARRICK: Well, I know, I know. I know, George, but that's there; that's there. [Laughter.] DR. APOSTOLAKIS: You know, just for the record. [Laughter.] DR. GARRICK: That's represented. But the real hard work of a PRA does not seem to be represented. DR. FLACK: Yes. DR. GARRICK: That's my point. DR. FLACK: It's hard to get that appreciation from someone just quickly going through one of these examples, and certainly, these examples that are presented take an enormous amount of time. I mean, there's a lot of resources that went into these, and in a couple of pages, you present that to them. DR. GARRICK: I understand; I understand. DR. FLACK: So it's just the fact that the people, to really appreciate it, need to do it. DR. GARRICK: Yes. DR. FLACK: And when they do it, they'll understand it, but it's to get them there. The whole idea is to get them there. DR. GARRICK: John, my main point is that a PRA is principally an engineering analysis problem. It's principally an engineering and operations analysis. And that is something that has to be emphasized in the training. DR. FLACK: Yes; I agree. DR. APOSTOLAKIS: Maybe under examples to include, there ought to be a bullet that actually addresses that. You know, I don't know how to put that -- DR. GARRICK: Well, the closest thing I know -- right; the closest thing I know, George, is something like logic modeling. DR. APOSTOLAKIS: Yes. DR. GARRICK: It's something that really gets -- gets it. DR. APOSTOLAKIS: Structuring the scenarios? DR. GARRICK: Yes; structuring the scenarios. DR. APOSTOLAKIS: I think that would be a good addition. DR. GARRICK: Structuring the scenarios. DR. FLACK: We're just writing it as methodology. DR. GARRICK: Yes. DR. FLACK: Structuring and scenarios. DR. GARRICK: Well, structuring the scenarios. DR. FLACK: Structuring of scenarios. DR. APOSTOLAKIS: Do you want to call them failure scenarios or -- DR. GARRICK: Well, I -- DR. APOSTOLAKIS: Because it's not really -- we call them accident sequences in reactors. DR. FLACK: Yes; I understand. DR. APOSTOLAKIS: That's the word that applies to NMSS, the scenarios. DR. GARRICK: Scenario is okay. DR. APOSTOLAKIS: Scenarios. DR. GARRICK: Because that -- DR. FLACK: They will certainly cover that as part of the -- you know, like, for example, the event reanalysis, but again, the appreciation of actually doing it in the real world is -- you just can't get that across in the classroom. You have to summarize it, you know. DR. GARRICK: Yes. DR. FLACK: But certainly, we'll reemphasize that as another bullet. DR. APOSTOLAKIS: Good point.
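[Editor's note: The "structuring the scenarios" point above lends itself to a small illustration. The following Python sketch is hypothetical and not drawn from any NRC study or training material; the initiating event, branch points, and numbers are invented placeholders. It shows only how an event-tree-style logic model enumerates what can go wrong and propagates frequencies through the branches.]

from itertools import product

# Hypothetical initiating-event frequency (per year) for a generic
# materials-handling upset; the value is an illustrative placeholder.
INIT_FREQ = 1.0e-2

# Branch points of the event tree: (name, probability the safeguard fails).
# Both the names and the probabilities are invented for illustration.
BRANCHES = [
    ("ventilation fails", 1.0e-2),
    ("operator misses alarm", 1.0e-1),
]

# Enumerate every combination of branch outcomes (the scenarios) and
# multiply probabilities along each path to get a scenario frequency.
scenarios = []
for outcomes in product([False, True], repeat=len(BRANCHES)):
    freq, labels = INIT_FREQ, []
    for (name, p_fail), failed in zip(BRANCHES, outcomes):
        freq *= p_fail if failed else 1.0 - p_fail
        labels.append(name if failed else "not (" + name + ")")
    scenarios.append((" AND ".join(labels), freq))

for label, freq in sorted(scenarios, key=lambda s: -s[1]):
    print(f"{freq:9.2e}/yr  {label}")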
DR. FLACK: Okay; so, then, that leaves us with the last two areas of the training program, which would be the application of the risk insights to regulatory decision making -- it's not just generating things that don't get used. And here, also, using the tool once it has been developed to gain insights by performing sensitivity studies and bounding analyses, and that gives you more information about whatever area you're modeling; the impact of uncertainties through those sensitivity studies; and this last one, performance measures, is something that you would gain by doing an analysis and understanding what the important performance measures are to capture and then use, for example, to demonstrate that, in fact, you are achieving what the model is predicting, something that you could actually measure in the course of time. DR. KRESS: Somewhere in these six areas, do you have a concept of how defense-in-depth fits in with risk-informed or risk application processes? DR. FLACK: Well, other than how it would be discussed in the context of 1.174 up front, you know, how it becomes one of the elements, the principles of good regulation, I wasn't going to venture into what we mean by that in the context, you know, of the applications of risk. DR. APOSTOLAKIS: You know, John, when you talk about uncertainties and their impact on decision making, maybe that's a good place to -- DR. KRESS: That would seem to me like the right place to put it, yes. DR. APOSTOLAKIS: Especially model uncertainties, because if you think about the, you know, motivation for defense-in-depth, for compensatory measures, it is really the uncertainty you have. Now, again, you don't need to give them a whole treatise on it, but you have an item here, insights and use in decision making, for example. That's where a discussion of defense-in-depth versus the rational approach to designing compensatory measures would belong. I mean, just as a thought. I mean, you don't have to decide now, but it's certainly an important issue. DR. FLACK: Oh, it is, but the question is can it be confusing within that context? DR. APOSTOLAKIS: It is already confusing. [Laughter.] DR. FLACK: Well, this is true, but it should be something that's borne out. It's almost like you're working between two worlds, one of being -- DR. APOSTOLAKIS: Exactly. DR. FLACK: -- deterministic and one of being probabilistic, and then, you're trying to bring the two together to have it make sense within the principles of 1.174, and the question is how does one articulate that in the context of a training course? A lot depends on the instructor, and it's not going to be me, which is unfortunate. Otherwise, maybe I could discuss the problems and the issues that come out of how we deal with these things as we're trying to blend the two. But I guess in concept, what we're trying to achieve -- establishing defense-in-depth mechanisms to account for uncertainties, which is why you wouldn't want to remove the containment, for example -- certainly is an appropriate thing to talk about in the context of the course. DR. APOSTOLAKIS: The -- well, again, I mean, 1.174 is a pioneering document, but the way defense-in-depth is presented there, as a third principle, the connection between defense-in-depth and risk assessment is not very clear. DR. FLACK: Right. DR. APOSTOLAKIS: And it's okay; I mean, that was the first time we wrote something like that. But I think a discussion of defense-in-depth in the context of the uncertainties that will come out of all of these analyses would be a more reasonable thing to do and explain at least the connection -- DR. FLACK: Yes. DR. APOSTOLAKIS: -- that --
DR. FLACK: And there is a structure to it, you know, I think. You know, as you go down further and further in the likelihood of events, you would have less and less defense-in-depth because of the low probability of the event, whereas for events that you would expect would occur, you would expect a lot of defense-in-depth, and that would still bring you down to this low probability. So, I mean, they kind of trade off against one another. After a reactor trip, there's a lot of defense-in-depth to bring the plant to a safe shutdown. Of course, in the extreme, there is no defense-in-depth for a meteorite, for example; I mean, it just happens; that's it. So, I mean, in between, you have the whole spectrum of -- DR. APOSTOLAKIS: Yes; we are not saying that these issues are crystal-clear to everyone, but at least the discussion of defense-in-depth -- DR. FLACK: Sure. DR. APOSTOLAKIS: -- in the context of the uncertainties and their impacts on decision making, and that will naturally bring you back to the model uncertainty issue, because that's a major driver. I don't think there is any defense-in-depth measure that was placed there because the distribution of lambda, the failure rate, was too broad. I mean, it's really the model itself that you worry about. DR. FLACK: Yes; good point; okay. DR. APOSTOLAKIS: And that's where, in fact, I would agree with John. I mean, that's where the availability of methodologists has actually done a disservice to the community; because there is so much statistical literature on how to handle failure rates, we can pay a lot of attention to that, when, in fact, from the PRA perspective, they are not the major drivers. DR. FLACK: Yes. DR. APOSTOLAKIS: Uncertainties in failure rates are not the major drivers, right? DR. FLACK: Yes; well, you tend to go to the area where it's the easiest to -- DR. APOSTOLAKIS: The moment you assume it's exponential, you have already made a big assumption.
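[Editor's note: Dr. Apostolakis's distinction between parameter uncertainty and model uncertainty can be made concrete with a small numerical sketch. All numbers below are hypothetical. Part (a) varies the failure rate lambda under a fixed exponential model; part (b) keeps the mean life fixed but swaps the model for Weibull alternatives. Under these assumptions, the model swing dwarfs the parameter spread, which is his point.]

import math
import random

random.seed(1)
T = 10.0           # mission time in years (hypothetical)
MEAN_LIFE = 100.0  # assumed mean time to failure in years (hypothetical)

# (a) Parameter uncertainty: lambda lognormally distributed about 1/MEAN_LIFE,
#     propagated through the fixed exponential model P = 1 - exp(-lambda*T).
probs = sorted(
    1.0 - math.exp(-random.lognormvariate(math.log(1.0 / MEAN_LIFE), 0.5) * T)
    for _ in range(10_000)
)
mean_p = sum(probs) / len(probs)
print(f"exponential, uncertain lambda: 5th={probs[500]:.4f} "
      f"mean={mean_p:.4f} 95th={probs[9500]:.4f}")

# (b) Model uncertainty: Weibull alternatives with the SAME mean life.
#     Shape < 1 is infant mortality, 1 recovers the exponential, > 1 wear-out.
for shape in (0.5, 1.0, 3.0):
    scale = MEAN_LIFE / math.gamma(1.0 + 1.0 / shape)
    p = 1.0 - math.exp(-((T / scale) ** shape))
    print(f"Weibull shape={shape}: P(fail by {T:g} yr) = {p:.4f}")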
DR. FLACK: That's -- okay; good, and the last one is, of course, risk communication. We were discussing how it is important to communicate the results, both internally and externally, and there is obviously work still going on in that arena right now, and we don't have all of the answers to it, but we want to at least emphasize the need to do this as part of this course. Okay; so, those were the six areas of the pilot program. Then, if there are no further questions, I'll move on to the workshop. DR. APOSTOLAKIS: Let me -- risk communication; I know it's a fashionable term. Shouldn't it really be replaced by building trust or something? That's really what you want. You want the public to trust you, not just to communicate -- risk communication is part of it. And I'll give you an example. In talking to people, to laymen, they have no idea how you guys develop regulations and the extensive reviews and the public comment period and so on. And it seems to me that their confidence in the agency goes up after they realize how much scrutiny every document you produce goes through, and that's not part of risk communication. That's building trust. So I was wondering whether we can start talking about trust, of which risk communication is a very important part, but not just risk communication. I mean, you have processes in place that really the general public are not aware of. And yet, they enhance -- they should enhance the confidence the public has in you. They don't know that this committee exists and that, you know, everything is aired in public, and we are free to say whatever we like, and the public is free to come and participate. DR. FLACK: Yes, right. DR. APOSTOLAKIS: I mean, that's important. DR. FLACK: Yes; yes, well, this really gets into external communication in general; I mean, how we communicate with the public. DR. APOSTOLAKIS: Yes; so, in fact, the thing you have in parentheses there, public confidence -- DR. FLACK: Yes. DR. APOSTOLAKIS: -- it seems to me that should be the heading. DR. FLACK: Instead of risk communication. DR. APOSTOLAKIS: And risk communication is under it. DR. FLACK: Well, okay, but we also have internal communication. DR. APOSTOLAKIS: Yes; that's true. DR. FLACK: Which is important too. DR. APOSTOLAKIS: Internally, you don't have to -- they know what the processes are; I mean, all you have to do is communicate risk. DR. HORNBERGER: You hope they already know. [Laughter.] DR. APOSTOLAKIS: All the evidence to the contrary, you know; 70 percent of you guys don't think risk-informed regulation will go anywhere. That's still true. DR. KRESS: Who took that poll? DR. APOSTOLAKIS: Was it the IG or someone? DR. KRESS: IG? DR. APOSTOLAKIS: Do you remember? DR. FLACK: Yes; well, in light of that, I was thinking of how the information that's being generated gets used; the risk group is a support group, and it's generally supporting a technical area. Well, how do you package the results from your risk studies so that the technical area can use that, capitalize on it? Some of the things that John had been mentioning have to somehow get across to the people who are going to use this to make the decisions. That generally will not be the risk analyst who makes the decision. It will be someone in the technical area. So that bullet was intended to somehow get the output from this analysis into a form that is usable and will be used by somebody making those technical decisions. DR. GARRICK: Also, it's a risk course, so I can understand why you would put communication or risk communication as a headline, even though I agree with what George is saying, that what we're really talking about here is how to build public confidence. DR. FLACK: Yes, clearly. DR. GARRICK: How to build trust. DR. FLACK: That second bullet is getting across to the public that we're using risk. We're doing regulation in a smart way, and one of the things we're using to do it is risk, and getting that point across to the public. I mean -- well, as we'll get into with the workshop, you'll hear some of these public groups think risk is just the opposite; that it's a means for relaxing the regulation. And so, there's always this other element that's there that -- DR. APOSTOLAKIS: And if you talk to the industry, it's the other way. DR. FLACK: That's right; so there's always sort of a problem with that, too. DR. GARRICK: Let's talk about that workshop. DR. FLACK: Okay; moving on to the workshop, the workshop was held last week, April 25 and 26. There were approximately 50 people who showed up at the workshop. There were about 26 participants around the large table. The objectives of the workshop were both to inform and to obtain stakeholders' input on two things. One was the criteria, which Marty had mentioned earlier this morning, on how to decide whether a certain regulatory activity should be risk-informed, and the other was the nuclear safety goals.
As part of the Federal Register notice that went out, we provided the five-step implementation process out of SECY 99-100. We also listed the proposed criteria, and then, we listed a number of questions that we thought of as food for thought that they would have in the backs of their minds as they participated in the workshop. And I kind of structured the feedback along those questions we had asked. This view graph just provides an overview of the organizations that were represented, which gives a pretty broad spectrum of participants. Again, there were approximately 20 to 25 who sat around the table and participated in the discussion. The next view graph shows the framework that was outlined in SECY 99-100, the five-step process. The first step was to identify candidate regulatory applications that are amenable to expanded use of risk assessment information, and we primarily focused on that one. The others kind of flow from that: decide how to modify the regulation -- the regulatory activity -- and we'll discuss a little about case studies that are being proposed on how we do that, to decide how regulation might be modified. And then, once we ran a pilot, we would then think about changing the regulation and then implementing that new regulation and then what it would take as far as tools to make that happen. But I don't see these five steps as naturally occurring one after the other. There's a lot of feedback; you know, what you do depends also on what tools you have and that sort of thing, so there's a lot of feedback between the various steps, but we primarily, as part of the workshop, focused on number one. The next view graph paraphrases the screening criteria, and there were three items on that. So if we were proposing a risk-informed activity, a new activity that would change the way we do business, the new regulation would have to at least address one of these: maintaining or improving safety; improving the effectiveness or the efficiency of the NRC process; and/or reducing unnecessary burden. DR. APOSTOLAKIS: Now, the moment you say reduce unnecessary burden, shouldn't you add a fourth item, add regulations where appropriate? DR. FLACK: Add regulations -- DR. APOSTOLAKIS: Well, I mean, the risk analysis may indicate that there are areas where you need more. DR. FLACK: Oh, well, I -- DR. APOSTOLAKIS: Because, you know, regarding communication with the public, I mean, we're overdoing it with the reduction of, you know, unnecessary burden, and, I mean, that's not why we're doing all this. DR. FLACK: Right. I see the first bullet and the third bullet working together, you might say. When you bring them together like that, you kind of say we're doing smart regulations. If it results in a decrease, that's fine. If it results in an increase, that's fine. We're just doing it the smartest way we know how to do it. DR. APOSTOLAKIS: What you just said is great, but it doesn't come across. DR. FLACK: Yes, it -- DR. APOSTOLAKIS: So, if you decide to put a bullet reduce unnecessary burden, it seems to me you have to have something also that says that you may add something if you find there is a hole someplace, or just delete it. DR. FLACK: But the question is, if we add that as a bullet, if we say, well, it meets that bullet, that bullet is linked to one of the other bullets. It's not by itself a bullet.
Like you would say I would add a regulation, but it would need to either reduce burden or improve safety or something like that. DR. APOSTOLAKIS: Something like that, yes. DR. FLACK: Right; I mean it's -- DR. APOSTOLAKIS: I would make it one bullet; you're right. DR. FLACK: So it would have to implicitly be that and one of the others. DR. APOSTOLAKIS: The reason why I'm saying this is I don't really think it affects the substance of things, but for the last three or four years, we've been talking a lot about reducing unnecessary burden, and the agency has been criticized that the reason why we're doing all this is to reduce burden, and that's not true. That's not true. DR. GARRICK: It's a by-product of the process. DR. APOSTOLAKIS: It's a by-product, and for 20 years, we've been using PRAs to add regulations. DR. FLACK: Well, that's -- DR. APOSTOLAKIS: And, of course, people conveniently forget that. DR. FLACK: That's right, and I think it's only saying, well, at this point, I'm going to go back and look. We haven't taken anything out; we've kept adding and adding, and maybe there are things that got superseded and something that could be -- and so, really -- well, we went through this phase, then, of let's look at it from a different perspective. What is out there that we can reduce, since we've been adding and adding and adding over the years? Is there anything now that's just not worthwhile going and implementing? And now, I think we're coming back to the point where we're doing it as a group thing. When we look at something, we look to see if safety can be improved or if we can reduce, and I think they come together now at this point. We're not looking at one or the other; we're looking at both simultaneously. DR. HORNBERGER: In fact, it strikes me that your first bullet there is essential. It's not one or more of the following; the first always has to be satisfied. DR. APOSTOLAKIS: Right, and the second, too. DR. MARKLEY: George, I think as long as you're relying on licensees to identify the initiatives, you're going to end up with burden reductions and not increased regulations or enhancements. DR. APOSTOLAKIS: But, Mike, I'm not questioning the actual practice. I know that we're trying to do the right thing, and we are most of the time. I'm just addressing what's written there. DR. MARKLEY: Right. DR. APOSTOLAKIS: And because of the recent criticism -- I mean, there's no question in my mind that we are not doing this just to reduce burden, but we have been criticized that we are overdoing it with the reduction, and we keep talking about it all the time -- so either delete it or add something to the effect that, if necessary, you know, we will add something. We will add unnecessary burden. [Laughter.] DR. HORNBERGER: No, no, you make it completely parallel, and the fourth one is add necessary burden. DR. APOSTOLAKIS: Right, right, right; that's exactly it. DR. FLACK: Add necessary burden. DR. APOSTOLAKIS: Yes. DR. FLACK: And I think the point is well taken. As you'll see in some of these comments back from the workshop -- well, one of the public citizen groups felt that the last two of these criteria should not be part of it. DR. APOSTOLAKIS: That's right. DR. FLACK: They believed that only the first one should be part of it. And that was their position. Obviously, that wasn't shared by everyone in the room, but that's where they were coming from.
But we'll take that -- you know, we'll think about that, what you said, and see if we can change -- see what we can do to clarify the meaning of that first -- DR. APOSTOLAKIS: Okay. DR. FLACK: -- first criterion. The other two, of course, are that there need to be data and analytical methods available, or able to be developed, if you want to make that change work, and then, of course, there's the implementation, whether it can be realized at a reasonable cost. Again, this looks almost like the why, what, how, where you start with the why, why are you doing it, and then the what, what it is that you plan to do, and then the how, how are you going to do it; it sort of flows in the same kind of logic. So those were the criteria that were proposed, and there is a comment period that's still open, and we're still waiting to hear from others on that as well. I don't know if we'll get -- DR. APOSTOLAKIS: Well, again, in the context of the criteria, though, I don't know what maintaining or improving safety means there. The criteria will tell you whether you should risk-inform a particular activity, right? That's what Virgilio told us earlier. The screening criteria; what's the purpose of the screening criteria? To decide -- DR. FLACK: To see if -- yes, to see if there is an area of the regulation that could be risk-informed and then why would you -- DR. APOSTOLAKIS: How can maintaining safety be part of the criteria? I don't see that; you know, I just don't see that. DR. FLACK: It could be part -- well, okay. DR. APOSTOLAKIS: I mean, that's a consideration when you decide to do something, and you're asking yourself now, what should I do? Well, then, you say of course, whatever I do, I have to maintain safety. But it is -- or improve it, but it's not part of the screening criteria, I don't think. DR. FLACK: Well, it almost says that, you know, it goes hand-in-glove with the last bullet, which says that if you have a proposal by which you wish to reduce burden, you would at least maintain safety while doing that, or improve safety, and that would be a win-win situation. DR. APOSTOLAKIS: Right; but this is a consideration after you decide that there is unnecessary burden. DR. FLACK: That's right; you would first decide that -- yes. DR. APOSTOLAKIS: It's after that. DR. FLACK: Yes; you would first observe that. DR. APOSTOLAKIS: In your first decision, the maintenance of safety really is irrelevant. DR. FLACK: Yes; it's almost like an overarching kind of principle that -- DR. APOSTOLAKIS: Yes; we always want to maintain safety. DR. FLACK: Yes, right. DR. APOSTOLAKIS: There's no question about it. DR. KRESS: A criterion might be does this particular regulation have a high risk or a high impact on safety. That could be a criterion. DR. APOSTOLAKIS: This regulation; no, you're looking at activities, right? DR. FLACK: Activity. DR. APOSTOLAKIS: And you're asking yourself should I risk-inform the regulation of this activity. DR. KRESS: Yes; then, you're going to -- you're going to risk-inform all the activities. DR. APOSTOLAKIS: Yes. DR. KRESS: The question is which parts of the regulations? And you look at the individual regulations and say does this have a high impact on safety, a high impact on the safety of this particular area? That could be a criterion. DR. APOSTOLAKIS: Why would that be a criterion for risk-informing it? DR. KRESS: Well, if it was low-impact, there's not much reason to risk-inform it, maybe. Yes? DR. ROSENBERG: Hi; I'm Stacy Rosenberg. I'm in the risk task force.
The first criterion is to resolve a question with respect to maintaining or improving safety, so what we're looking at is whether there is a question -- DR. APOSTOLAKIS: Oh. DR. ROSENBERG: -- with respect to maintaining or improving safety; we would want to use a risk-informed approach to try to answer the question. DR. APOSTOLAKIS: Then it should be restated here to make that clear. DR. ROSENBERG: That's just paraphrased from the -- DR. APOSTOLAKIS: In other words, there is an issue that has come up -- DR. ROSENBERG: Exactly. DR. APOSTOLAKIS: -- that creates a question -- DR. ROSENBERG: Right. DR. APOSTOLAKIS: -- whether safety is maintained, and then, that can be -- and the thought is that by risk-informing the process, you will be able to place that issue in perspective and maybe resolve it. DR. ROSENBERG: Right, right. DR. APOSTOLAKIS: Oh, okay, yes. DR. GARRICK: But isn't this just parroting the background information to the PRA policy statement? These words have appeared many, many, many times and in this order, too: maintain safety; make the NRC more efficient; and reduce licensing burden if it can be justified. DR. APOSTOLAKIS: But, John -- DR. GARRICK: Yes. DR. APOSTOLAKIS: -- they are used there after you decide to risk-inform the regulations. The Commission is telling you do these things; risk-inform the regulations, and in the process, make sure you are maintaining safety and so on. But here, they're deciding whether to risk-inform. And that's why I raised the question, but after the clarification, I think if you change the words, then, it's okay. DR. FLACK: Yes; that should have been the question up front. DR. APOSTOLAKIS: Yes, that is the question. DR. FLACK: That's on the table that you were going to address. DR. APOSTOLAKIS: Makes sense. DR. FLACK: And then, you would just bring the fourth in as a question. Of course, it's also being exercised in the other arena as well as part of the policy statement. DR. KRESS: Yes. DR. APOSTOLAKIS: The other thing is the data. I don't know if you like that, John. I mean, if we don't have data, we don't do risk assessment. DR. KRESS: Similar to your question, George, the second one there, improve the effectiveness, you know, is a weird criterion in the sense that almost anytime you risk-inform the regulations, it's probably going to improve efficiency. The criterion ought to be maybe does this area have a high impact on the -- DR. APOSTOLAKIS: Yes. DR. KRESS: -- effectiveness or efficiency or something like that. It seems like improve is not the right criterion. DR. APOSTOLAKIS: Well, I think the words were borrowed from -- DR. KRESS: Yes, I think the -- DR. APOSTOLAKIS: -- the documents that John Garrick mentioned without really adapting them to the fact that you are talking about screening criteria here, whether to risk-inform or not. And the context they have been used in in the past is after you decide to risk-inform, make sure these things are right. But number two, I mean, why would that be a consideration in the screening criteria, if the principle is -- and that's why we have fault trees; that's why we have all of these things. If you don't have data, you go deeper and deeper until you get some evidence that can be used, and the availability of data was an issue in 1970, when people were telling other people that the reactor safety study would never be completed, because you don't have the data. And then, it was completed. So I don't know that number two means anything. That's why we do risk assessments, right?
DR. GARRICK: Yes; I don't have too much problem with it when it's in the context of analytical methods, when it's data and analytical methods. You know, the analytical methods are what allow us to turn up the microscope on the system such that we can see it down to a level where some evidence, some data, exists. So it doesn't have the same impact as it would as a statement that you often hear, namely, you can't do a risk assessment because you don't have the data. DR. APOSTOLAKIS: That's right; that's right. We need data. DR. FLACK: In the context of whether or not data become important as part of your decision, if you need to collect additional data. DR. APOSTOLAKIS: They're always important. DR. FLACK: Yes. DR. APOSTOLAKIS: Data are always important. DR. FLACK: Okay. DR. APOSTOLAKIS: The existence of data should not be a driver. DR. FLACK: Yes, that's true. DR. GARRICK: Information, yes, perhaps. DR. APOSTOLAKIS: But data at what level? DR. FLACK: Yes. DR. APOSTOLAKIS: See, it can be misinterpreted. It may be at a much lower level. DR. GARRICK: Evidence is another word that would work. DR. FLACK: Evidence? DR. GARRICK: Yes; evidence and analytical methods exist. DR. APOSTOLAKIS: I would say just analytical methods, to make it clear. DR. GARRICK: Yes. DR. APOSTOLAKIS: Because it's too subtle otherwise. DR. GARRICK: Yes. DR. FLACK: Okay; any other questions or comments on the screening criteria? DR. APOSTOLAKIS: If I were picky, I would ask you what is reasonable cost, but I am not. [Laughter.] DR. FLACK: Moving right on -- [Laughter.] DR. FLACK: Okay; so, these are the comments, actually, that we received on the screening criteria. The first one is not a surprise: any new requirements should be established using a risk-informed approach. I think there was general consensus on that around the table. Other comments: a risk-informed approach should be pursued if it would lead to improvements in the effectiveness or efficiency of either the NRC's or stakeholders' processes. DR. APOSTOLAKIS: That's a good comment. DR. FLACK: Yes; so they felt that that should be added. Maintaining or improving safety should be the primary focus, and this gets back to the comments we just discussed about the public citizen groups supporting only the first of those three bullets; all other issues were secondary. And the costs to the public and society need to be considered in deciding to risk-inform a program, and they were very sensitive that the public should be considered in any burden reduction assessment, so there was a sense of communication. They wanted to be on board, particularly these public citizen groups, with any kind of reduction that would be forthcoming from the activities that we're proposing. So, those were four comments. Some other comments: as Marty had mentioned, a lot of the comments were picked up as part of the safety goal discussions and not so much the criteria, but there were these other three comments which we thought were important enough to put forth to you: that the areas suggested for examination -- and this is for risk-informing -- include broad scope licensees, unsealed sources, sealed sources, and those engaged in transportation. Those were the areas that were identified as being ripe to be looked at for risk-informing regulatory activities.
Also, implementation and associated training requirements need consideration; again, that was brought up: if we're going down a risk-informed path, we need to be thinking of training for the agreement states, and the inspection process should always be thought about while we're doing this, and it should fit hand in glove with the implementation. So as you go down in parallel, think about how you're going to do this in implementation space. DR. GARRICK: John, we have a break scheduled in the middle of your presentation somehow. Could you advise us on what would be the -- DR. FLACK: This would be the greatest time to take it, since we're going into safety goals next, so -- DR. GARRICK: Okay; then, I think if it's agreeable to the committee, we'll take our break right now. [Recess.] DR. GARRICK: All right; I want to make an announcement, because we're trying to accommodate one of the committee members here in being present during one of our discussions. You will note on the agenda that item seven is a discussion of joint subcommittee protocols, and one of our committee members has to leave prior to that scheduled time. So what we would like to do is move that topic up to the lunch period, and we'll have it right here in this room. We'll break long enough to get a sandwich or something, but we will try, while having our lunch, to have that discussion as much as we can to take full advantage of the full subcommittee. So unless there's any problem with that, that's the procedure we'll follow, and if there needs to be follow-on discussion, we'll have that at the designated time as item seven on the agenda. DR. APOSTOLAKIS: But we expect to finish a little earlier, then. DR. GARRICK: Yes; it appears that we might be able to finish a little earlier. Okay; go ahead, John. DR. FLACK: Okay; so, now, we'll move on to the second -- what was discussed in the second half of the workshop, safety goals, and Gary Holahan was the one who presented this nice, interesting phrase from one of the philosophers: there is one thing stronger than all the armies in the world, and that is an idea whose time has come -- and that being, in this case, safety goals for materials and waste. DR. APOSTOLAKIS: Now, Victor Hugo also wrote Les Miserables. [Laughter.] DR. APOSTOLAKIS: We are not implying anything by that. DR. FLACK: Okay. What do we mean by safety goals? To establish a nuclear safety goal that broadly defines an acceptable level of risk to the public and, in this case, also the worker, which is somewhat different, or an extension, you might say, of the scope of reactors. Okay; the first question that had been posed as part of the Federal Register notice asked people their perceptions of what material safety goals are, what they would achieve, and whether or not they would be supported by various individuals, and I believe in general, the first bullet held true: most, if not all, of the people at the table believed that it would be a worthwhile endeavor. So it is getting support out there to move in this direction. It was generally -- there was a general consensus that the goal should be qualitative in nature at the highest level, and then, there was some discussion on how it would be implemented, whether that should be quantitative or qualitative, and I think that would have to be, you know, exercised or understood in the context of an application, but there was pretty much agreement that the goal itself should be qualitative. DR. APOSTOLAKIS: Why is that now? DR. FLACK: Why is it?
Why would one think of it as being qualitative? DR. APOSTOLAKIS: At which point would the Commission say these are the quantitative goals? DR. FLACK: Well, I think, at least my opinion would be, that it would be more in how you implement it; that the goal itself is more philosophical in nature, and I think you'd find that Gary Holahan feels very strongly in that direction as well, that the goal, being a philosophical goal, by its very nature would be qualitative. DR. APOSTOLAKIS: Well, the quantitative health objectives were quantitative. DR. FLACK: Well, only when you got down to the 0.1 percent; you mean for the reactors. DR. APOSTOLAKIS: Yes. DR. FLACK: Yes; then you were into implementation; what do we mean by this, no more risk to the population, and then came, well, what do you mean by that, or limited risk to the population; well, then, it comes down to, well, how do you implement it? Well, then, the next step would be something quantitative. DR. APOSTOLAKIS: But it was part of the safety goal statement. DR. FLACK: You mean at the highest level, at the highest level? DR. APOSTOLAKIS: Yes, I think so. DR. FLACK: It's not quantitative; it's qualitative, yes. DR. APOSTOLAKIS: It starts out with a qualitative statement. DR. FLACK: Statement, yes. DR. APOSTOLAKIS: And then, it says for reactors, it should be one-tenth of 1 percent of all -- DR. FLACK: Right. DR. APOSTOLAKIS: -- other risks, which is part of the -- DR. FLACK: Well, okay, I see what you're saying. You say you take it as a package deal. DR. APOSTOLAKIS: Yes; I take it as a package. I mean, that's a statement of the Commission. DR. FLACK: At what time do you make that quantitative link? Is it still at that level? DR. APOSTOLAKIS: I remember Gary was objecting to putting core damage frequency and surrogate goals like that in the -- DR. FLACK: Yes. DR. APOSTOLAKIS: -- top-level statement, but there has to be some quantitative statement, even at the, you know, in the Commission's statement on safety goals. Otherwise, the staff will have no guidance on how to do it. I mean, you can't put that in a regulatory guide. DR. FLACK: No, I agree. I think that when it comes down to the practicality of it all -- DR. APOSTOLAKIS: Yes. DR. FLACK: -- it has to be something quantitative, and it's that the numbers represent something. It's not that, you know, you need to achieve it as a requirement, but it tells you this is what I think it should be, and you can't argue with that. I mean, once you write down 0.1 percent, everybody understands what that means. It takes away words like reasonable and so on. This is it. So I personally feel strongly about that myself. However, I don't know if everybody agrees with that. DR. GARRICK: Well, maybe you've already mentioned this, but it seems that the distinguishing words here are qualitative safety goals versus quantitative objectives. That's what I'm reading from the rules and regulations. The qualitative safety goals are this general statement that nuclear power shouldn't add any significant risk. The quantitative objectives are when you get into the 0.1 percent. DR. FLACK: So there is at this -- that separation -- DR. GARRICK: Yes. DR. FLACK: -- of the two at that point. DR. GARRICK: So they make the distinction -- DR. FLACK: Yes, okay. DR. GARRICK: -- by separating goals from objectives. DR. FLACK: Yes; okay; that's one way of looking at it, then. So that's the break at that point. So you have the goals, and then, you have objectives.
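[Editor's note: For readers unfamiliar with the reactor quantitative health objectives being discussed, the arithmetic behind "one-tenth of 1 percent" is simple. The background risk figures below are the approximate values commonly cited alongside the 1986 Commission safety goal policy statement, not exact regulatory numbers.]

# The QHOs set the acceptable incremental risk at 0.1 percent of the
# corresponding background risk to an average individual.
PROMPT_FATALITY_BACKGROUND = 5.0e-4  # approx. U.S. accidental-death risk per year
CANCER_FATALITY_BACKGROUND = 2.0e-3  # approx. U.S. cancer-death risk per year
FRACTION = 0.001                     # "one-tenth of 1 percent"

print(f"prompt-fatality objective: {FRACTION * PROMPT_FATALITY_BACKGROUND:.1e} per year")
print(f"latent-cancer objective:   {FRACTION * CANCER_FATALITY_BACKGROUND:.1e} per year")
# yields roughly 5e-7/yr and 2e-6/yr, the figures usually quoted as the QHOs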
DR. APOSTOLAKIS: Yes, but, you see, the third bullet: should help define the objective of the regulation. Is that what you mean? DR. FLACK: This was in the context that the goals themselves would help us do that; that would be the mechanism. DR. GARRICK: I suspect that that was -- oh, I see; this is work -- DR. FLACK: These are other comments -- DR. GARRICK: Right, right. DR. FLACK: -- that were made at the workshop, and I think that was made in the context that they're worth pursuing because they would help define the objectives of the regulations. DR. GARRICK: All right. DR. FLACK: But then, if you say, if I'm going to sit down and develop safety goals, then how would I develop them? Then, you would start with some overall philosophical statement and from there go to your objectives. DR. GARRICK: Well, maybe the approach here is one step at a time. DR. FLACK: Yes. DR. GARRICK: Maybe we ought to resolve the issue of a qualitative goal and go from there, and -- it may not be a bad idea to adopt that as a path of progression, and if it can be tightened or bettered or improved on, obviously, you'd want to consider alternatives for doing that, it seems. DR. FLACK: Yes; I was thinking the committee could be very helpful in letting us understand that part. DR. GARRICK: Yes. DR. KRESS: Well, one of the admirable qualities of PRA and risk assessment is that it's quantitative. And if you're going to make full use of that attribute, you need to have quantitative risk goals, and I don't know how bound you are to workshop feedback. I mean, these are things you take into consideration. DR. FLACK: That's right. DR. KRESS: But I certainly would say somewhere along the line, you need to have quantitative goals that are expressed in risk terms related to PRA or risk-type activities. It just almost seems like you can't function in a risk-informed world without it. DR. FLACK: Very well. DR. KRESS: Even 1.174 has quantitative things in it that they use. DR. FLACK: That's true; I think most methods that you would look at -- I'm thinking about them -- ISA, PA, you know, hazards, barriers, target analysis, that sort of thing, all involve some sort of quantification. You're ending up with a quantified numerical result at the end. And then, the question is how do you link that numerical result to some higher-level goal that you're trying to achieve, and, you know, if we're thinking about risk to the public out there, which is basically where it comes down to: how much risk? The fact that we're using nuclear materials means that we're going to expose the public to some risk. I mean, they just can't avoid that. So the question is then how much is acceptable risk, and that's where you begin to get into what percent we are talking about for the public and then the worker, for that matter, because we're dealing with both of these in this arena, and structuring it that way, so the natural tendency would be, of course, once you come down to a percentage, then, you're there; essentially, this is what we think. And then, you can use different methods to see where you lie relative to that goal and what you could do, if necessary, to meet the goal -- not as a requirement, but things that you could do to improve the regulatory process in doing that. DR. GARRICK: Tom, what about another thought process here?
Suppose we took this in a kind of a phased approach and said, well, the first thing we maybe ought to be doing is some risk assessments, some quantitative risk assessments, and seeing what kind of results we get, what constitutes a rational form to put the results in, et cetera, et cetera, and gain some experience in developing this measure before we necessarily freeze on how we want to calibrate it? In other words, you can always calculate the risk without a goal. You don't need a goal to calculate the risk. But you learn a great deal in the process of doing risk assessments about what you can do and what you can't do, about what the contributors are, about what the uncertainties are, and it's possible that that kind of information could be extremely beneficial in the calibration process. DR. KRESS: Yes; I support that very strongly, and I'll tell you why. I view a goal or an acceptance criterion as a completely separate entity from what you're talking about. DR. GARRICK: Yes. DR. KRESS: If you do what you said first, it tells you what's possible. DR. GARRICK: Right. DR. KRESS: And that ought somehow to enter into your decision on what goal or acceptance criterion you might have. It tells you, you know, if your risk assessment of some sort of activity gives you a number, and you set an acceptance criterion that's impossible to achieve -- DR. GARRICK: Right. DR. KRESS: -- well, you haven't done very much. DR. GARRICK: That's right. DR. KRESS: And I think it helps guide doing it both ways. DR. GARRICK: Right. DR. KRESS: But I think you set -- I mean, the bases behind acceptance criteria stand alone, in my mind, and can be developed as a separate activity, but you've just got to be careful you don't put a value out and shoot yourself in the foot, and your activity helps keep you from doing that. DR. GARRICK: Yes. DR. FLACK: Okay; some of the other comments: that safety goals would help communicate what it is we're trying to achieve; certainly, it articulates that to the public. Then came a lot of discussion about whether it should be application-specific versus, you know, global, capturing all of the different areas, and we can talk about that a little bit more when we get to the case studies, but many thought that it would probably be more than one goal that we were talking about. The development, of course, would be a long and involved process, and people recognized that. It wasn't something that needed to be done at the moment, in a short period of time. But it's something that I guess people felt they could be more patient with, and that it was an evolutionary kind of goal that would be developed. It would be a long process. And that was satisfying, I think. The relationship between safety goals and strategic goals; we talked about some of that before. It really needs to be articulated what we mean by strategic goals, what we mean by safety goals, performance goals, and how they all relate to one another, was a comment that was made. And so, these were all in the perception of what we would use or envision safety goals to achieve and the benefits from doing that. The next question focused on the developmental process of the goals themselves: that we should try to understand the goals underpinning our current regulations as part of that developmental process; to use case studies to develop safety goals and then, while doing the studies, actually capitalize on their insights.
So that leads us to development of risk-informed approaches in parallel with the safety goals, which is something that Marty had mentioned this morning, so we're not really waiting for the safety goals to be developed; there's a lot to be gained just from working through the process and seeing how risk plays out within that context. There was some sensitivity to the desire for consistency with and among agreement states, and there may be different values across the different states, so it may not be that easy just to have them all agree on what we mean by a goal, but this was one of the issues that kept coming up at the workshop. DR. GARRICK: Were the agreement state representatives vocal on the issue of safety goals and probabilistic approaches? They have been in the past; I just -- DR. FLACK: Vocal in the sense of -- DR. GARRICK: Vocal in the sense that most of them were not favorably disposed toward doing PA, probabilistic performance assessment, for example. DR. FLACK: Oh, well, I guess we never got down to the level of, you know, discussions of the tools and the methodologies -- DR. GARRICK: Okay. DR. FLACK: -- in order to implement it. It was more at the philosophical level, whether we need goals or not, so maybe that may become an issue. I know that resources are always a problem, and it also came up in the context of, well, if we go ahead with this, what about training? How do we make it work? DR. GARRICK: Okay. DR. FLACK: So, yes, I think that's the next level down; that's something that we're going to need to be concerned about. And that gets to the next one, which is to ensure regional and local involvement, and that is, again, trying to lead to goals that are consistent across the different regions and localities; to certainly hold workshops and public meetings in the diverse regions and involve stakeholders early in the process. So these are things, again, that they want to know are coming; they want to be involved in them. So there was a lot of support in that regard. Okay; the next view graph is what factors need to be considered during the development of the safety goals for materials and waste? And there are some really sticky ones here, I think, the first one being national versus local values again, the diversity across the nation with respect to that. And then came the question of ecological risk, and that was discussed this morning in the context of reactors: where are we going here that's different from reactors with regard to that risk contribution? And then, we're looking at operational versus accident risk, so you're looking at these two things together, where with reactors, it's primarily accident risk that we're dealing with. And then, you have worker and public risk. So, we look at the worker and what the worker is exposed to with respect to his job and the risks that he would normally be exposed to as part of that, versus the public, who is outside that area, and what they would normally be exposed to, and then, the goals would certainly be tailored differently between the two, it would seem. Harmonization among and with other agencies; that was something that came up this morning about the other different agencies having possibly different objectives.
And then, there's the question of legislative requirements and what legal ramifications there are in developing such goals, and we'll probably hear a little bit from Bob on the next one: risk to future generations, which is sort of associated with waste disposal, so I'll let -- Bob Bernero is here, and he will certainly want to discuss that a little bit further. Also, the risk associated with theft, sabotage and diversion of nuclear materials. Now, we have a different type of risk, the risk of diversion of materials, that is possibly to be considered in the scope of a type of safety goal. And then, you have the risk associated with chemical toxic releases to the environment as part of the activities that go on at these facilities, for example. And then, there are always the hidden considerations that may be embedded in the regulations that we have to be sensitive to when we look at the regulations and say this is what led to X. There were some considerations that were obvious, but then, there may be others that were hidden in those decisions that got us there, so it's not going to be a simple process. There are quite a number of factors that are going to be involved in considering safety goals for this area. On the next slide, we ask the questions about the analogy between the reactor safety goals and the development of material safety goals, materials and waste disposal safety goals, and clearly, at one level, there is an analogy, and that is the radiological risk to the public, although the criteria might be different: what risk are you exposing the public to? So, I guess that's as far as it goes as being analogous. The rest are the worker risk, the ecological risk, the risk of diversion of materials; these would all be different. These are somewhat different areas that are not presently being captured by the reactor safety goals, at least in the implementation for reactor safety. And then, five: the question of whether these safety goals should be overarching, or whether there should be safety goals in each of the areas, and this led to discussion, and it seemed that there was a general consensus that one goal wouldn't capture everything; that there would be a need for separate goals and that we should use these case studies, which I'll get into in a moment, in the different areas to explore safety goals in those areas. But, you know, at the end, there should be something that brings them all together, and so, although we may go down different paths, once we're there, we may find that there is a next level up where we can kind of pull these goals together at some higher level; it certainly can be entertained, but it may not be so easy to start from that point. DR. KRESS: The reactor safety goals are referenced to the mean values, which implies to me some sort of statement about the uncertainties. When you choose a 0.1 percent value and say that, in the assessed version of this, you want it to be the mean value, what you're implying is that, given the level of uncertainty in that assessment, this is an acceptable value to you, given that this is a confidence level that you're willing to accept. Are you going to have some thoughts about confidence levels or uncertainty in terms of these goals? Are you going to say -- are you going to stick with this mean value concept? How are you going to factor that sort of thinking into that? DR. FLACK: And the question applies -- you know, it's hard to answer a question like that without an application.
I would say, of course, the mean value is the preferred value, just from the mathematical beauty of it, and it captures the thoughts on your uncertainties and so on, and when you deviate from that, you're using, you know, a certain degree of judgment as to -- okay, if we're going to go to something else, why are we going to something else? DR. KRESS: Well, I personally don't view the mean value as any unique position on the distribution. It is a unique position because it's the mean, but it doesn't have any special meaning to me other than that. DR. FLACK: The mean's the mean. DR. KRESS: Yes, it's the mean. DR. FLACK: Yes; I recognize that. DR. KRESS: But, you know, the median is just -- DR. FLACK: That's another concept. DR. KRESS: And any 95th percentile or anything is, you know, just as unique. DR. FLACK: Well, whatever measure is chosen, it needs to represent something, and it's in its representation that it becomes important, and it's not so much that we want to try to achieve this value by comparing the mathematical models and the results of those models to it on a numerical basis, so much as it's something that comes out of how we try to articulate our view on safety. What are we doing? What does it represent as a value? And it leads you back again to the percentage. When we talk about a certain percentage of risk to the public, then the question is, well, how do you demonstrate that what you're measuring is the risk and that it is indeed a certain percentage of what the public is exposed to? Well, you're comparing again two values. It's not just a value; it's a comparison of values. And so, when you're comparing values, one has to be more or less consistent with that comparison, and comparing means is one way to do it. There may be other ways of doing it, but it's a good question that really needs to be fleshed out as, you know, part of the studies, I think, that need to be done, and the models that are developed and how we represent the risk out there that people in the public or the worker might normally be exposed to, so it's coming to grips with that within that context. So any other questions on factors to be considered? [No response.]
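[Editor's note: The mean-versus-median exchange above can be illustrated with a short sketch. The lognormal distribution and its parameters below are hypothetical stand-ins for an assessed risk; the point is only that, for a skewed distribution, the mean, median, and 95th percentile differ substantially, so the choice of statistic compared against a goal matters.]

import math
import random

random.seed(2)
# Hypothetical skewed uncertainty distribution on an assessed annual risk.
MU, SIGMA = math.log(1.0e-6), 1.2
draws = sorted(random.lognormvariate(MU, SIGMA) for _ in range(100_000))

mean = sum(draws) / len(draws)
median = draws[len(draws) // 2]
p95 = draws[int(0.95 * len(draws))]
print(f"median = {median:.2e}, mean = {mean:.2e}, 95th percentile = {p95:.2e}")
# For a lognormal, mean/median = exp(SIGMA**2 / 2), about 2x here; the 95th
# percentile sits roughly another factor of 7 above the median (exp(1.645*SIGMA)).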
DR. FLACK: Oh, I'm moving ahead already on this one. Safety goals -- oh, okay; five and six, I think we discussed. How resource-intensive -- we mentioned that earlier, that it's too soon to tell exactly how resource-intensive this project is going to be, but it certainly will be long-term and involved. And one comment is, well, try to do the easiest tasks first. Go after the low-hanging fruit, so you can be somewhat efficient and effective in the way that you would attack the problem; and also, as part of the resources, the NRC needs to again consider -- I must have mentioned this three or four times already -- the training needs for the agreement states. So it will not be a simple task. Number seven, what will ultimately change, if we have goals, from what we're doing now, was a question we had posed out there, and there were comments. Most people believe that by putting goals out, we would likely get safety improvements and relaxations where requirements do not contribute to safety, so knowing how safe is safe enough is a two-edged sword there, but people believed, and I think most people believe, that it would make us look at things differently, and things would change one way or the other. It would certainly help consistency regarding the regulatory process, because people would then understand the goals, the objectives of the regulations, what we are trying to achieve as a regulatory agency. There is potential for savings, since you will focus resources on those areas that will most help you reach the goal, and it certainly would enhance communication by allowing the agency to express its expectation of what is safe enough. So those were the comments that were generated on the seven questions. And that pretty much covers the spectrum of comments that were mentioned at the workshop with respect to those questions we had asked. So if there are no other questions on those, I'll just mention the case studies that were suggested, and these are the different areas in which we were entertaining further work to define -- for testing the screening criteria and the value added in risk-informing a focused area and in developing specific safety goals, so again, this would be an in-parallel kind of activity. One area is, of course, waste disposal: high-level, low-level and decommissioning. DR. APOSTOLAKIS: Is that high-level waste disposal area risk-informed already? DR. FLACK: Is it already risk-informed? Well, we could ask Chris that question if Chris wants to entertain it. I would think so. I would think there's a great deal of risk already embedded in that regulation. It would just be a matter of going the next step up with it and saying, well, you know, what fraction of the risk to the public are we talking about and whether that's getting us there. I don't know specifically whether we could define it as a goal. I mean, there are certain requirements in the regulation that are being established, and presumably, if you meet those regulatory requirements, you would meet some goal. But I don't know; Chris, can you shed any light on that? MS. LU: This is Chris Lu. I'm also a member of the risk task group. The final Part 63 now is in front of the Commission for consideration, so I can talk from the proposed rule that we put on the street. For the post-closure period, we do require a performance assessment to be conducted, and in the proposed rule, we are looking at the mean peak dose over 10,000 years as the compliance point. In terms of the safety goals, during the workshop that we had last week, a couple of the participants pointed out that since we do have a risk-informed, performance-based rule, we can imply what the safety goal is from the regulatory criteria and requirements. DR. GARRICK: So the answer to George's question is yes. DR. FLACK: It may very well be. DR. GARRICK: It seems. DR. APOSTOLAKIS: This is the most risk-informed activity in NMSS, isn't it? DR. GARRICK: Yes, yes. DR. APOSTOLAKIS: At this point, and the EPA criteria are also, or used to be.
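[Editor's note: Ms. Lu's "mean peak dose over 10,000 years" compliance measure can be sketched in a few lines. The dose model and parameter ranges below are hypothetical placeholders, not the Part 63 model; the sketch shows only the structure of the calculation: for each sampled realization, find that realization's peak annual dose within the 10,000-year horizon, then average the peaks across realizations.]

import math
import random

random.seed(3)
HORIZON = 10_000.0  # compliance period in years
N = 2_000           # number of Monte Carlo realizations (hypothetical)

peaks = []
for _ in range(N):
    # Hypothetical realization: a container fails at t_fail, the annual dose
    # jumps to `peak` and then decays, so the within-horizon maximum is
    # `peak` if failure occurs inside the horizon and zero otherwise.
    t_fail = random.uniform(500.0, 20_000.0)          # failure time, years
    peak = random.lognormvariate(math.log(5.0), 1.0)  # peak annual dose, mrem
    peaks.append(peak if t_fail <= HORIZON else 0.0)

mean_peak = sum(peaks) / N
print(f"mean peak annual dose over {HORIZON:g} years: {mean_peak:.1f} mrem")
# Note the measure is the mean (over realizations) of each realization's peak,
# not the peak of the mean dose curve; the two can differ appreciably.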
I also think that on medical uses and sealed sources, NMSS, with the studies they've done recently, is in pretty good shape in terms of understanding what the risks are, and they have done a very good job, in my opinion, of pointing out the relative contribution of operational risk and accident risk and concluded the thing that a lot of us in this business have been saying for a long time, that the real risk is operational risk. It's not accident risk. So that kind of brings us back to, as far as safety goals are concerned, what the real threat to the public is, and the concern, and the Achilles heel of the industry, is still number one. The rest is kind of no, never mind almost, and I sure hope that our resource allocation and our problem resolution emphasis reflects our state of knowledge about that. One of the questions this committee asked very early on is that NMSS should tell us, on the basis of their expertise, what they think the real risks are. Well, I think they've done a pretty good job of that with respect to sealed and unsealed sources and medical uses of isotopes and with respect to the byproduct problem. Beyond that, except for the repositories, the problem can be handled in a very -- in a fashion very similar to the reactors in the sense that it can be based largely on reactor risk assessment technology. So -- and when you come to the repository, then, the questions of differences there become critically important. It's not an accident issue on repositories. It's a long-term performance issue. And I'm not sure we've gotten that message out to the public. I could almost argue there's not a safety issue, and the reason there's not a safety issue is that we have time to interdict and to intervene. We don't -- we are not caught by surprise by a big accident. It's a long-term deteriorating, degrading process, so the driver as far as public risk is concerned on repositories is the denial of a resource, not public safety. Now, we may eventually come around to that, and we're not there yet, but there is no public safety issue, because as soon as we detect that there's something in the water, we do something about that. As soon as we detect that there's something in the food, we do something about it, and we have years and years of time to do that. So the real issue there is denial of a resource, and it's probably the reason why EPA is so adamant about the groundwater standard and protecting it now that they have a standard. If they didn't have a standard, maybe there would be a much more rational process developed here. But I think that -- I'm hopeful that when we get around to the safety goals, these very activity-dependent issues and the differences are taken into account, and I see the problem centering principally around, still, number one, because I think for the most part, we can piggyback on what we know already, except for maybe the establishment of some sort of safety goal to use as a rule. So I'm hopeful that the thought processes here are taking into account such things as, in risk assessment, the whole goal is to be realistic, and if you're realistic, the issue is not public safety on waste disposal. It's resource denial. And because you've got an opportunity to intervene, to interdict, you don't even call it emergency response, because it's not an emergency. It's something that happens very, very slowly, and you can detect it very, very, very early, and the opportunities are tremendous for corrective action.
So I don't know if any of that kind of dialogue came out in this workshop, but I sometimes think we confuse the public because we present these issues sort of as if they're similar, and I'm talking about waste disposal versus reactor safety, and the differences are extreme. DR. FLACK: Well, no -- DR. GARRICK: And they give us a great opportunity to recast the problem and deal with it in a much different way. DR. FLACK: No, I think that's the reason why we have these five areas. It's not clear that it would be similar to one of the others. DR. GARRICK: Yes. DR. FLACK: It definitely needs to be looked at in its own right, and I think that was one of the areas where it was identified as a specific area. But I think what the message is is that maybe it's more in the vein of public communication of what we already have. We don't need anything new. We need to communicate what we have better, and it's really through the implementation, maybe, of some performance measures that we will know when things have deteriorated, and we can take those actions, and that's a defense-in-depth mechanism, actually. DR. GARRICK: That's an action that's no different from today. We don't -- we don't plant crops in contaminated fields today. We measure; we know what we're doing. We don't use a water supply that's polluted or contaminated, and I don't think we're thinking that way. And that's what we're talking about here, and that's all we have to do to keep it from being a safety issue. So I'm just throwing out this kind of notion that it's not a safety issue. It's an environmental impact issue, and that's all it is for number one. DR. FLACK: Yes. DR. GARRICK: And yet, number one is 90 percent plus of the reason why there is an ACNW or why there is anxiety in the nuclear industry about the waste and that we can't solve the waste problem. DR. FLACK: Well, yes, that may be true. Number two is not so far behind, though; I mean, there is a lot of concern in that area as far as transportation and so on. DR. GARRICK: It shouldn't be, though. DR. FLACK: Yes. DR. GARRICK: It shouldn't be, because we know how to analyze transportation, and we've done a tremendous amount of field work in transportation. We've just done a lousy job of making that field work available. We have run trains into walls; we've run trucks into trains, and we've damaged casks from every angle and perspective we possibly can. And the public does not know that. They have not seen that information, and they have not seen it organized and compiled and structured in a way that communicates what we do know about it. I think transportation is one of the biggest bogeymen that exist in this whole arena, and we as technologists, as agencies, have failed miserably in conveying what is known about transportation risks, and we're spending billions of dollars as a result of it, not only in the nuclear field but in other fields as well. So we, regulatory agencies and technologists, have failed miserably in communicating to the public the transportation issue. DR. FLACK: Yes; I think there's an activity now to issue in plain English that work that had been done on those transportation studies and the work that, I guess, continues to go on in demonstrating that the risk is small. DR. GARRICK: Yes. DR. FLACK: But I -- you know, I think your concern is being appreciated by the staff, and they're moving in that direction, but, you know, maybe they should have been there earlier, but the plain English is supposed -- DR.
GARRICK: No, what I'm trying to do is cut through some of the things here, and, you know, I sometimes think we treat these things as if they're equal when, in fact, the differences are extreme, and if we fixed one of these up there, you know, 95 percent of the problems would go away. DR. KRESS: I agree, and I think when you develop some sort of risk acceptance criterion or safety goal for the waste disposal area, I think it has to be an acceptable frequency of a release exceeding a certain amount of activity at a particular time. I think you have to have all three of those things: an acceptable frequency, a given amount of activity and a given time. You have to discount 10,000 years from now to present cost in some way if you have a cost criterion involved in it. So it does -- I think you're exactly right, that it impacts on how you -- what you say is an acceptable risk, and what you define as your risk is the risk of exceeding a certain frequency of release at a given time. DR. HORNBERGER: Just for the record, I want to disagree with one thing that John said, and that is that although I agree with much of what he said about the differences, I think that one would have to be careful saying that it would under no circumstances be a public safety issue, and one of the comments he made was, for example, we don't drink polluted water today. This is patently untrue if you look on a worldwide basis. One recent example that has made the news is the use of a very -- a water supply in Bangladesh by a very large number of people, and we know full well that it's contaminated with arsenic. DR. GARRICK: And my point there, George -- DR. HORNBERGER: But there's no alternative. DR. GARRICK: My point there, and I've been to Bangladesh, and I know that. DR. HORNBERGER: Yes. DR. GARRICK: But my point there, though, is we have a choice. We have a choice to not drink that water. DR. HORNBERGER: Yes; the people in Bangladesh don't have much of a choice. DR. GARRICK: Well -- DR. HORNBERGER: That's my point. DR. GARRICK: We certainly would have a choice to not consume radiation-contaminated water. DR. HORNBERGER: Well, I mean, it's easy to say that the people in Bangladesh don't have to consume that water. The point is they don't have an alternative. And your point is that people in Los Angeles, for example, can drink bottled water. That's fair enough. So it has to do with what you envision the wealth of the society being at the time, and it's very hard for us to say with certainty that thousands of years from now, people in Nevada would be able to drink bottled water. DR. GARRICK: Well, I know it's an extreme view, but it's a view I take to make a point, and the point is that the opportunity exists to prevent it from being a problem, whereas maybe if you had the right kind of reactor accident, you wouldn't have that kind of opportunity. That's really the only point. DR. FLACK: Okay; so, these, again, are the five areas that had come out as areas to be pursued, possibly using case studies to better understand the regulatory process and to develop safety goals as part of that process. No other questions, I'll move ahead to the last slide I have, follow-on activities. So we're still digesting all that has come out of the workshop, and we still have an open comment period, so we're not at the stage where we have a fully-developed plan, but that's where we're headed.
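[Illustrative note: Dr. Kress's three-part acceptance criterion above can be sketched symbolically. The symbols are editorial, not from any proposed rule or staff document.]

```latex
% Editorial sketch of the three-part criterion: the frequency of a release
% exceeding a given activity Q* at a given time t must stay below an
% acceptable frequency,
\[
F\bigl(Q(t) > Q^{*};\, t\bigr) \;\le\; F_{\mathrm{acc}},
\]
% and, if a cost criterion is attached, a consequence cost C incurred at a
% far-future time t is discounted to present value at some rate r:
\[
C_{\mathrm{present}} \;=\; C(t)\,e^{-rt}.
\]
```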
As far as follow-on activities, we're going to inform the stakeholders and solicit comments by issuing a workshop summary with the transcripts and making those transcripts available to everyone. I believe we will be getting a copy for the committee as well. Draft a plan; that would be something that we're planning on doing within this fiscal year. We'll develop a plan, draft a plan and then, of course, interact with the group, the NMSS steering group, and other stakeholders as part of the development of that plan and then inform the Commission and the ACRS, ACNW of the plan and then what the next steps will be. So, we're hoping to have that developed as part of this fiscal year. So, I would say we're in a stage now where we're about to be off and running. So, with that -- DR. GARRICK: Well, that's great. We may have time to hear from Mr. Bernero after all. DR. FLACK: Yes, great. DR. GARRICK: But let me, before I do that, make sure that the committee is -- doesn't have any pressing questions for John. [No response.] DR. GARRICK: No; thanks a lot. DR. FLACK: Okay; thank you. DR. GARRICK: It was very helpful. DR. FLACK: I appreciate it. DR. GARRICK: Bob? DR. BERNERO: I promise to be brief; nearly impossible. [Laughter.] DR. BERNERO: There is a two-page handout that I brought a bunch of copies of, and I hope you all have it. The handout, titled "A Process for Risk-Informed Regulation of Activities," is intended to amplify the recommendations I made at the workshop just discussed, on April 25th and 26th, and just before I remark on what the handout says, I would ask you to recall that 20 years ago, when the reactor safety goals were being developed, there was an explicit statement in plain language of what the safety objective was, and it varied in expression, but it was essentially that the risk to those people who lived closest to a reactor would be negligible compared to the average risk of accidental death or cancer death, and that led to the hierarchy of one-tenth of one percent and, you know, what database you're going to use, and there's not that much sensitivity, because the Commission could have argued for 10 percent, for 1 percent, and it chose the very conservative value of one-tenth of one percent, and I can attest from my own participation in it that it chose one-tenth of one percent because available risk assessments said yes, that strict standard can be met. It was holding the nose to the grindstone, really. The fundamental point that I try to make with this handout is that one needs to start with a plain language qualitative statement of the safety goals or objectives, and a word of warning: in the NMSS arena, one has to do this unique to the practice -- a term often used in material regulation is a practice. That could be radiography; it could be brachytherapy; it could be low-level waste disposal, but it's a practice that involves the disposition of radioactive material. I suggest in the handout a way to approach the statement of objectives for waste disposal, and one of the alternatives -- and I've used it before -- is: no person in future should suffer radiation exposure from these wastes that we would not find acceptable today. Now, that's an exposure statement. It could also be couched as a -- or couched in risk terms or supplemented by a risk statement, and more on that in a moment. I suggest that in each arena, there needs to be a plain language qualitative statement of objective.
Then, one can develop measures of protection, still staying fairly close to qualitative language, not yet going into quantification, and I suggest, using the example of waste disposal, three measures of protection. DR. APOSTOLAKIS: Before you go to that, Bob, you have another example in your plain language paragraph: releases from this isolated waste should do no harm to anyone. Doesn't that imply that there will be releases? DR. BERNERO: Oh, yes, definitely. DR. APOSTOLAKIS: And we know that that's a fact? DR. BERNERO: Oh, yes, yes. DR. APOSTOLAKIS: We're not trying to prevent those? DR. BERNERO: Yes; there's virtually no guarantee of total containment for the what? Maybe 20 half-lives of all the isotopes. You know, we're quibbling about isotopes like iodine-129 with semi-infinite half-lives, you know, so 20 half-lives of iodine-129 is, I don't know, 300 million years or something like that. DR. APOSTOLAKIS: I still think that your second example is better, though. DR. BERNERO: Oh, I'm not laying claim to any of these, and I point out they don't have the risk terminology. But the three measures of protection, I think, express the idea I'm trying to develop; the first one is that the likely, the expected or the predicted release results in exposures that are clearly acceptable. In other words, if you decommission a reactor and leave a pile of rubble that has detectable or measurable concentrations of radioactivity, you are looking for your best estimate, your most likely outcome, to be clearly acceptable: no harm. You wouldn't do it otherwise. But then, you can go to a second, a risk statement, taking careful account of uncertainties in modeling, in data, in scenario development, that, taking those uncertainties into account, the estimated exposures to the persons in the vicinity will be within the level of public exposure risk that we find acceptable today. Now, there's a little trick here: when you speak of the future exposure, you're forced to consider a predicted exposure or the risk of exposure, whereas what we find acceptable today is a measured as well as a predicted exposure, and that's an important difference. And then, lastly, the third measure of protection I would suggest is to go beyond the risk analysis; probe further for weaknesses, whether it's with an importance analysis or some other measure, because in waste disposal or in other material practices, there are factors that do not lend themselves to robust risk assessment. Radiography, for instance -- I think you'll hear more about it later; it's so crucial to recognize that the individual radiographer is at the heart of the problem, and modeling the radiographer and his or her behavior is very difficult. So the handout goes on into quantitative measures, and the first thing I try to point out is on the second page; you've heard from me before on a ladder of exposures, going from the high exposure down or from a low exposure up, and I list only four rungs on the ladder here, from 1 millirem to 1,000 millirem, and the point I'm trying to make is we glibly speak in risk assessment -- this is performance assessment for waste disposal -- of calculating a mean value, taking due account of the uncertainties in data, scenarios, models and so forth. But no one ever speaks of the uncertainty in the threshold of acceptability. There is no doubt in my mind that people speak of 25 millirem or 15 millirem or 4 millirem as a threshold of acceptability, as if to say, below that number, I'm happy; I sleep well at night.
If it goes above that number, I'm unacceptable; the situation is unacceptable. That's wrong, and all you have to do is climb the ladder one way or the other, and you see that. The standard is uncertain because of the habitual choice of very low, conservative numbers that make us talk about what I consider indistinguishable things like 25 millirem versus 15 millirem versus 4 millirem. To me, it's 10 millirem. There is no distinction over this order of magnitude. One should only speak in orders of magnitude. And so, if you do performance assessment as risk-informing, going back to the first three things, the first three measures of protection, I would see that as a set of findings of acceptability, not excluding the other deterministic findings, like human intrusion being dealt with, things that don't lend themselves to risk assessment. But for the risk-informing of decisions, the findings can be something on the order of: measure number one is the best estimate; measure number two is the mean value of a good performance assessment; and measure number three is the suspicious probing for weakness in the analysis, recognizing that there is a healthy difference -- I shouldn't say healthy -- a substantial difference between what we're stating as the standard of acceptability and what we might recognize as a standard of tolerability. DR. GARRICK: Is what you mean by number three the whole curve? DR. BERNERO: No, it's more than that; it's probing the process itself for -- you know, I've always thought of things that can bypass the event tree, that can render the systematic analysis weak. You should be treating things like the adequacy of site characterization; that's an uncertainty that should be treated in a good -- DR. GARRICK: So it's, in the language of George and Tom here, the unquantified uncertainty. DR. BERNERO: Yes; yes, the things that you really don't feel comfortable handling, and what I'm probing for here is some systematic method to probe for weakness and to look for the edge of the cliff, because the one thing you don't want to do is have an unquantified uncertainty that lurks and may well give you catastrophic results. I have to disagree with you, John, on what you were saying earlier about waste disposal, about if it leaks, we'll detect it, and we can interdict it. DR. GARRICK: I said that option exists. DR. BERNERO: Yes, yes, that option exists, and I would argue that in the plain language statement of no one in future will suffer exposure or risk or whatever the Commission or the regulator would choose to say, there has to be a statement of willingness to depend on stewardship or monitoring or even budgeting in the future. Right now, I spend a lot of my time looking at the Department of Energy's remedial action alternatives, and frankly, they're driven by the availability of funds and other factors that make it highly desirable that one could have passive protection. DR. GARRICK: Yes; well, we could debate this for a long time, and this is not the place to do that, but the point I was making is that it basically is no different than the challenge that the human race has of dealing with any environment. There's nothing peculiar about that environment. DR.
BERNERO: Yes, and you're right in that, John, and it's just that in nuclear waste technology, as against contaminated waste technology or anything, there has been this espousal of an objective to do it passively, without human intervention, for whatever period of time and simply assessing at some interval -- 10,000 years or 500 years or whatever -- how successful has that been, or how successful would we predict that to have been. DR. HORNBERGER: Bob, just a comment. You used the 25, 15 and 4 millirems, and we all know that those numbers have some associations, and I would just point out that the 4 millirem, if it's interpreted as radionuclide-specific, may, in fact, be one or two orders of magnitude lower than 15 to 25. DR. BERNERO: Yes, yes, if you use ICRP-2 and all of that, yes. The point I would make on waste disposal, my understanding -- I didn't participate or listen, but in the Maine Yankee decommissioning, I understand that there has been heated debate about whether 25 millirem is sufficiently protective vis-a-vis 15 millirem, and that, to me, is terminal bottom line disease. DR. GARRICK: Well, hasn't the state already put their limit on it of 10? DR. BERNERO: Oh, yes, most of the states do at 10. DR. GARRICK: Yes. DR. BERNERO: Yes; South Carolina, New York State, and I wasn't, you know, ruling out any. But the -- I think to quarrel about the level of protection provided by 10 versus 15 betrays that the argument is in the wrong forum, you know; the point has been missed. That's not risk-informed. DR. KRESS: Bob, I'm interested in your comment about comparing measured risk versus predicted risk as being comparing apples and oranges to some extent, and with regard, with respect to the high-level waste repository, I would expect an acceptable risk 1,000 years from now ought to be much different than an acceptable risk tomorrow or, I mean, 10 years from now. How do you deal with that in acceptance criteria? How do you factor that kind of time consideration into a risk acceptance criterion? DR. BERNERO: It's factored in by the underlying assumption that there will be no change in the vulnerability of the human body to radiation exposure and also no significant change in the ability of medical science to cure cancer; that radiation induction of cancer and the relative fatality from cancer will not change. There has been substantial change in -- just in my career, I recognize, but the fundamental assumption is today's standards will be appropriate standards to judge the future. Now, as far as oversight is concerned for hazardous waste or for radioactive materials in hazardous waste at CERCLA sites, the institutional mechanism is fix it to the appropriate standards, and for CERCLA, come back every 5 years and look at it; for RCRA, come back every 30 years and look at it, and we'll talk about it then. That's a vastly different thing. The only place in radioactive waste that we encounter something like that is in the rather bizarre case of uranium mill tailings. If you ever get a chance to go look at it, uranium mill tailings are monitored annually after site closure with NRC oversight, and people go out there with shovels and tree plantings to fix them. DR. GARRICK: Bob, it's our fault -- I realize it -- but your two minutes are up. [Laughter.] DR. BERNERO: Thank you. [Pause.] DR. RUBIN: Good morning. My name is Alan Rubin, and Bob is always a hard act to follow, but I'll do my best.
I'm a section leader in the probabilistic risk analysis branch of the Office of Research, and I've had the pleasure and the opportunity to have a number of interactions with the ACRS before, but this is my first time to have some interactions with ACNW and the joint subcommittee. The subject I'll be presenting will be one of the applications of PRA for analyzing the risk from dry casks, and let me first mention that this is an effort that involves a number of participants both in the Office of Research and in the spent fuel project office of NMSS. DR. KRESS: Is this on site at reactor plants you're talking about? DR. RUBIN: This is on-site storage, dry cask storage. DR. KRESS: On-site storage. DR. RUBIN: Dry cask storage, yes; I'll get into the scope and the nature of the program as well, Tom. The participants, the many participants from the Office of Research are myself, Ed Roderick and Chris Rider, also in the probabilistic risk analysis branch; Ed Hackett in the materials and engineering branch and Charles Tinkler in the safety margins and systems analysis branch, and they are with us today. The time frame that we're in now in developing the plan, it's a very good opportunity for us to get feedback from this joint subcommittee on our approach to the plan, which I will be presenting today, so we welcome this opportunity to get your comments. For an outline of what I'll be going over, I'll first present the objective of the project itself. I'll go over the scope as we see it as well as the major planned tasks that are included in the program plan, which will include discussion of potential accident initiators; how we plan to screen those initiators; looking at various initiating event frequencies and sequence frequencies and consequence and risk quantification, and I will also discuss the present schedule and status of our plan right now and overall program. We heard presentations earlier this morning at a fairly high level, sometimes philosophical level on safety goals, and this time, we're going to get into a specific application of PRA, and the objective of this risk analysis is to do a pilot PRA for a specific spent fuel dry cask storage system at a reactor site, and this is a first of a kind, and it's got challenges being a first of a kind, which we have encountered; we need to overcome some of these challenges in carrying out the program. In terms of NMSS, what we expect to provide and NMSS hopes to get out of this program is to get information in several areas, to see whether there's a need to do additional site-specific PRAs; to see whether there's a need to develop any additional data or methods for doing PRAs for dry cask storage and to see whether some additional analysis would be required. And in the longer term, NMSS would like to be able to use this information to provide input to the safety goal assessment; to risk-informing 10 CFR Part 72 as well as for the inspection programs for dry casks. [Pause.] DR. 
RUBIN: The participation in the program involves all three divisions in Research, and it's a team effort: the division of risk analysis and applications has the lead for the systematic analysis and integration of the PRA as well as coming up with frequency and probabilistic assessments; the division of engineering technology will participate in coming up with analyses and engineering assessments of the materials of the multipurpose canister and the cask as well as the overpack structure -- and I'll give a figure later on, a diagram, to tell you a little bit more about the cask itself -- and develop some thermal analysis. The division of systems analysis and regulatory effectiveness will provide assessments of the radiological release and dose assessment to the public in the event of an accident. It's very important that we coordinate this program very closely with the spent fuel project office in NMSS, and we've done so up to this point, and we expect to continue to do so. This is in terms of both developing the program plan itself as well as carrying out the plan. We also anticipate that we may need some information from licensees to provide dry cask design or operational data and, in effect, perhaps some analyses that support some of their information, their safety evaluation report. We expect that we will also need some contractor support in the area of human reliability analysis for -- to address the handling and transport aspects of transporting the fuel and the cask itself and perhaps other additional contractor support as well. DR. APOSTOLAKIS: So you can do everything else; start your thermal analysis, everything except human reliability analysis? DR. RUBIN: Well, we know that we may need some analysis as well. It's going to depend on what's available from the contractors, from the -- not from the contractor; from the analyses already done for design basis accidents by the vendor, by the licensee, and see whether we can extrapolate that or whether we need to do some additional analyses, and we're wrestling through right now to see whether we can do that in-house or we'll need contractor support. DR. GARRICK: Did you say that some of the same team members are involved as were on the spent fuel pool? DR. RUBIN: Yes; that is correct; same names, same guilty parties. DR. GARRICK: Yes. DR. RUBIN: That's the team. Let me go over what's entailed in the scope of the project, and on the recommendation from NMSS, we selected the Holtec HI-STORM 100 dry cask for analysis for this pilot PRA, and that's based on the potential usage of this dry cask as well as the availability of data that's been submitted in the licensee's application. As for the different modes of the analysis, the analysis itself will include handling of the fuel and the dry cask; onsite transport, transport of the cask to its storage pad, as well as the long-term storage onsite. It does not go beyond that; it does not include transportation PRA, which has been done separately -- there is a separate risk assessment for that. We will need to select a site, and most likely, it will be a generic site such as we've done for the AP600 reactor design, which should encompass as large a number of site characteristics in the country as possible. We don't include all of them, but a large percentage of the various characteristics of the sites.
The types of events that we will consider include normal and accident conditions such as -- including design basis and beyond-design-basis accidents; site-related phenomena such as earthquakes, flooding, high winds, as well as man-made incidents that can occur during handling of the cask, like cask drops or other handling accidents. We also will look at the condition of the fuel in the canister itself in the cask; whether there are some preexisting fuel failures; the condition of the clad, which will factor into the assessment in terms of consequences and risk from radiological release from the fuel and from the cask. And a real important point that's part of the plan is to say, how are we going to assess the results? You know, what are the criteria? We talked about that this morning, with NMSS developing safety goals -- whether it be the probability of release of radioactive material to the environment or radiation-induced latent cancer fatalities to the public -- and we expect to work very closely with NMSS to provide input to specify these measures of success related to the safety goals that you've heard discussed this morning. That's a broad picture of what the scope of the program, the PRA, will entail. For some background, I think this sketch will be useful just to see what we're talking about. DR. GARRICK: I may have missed something, Al. DR. RUBIN: Okay. DR. GARRICK: Are you calculating radiation-induced latent cancer fatalities? DR. RUBIN: If the release aspects include that, we may be calculating that. DR. GARRICK: Okay. DR. RUBIN: Yes. DR. GARRICK: All right. DR. RUBIN: Depending on the release, and we may get into the meteorology and -- DR. GARRICK: I see. DR. RUBIN: Yes. DR. GARRICK: So it depends on your source term. DR. RUBIN: That's right. DR. GARRICK: Yes; okay. DR. RUBIN: Yes. First, the fuel itself is put into this multipurpose canister. The fuel assemblies are inserted in that canister. It holds up to 25 BWR assemblies -- 24 PWR assemblies or 64 -- 68 BWR assemblies. The canister is drained of water and pressurized and filled with helium. The purposes are to prevent corrosion of the steel on the multipurpose canister itself as well as to enhance heat transfer to the canister. When the canister is inserted, the overpack itself -- DR. APOSTOLAKIS: Why is it called multipurpose? DR. RUBIN: It's just the name given. It's multipurpose because it can be used for storage as well as transport. The overpack is a large part of the structure. That's approximately a 19-foot high structure, 11-foot outside diameter. It's got an annulus of steel both inside and outside that diameter, and it's filled with concrete about 2.5 feet thick, and the overall weight of this structure when filled is about 180 tons. In order to promote cooling in the long term during storage, there are inlet and outlet vents, four of them, around the periphery of the multipurpose -- of the overpack itself. So it's all a natural circulation vented system; no fans, no pumps, no active systems. It seems like a relatively straightforward PRA kind of analysis compared to a reactor. But it's -- even though it seems straightforward, this is the first time that it's being done. And the first step in the program plan is to identify the potential accident initiators, and that was brought up this morning as an important point. We want to be sure we have as complete a set as possible.
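[Illustrative note: the overpack figures just quoted -- 19 feet tall, 11-foot outside diameter, roughly 2.5 feet of concrete, about 180 tons loaded -- can be roughly sanity-checked. The concrete density and the allowance for steel, canister, and fuel below are editorial assumptions, not numbers from the record.]

```python
# Rough sanity check of the overpack figures quoted above. Concrete density
# (~150 lb/ft^3) and the steel/canister/fuel allowance are assumptions.
import math

height_ft = 19.0
outer_radius_ft = 11.0 / 2                # 11-foot outside diameter
inner_radius_ft = outer_radius_ft - 2.5   # ~2.5-foot-thick concrete annulus

concrete_volume_ft3 = math.pi * (outer_radius_ft**2 - inner_radius_ft**2) * height_ft
concrete_tons = concrete_volume_ft3 * 150 / 2000  # 150 lb/ft^3, 2,000 lb per ton

print(f"concrete volume: {concrete_volume_ft3:,.0f} ft^3")  # ~1,270 ft^3
print(f"concrete weight: {concrete_tons:,.0f} tons")        # ~95 tons
# Steel shells, lid, loaded canister, and fuel plausibly make up the balance
# of the ~180-ton loaded weight quoted in the transcript.
```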
We've had preliminary discussions with the spent fuel project office staff to identify issues that they have thought about, and we'll be adding some other issues, potential challenges to the functions of the cask leading to a potential release. And I'll get into some of these initiators in my next slide. We'll assess the initiators that have been identified in other related studies or PRAs and see if they're applicable to the dry cask, and we'll include those as appropriate. And it comes down to pretty much categorizing the many initiators into their effect on the mechanical impact of the cask, the thermal impact and the impact on criticality, and perhaps some others, and for each of those accident-type initiators, we'll address the various system modes: the handling, the onsite transport of the cask and storage for the 20-year license of the cask itself. To go through these preliminary initiating events that we have looked at, I first want to make it very clear that many of these events are included in the design basis for the dry cask, but we need to look, for PRA purposes, beyond design basis types of events, and that's why we're including the kinds of events that you see here. First of all, in the categorization of mechanical impacts, we'll be looking at handling accidents where the cask could be dropped either before or after sealing of the canister, and I should have mentioned -- let me go back for a moment -- the canister itself. The canister is sealed, and there is an overpack seal for the whole dry cask system itself. So we will be looking at potential weld failures of the seal on the multipurpose canisters as an example of a failure or release path. As an example for the drop of a cask, the current design basis is that the cask is designed for a drop height of 11 inches, right about a foot. So are there some ways or some scenarios where, you know, that height, by human error or something else, could be exceeded? And those are the kinds of things we'll be looking at in the human reliability analysis area. In transferring the cask, the cask potentially could tip from sudden movements or stops during onsite transport; in the storage of the cask, the long-term storage, could the cask be hit by a tornado-generated missile large enough to impact the cask, knocking it over, or be impacted by a truck or an aircraft accident? Could the cask tip over from a seismic event? That's not mentioned on the slide, but it is included in our list of events that we're looking at. [Pause.] DR. HORNBERGER: So your tornado scenario is the cask tipping over rather than being penetrated? DR. RUBIN: Yes, yes; I imagine it would be hard to penetrate a steel barrier with two and a half feet of concrete. These external events that we're looking at, by the way, are very similar and parallel to those of a reactor analysis in the individual plant examination of external events program, the IPEEE program. DR. HORNBERGER: Of course, there are reports in tornadoes of a piece of straw being driven through concrete walls. DR. RUBIN: And it would have to -- we have not considered the straw impact on a concrete wall. We are looking for some feedback during this discussion today. Thermal accidents are another area; obviously, as I pointed out, the cask has venting for heat removal. Could there be some scenarios where high heat load assemblies are inadvertently loaded into the cask, having higher heat loads than would be anticipated?
In transferring the cask onsite, we're looking at a fire from ignited fuel from the transport truck. DR. APOSTOLAKIS: I don't understand that. Can you explain that scenario a little bit better? DR. RUBIN: Which one? The transfer? DR. APOSTOLAKIS: The transfer, yes. DR. RUBIN: The cask is transferred by vehicle to the storage pad, and if that vehicle has an accident, and it's got a certain amount of fuel in it, what kind of thermal loads from the truck -- DR. APOSTOLAKIS: Oh, the fuel. DR. RUBIN: The truck, the fuel. DR. APOSTOLAKIS: Oh. DR. RUBIN: The gasoline -- DR. APOSTOLAKIS: Okay. DR. RUBIN: -- in the truck, okay? All right; and long-term storage: vent blockage, perhaps, from flooding, long-term flooding, for example -- how long that could occur and what temperature conditions would then result in the fuel in the assemblies, and what impact would that have on the integrity of the fuel in the canister? Look at high ambient temperatures associated with a fire; looking at perhaps a fire associated with a crash of an aircraft. And then, finally, we're including criticality events, where, from handling, there's the possibility of highly-enriched fuel being loaded in a cask, or water ingression from a flood with a failure of the overpack and a failure of the multipurpose canister -- flooding and causing criticality, perhaps. There's a fairly long list, and there are some more details that I'm not going into today, but a real important part is that we expect in our next step to do a screening. DR. APOSTOLAKIS: Your event trees will be fairly simple. DR. RUBIN: Oh, yes, oh, yes, and that's where we're looking at four types of event trees -- you know, there could be many possible ways that you could impact the thermal loading by blocking the vents, anywhere from flooding to a bird's nest that's not found for a while, but they basically are all in the same event trees; that's right. We're boiling down a lot of these different types of sequences into a short, limited number of event trees. And a real important part in the first phase of this program is the preliminary screening and consequence analysis. The purpose of this is to eliminate any inconsequential initiating events from further consideration based on the sequence frequencies or the release magnitude, and the release magnitude would be based on the extent of cask failure, and the purpose of this screening study also is to see whether we would need more information for some additional, more detailed analysis. And I'll get into this screening study a little more in the next slide. DR. GARRICK: Do you have a sense for which scenario is going to be your bounding scenario? DR. RUBIN: I mean, my gut feel is that, you know, you have a passive cask sitting there, and we really need to focus on the human aspects of the handling and transport. We don't have a bounding scenario right now, but I think that there has not been much done on the human reliability of the cask handling, and we intend to look at that pretty closely. DR. GARRICK: Thank you. DR. RUBIN: As far as the steps going into the screening and preliminary consequence analysis, we are identifying the information available to come up with the initiating events and the event trees and fault trees. We'll assess the sequence frequencies and the consequences on a limited basis for screening purposes, and we would eliminate any insignificant sequences from further consideration. There are basically several ways to deal with this.
For a given initiating event, perhaps the cask doesn't fail or the canister doesn't fail, and a third screening approach would be, even if there were some kind of mechanical failure, would the release have an adverse impact on the public? Clearly, there are data and methods going into this. We'll have some uncertainty and sensitivity analysis that we'd expect would be part of this screening study, and as we go along through this, particularly since this is a first of a kind, we expect to have interactions and peer review and comments as we go along, and it could be that, once we're finished with the screening study, we may have eliminated a lot of scenarios. DR. KRESS: All of them? MR. RUBIN: I can't tell you now. Stay tuned. If I know that -- DR. KRESS: You're starting out with some sort of a measure of what release would be acceptable to you? MR. RUBIN: That's something that, as I mentioned earlier, we're going to need to determine with NMSS -- what measure of release. Do we use the same kind of measure that's used for reactors? Given that there is nothing better, we may try that, but I think that's yet to be determined. If there are some sequences or events that don't screen, then the next step would be a more detailed frequency quantification for those sequences, fairly straightforward: looking first to see what kind of data are available, the benefits and costs of getting that data, refining the event trees and fault trees and computing sequence frequencies and sensitivity studies as necessary. The purposes of this more detailed analysis would be to look at the radiological consequences, the calculations for those important sequences that don't screen, and determine the releases from the cask and the off-site consequences, as well as the risk calculations. DR. APOSTOLAKIS: You will develop frequency-consequence curves? MR. RUBIN: Yes. Come up with overall risk, yes. DR. APOSTOLAKIS: What's DCS, by the way? MR. RUBIN: Dry cask system. DR. APOSTOLAKIS: Okay. DR. KRESS: You're going to do this on a cask basis or you're going to look at the whole -- DR. APOSTOLAKIS: The size of casks. DR. KRESS: -- storage? MR. RUBIN: We're looking at a site where there may be on the order of tens -- fifty, up to probably a maximum of 100 casks at a site. DR. KRESS: They're all going to undergo the accident at the same time? MR. RUBIN: Well, not necessarily, no. We're treating it individually, but if there is some -- a sequence that would be a common mode failure, like a seismic event tipping over casks, then it would impact more than one -- potentially could impact more than one cask. DR. GARRICK: But if your screening indicates that there is only one scenario that really can result in any kind of a consequence problem, does that mean that's what you'll analyze, or will you still analyze all the scenarios? MR. RUBIN: No, if we can screen them out, we will not continue, no. Our attempt is to get some results as early as we can, you know, with some reasonable information, and not do more than is necessary. MR. HORN: What happens if they all screen out? MR. RUBIN: I'd say, then, the next phase doesn't -- we stop right there. We wouldn't have a need for more detailed analysis. But let me go on.
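[Illustrative note: the screening logic just described can be sketched in a few lines of code. This is an editorial sketch, not the staff's method; the sequence names, frequencies, and cutoff values are all hypothetical.]

```python
# Editorial sketch of the three screening approaches described above: drop a
# sequence if its frequency is negligible, if the cask/canister boundary does
# not fail, or if the release is too small to have an adverse offsite impact.
# All names and thresholds are hypothetical.
from dataclasses import dataclass

FREQ_CUTOFF = 1.0e-8      # per cask-year (hypothetical)
RELEASE_CUTOFF = 1.0e-3   # fraction of cask inventory (hypothetical)

@dataclass
class Sequence:
    name: str
    frequency: float          # per cask-year
    confinement_fails: bool   # does the canister/overpack boundary fail?
    release_fraction: float   # inventory fraction released if it fails

def screens_out(seq: Sequence) -> bool:
    """True if the sequence can be dropped from detailed quantification."""
    if seq.frequency < FREQ_CUTOFF:
        return True                                # too rare to matter
    if not seq.confinement_fails:
        return True                                # no release path
    return seq.release_fraction < RELEASE_CUTOFF   # inconsequential release

candidates = [
    Sequence("cask drop during handling", 1e-5, False, 0.0),
    Sequence("seismic tip-over, beyond design basis", 2e-7, True, 1e-2),
    Sequence("tornado missile penetration", 5e-10, True, 1e-2),
]

for seq in candidates:
    if not screens_out(seq):
        print("retain for detailed quantification:", seq.name)
```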
You're leading into my next slide, which is potential additional analyses or assessments which may be a follow-on to this first pilot PRA: looking at various fuel conditions such as high burn-up fuel; the number of casks at a site -- there's a proposal for a private spent fuel storage facility in Utah -- and looking at various cask -- different designs. That's something that could follow on to this. We're looking at one specific design for this, the Holtec HI-STORM. Other designs have different features, and we want to be able -- maybe want to be able to see how that could impact our screening analyses, for example. The final slide is our schedule and the status. Where we stand right now -- we've developed a draft program plan in Research that we've discussed with NMSS, and we're in the process of getting ready to send that formally to NMSS. The project scope, identifying the site characteristics, the design, initiating events, we intend to have done in the August timeframe, and we would hope, depending on the available resources and data, to have a draft report available about a year from now on the screening assessment -- and that depends on the availability of data and what kind of information we need as we go along -- and then any follow-on, additional sequence calculations would need to be determined following the conclusion of the screening part of this project. That hopefully will give you an overview of what our plans are and where we're going. DR. GARRICK: Are you finding anything unique here with respect to methodology requirements? In other words, are you at a loss for methods for any part of the analysis? MR. RUBIN: I think what we're focusing on -- we want to have a complete set of the sequences. I'd say no; I don't think we've hit that kind of problem in any area, but we want to include the sequences and the data. If there are no data, what are we going to do? DR. GARRICK: Yeah. MR. RUBIN: Are we going to do some analyses? Are we going to base our assumptions on some extrapolation of analyses that have already been done for design basis-type events? That's what I see as probably the most resource-intensive, as well as the human reliability aspects. DR. KRESS: I think you'll be faced with some of the same issues pointed out with the spent fuel pool study: what's the source term; how you deal with the fire-driven event; and what are the things you don't know about, like the effects of hydrided clad. I think the issues are the same for the cask, plus a few more, and that's what do you know about long-term deterioration of the cask that may cause internal corrosion events -- driven by moisture and maybe hydrogen that gets produced in the process. MR. RUBIN: I should mention at this time, in response to that, there are some ongoing related activities that Research has in support of NMSS and dry casks. Now, the results of those programs -- probably most of them will not be available in time for the screening study, but for example, looking at -- I think it's the Surry cask, fuel in the Surry cask -- looking at the condition of that fuel, they took an initial look; the fuel cladding looks fairly good, but they haven't done analyses on the fuel itself yet. DR. KRESS: The problem with the screening that I see is how do you determine the initiating frequency of a fire that's a self-igniting fire? MR. RUBIN: A self-igniting fire -- okay. DR. KRESS: You know, I see how you can handle a fuel spill from a truck.
You could probably come up with the frequency of that, but a self-ignited fire that may be driven by the -- by a change in the heat transfer properties of the system so that it overheats to some ignition temperature, that's driven by the hydrided state of the fuel -- you know, how do you get a frequency? MR. RUBIN: Well, you have an inert atmosphere. You have helium. It's a helium-filled canister. So, you would have to have a sequence where the helium has escaped, you haven't detected it for a long time, and there are inspections that go on in storage to look for those kinds of things, to look to see if the vents are blocked or not. DR. KRESS: It may be driven by the fact that you have that helium. MR. RUBIN: Oh, yes. Oh, yeah. DR. KRESS: Could very well be. MR. HORN: Just for my education, you mentioned that the cask drop was 11 inches, but the multi-purpose canister is tested at a higher elevation, is it not? MR. RUBIN: Oh, yes, it is. That's about three or four feet. The transfer cask -- MR. HORN: Yeah. MR. RUBIN: Its drop height is about, I think, 40-some-odd inches. When I said the 11 inches, that was for the whole overpack. MR. HORN: There's also a design basis fire test? MR. RUBIN: Yes, there is, and it's the amount of fuel that's in the truck, and the question is if there's more than that -- those kinds of things are in the design basis, which we already -- you know, that information has been provided; at least the results of those analyses have been provided to the agency. DR. GARRICK: As long as it's not located next to the fuel tanks for the diesel generators, huh? MR. RUBIN: A lot of what-ifs. DR. GARRICK: All right. Any other questions? [No response.] DR. GARRICK: We'll enjoy hearing a progress report as you start. MR. RUBIN: Okay. We intend to keep the joint subcommittee informed. DR. GARRICK: Okay. DR. KRESS: Given the status and progress of Yucca Mountain, I think this is an important study, because we're probably going to have a lot of dry cask storage on-site. MR. RUBIN: And if we need to go into the -- you know, the more detailed sequence analysis, that's going to take an extended -- probably an extended period of time. DR. GARRICK: Thank you. Thank you very much. MR. RUBIN: You're welcome. DR. GARRICK: It's a remarkable event, but we're on schedule. DR. KRESS: You run a lot tighter meeting than George does. DR. GARRICK: George has a lot more patience than I have. Okay. Why don't we adjourn for lunch, then, unless -- and we're going to reconvene here and discuss protocol as soon as we grab a sandwich and bring it back here, 12:15. [Whereupon, at 11:55 a.m., the meeting was recessed, to reconvene at 1:00 p.m., this same day.] . A F T E R N O O N S E S S I O N [1:00 p.m.] DR. GARRICK: The meeting will come to order, and this afternoon we're going to talk about risk-informing fuel cycle programs, etcetera, etcetera, and Mr. Sherr is going to take the lead on it, I gather. MR. SHERR: I'm going to give a quick overview on the background of the fuel cycle programs for risk-informing the regulations, and Dennis Damon, who is now with the NMSS work group but until recently has been in FCSS and been an integral part of the work that we've been doing in this area, will be providing more detailed information. DR. APOSTOLAKIS: Now, what is the definition -- in the documents I have, there was a very long paragraph where one can find the definition of byproduct material, but the definition itself was not given. DR. GARRICK: It's in the Atomic Energy Act. DR.
APOSTOLAKIS: I have to go back to that. It gave me all the paragraphs, but special nuclear materials are uranium and plutonium? MR. SHERR: Enriched uranium. DR. APOSTOLAKIS: Enriched uranium and plutonium. MR. SHERR: Right. DR. APOSTOLAKIS: Okay. MR. SHERR: And other materials as the Commission may determine, and they haven't determined any so far, since 1954. The major activity that's going on in terms of risk-informing activities in the fuel cycle area -- and I guess maybe it's worthwhile to just step back a second and say what we mean when we say fuel cycle area. The regulations and their development would apply to essentially the existing fuel cycle fuel fabrication facilities. They will also apply to the plutonium mixed oxide facility that's currently planned, and we expect to be receiving a license application before too long. So, those are the facilities that are immediately expected to be subject to this regulation. This rulemaking has been going on for quite a long time. It was initiated at the Commission's request in 1993, and we are nearing the final stages. The final rule package is due to the Commission within the next couple of weeks, May 15th. In fact, when I leave here, I'm going to go back and try to get the package out of the office. In parallel with this effort is an effort that's directed to revising the oversight program, similar to what has been done in NRR for the reactor area, and this program is also directed to bring a more risk-informed perspective to the inspection program. First slide. There are two events that occurred that significantly affect the current rulemaking. One was in 1986, the Sequoyah Fuels accident, and that didn't directly affect this rulemaking other than the fact that it raised the issue of whether NRC is responsible for chemical safety as well as radiological safety, and there was a lot of discussion about this, congressional hearings, and the result of all that was a memorandum of understanding between the Nuclear Regulatory Commission and OSHA which limited NRC's responsibilities in terms of responsibility for chemical consequences and protecting against them but, at the same time, identified certain responsibilities. This rulemaking addresses those responsibilities, and later on, when Dennis is talking about the performance requirements of the rule, you'll see how it does that. DR. GARRICK: It should be pointed out, of course, that this UF-6 release fatality was not a radiation fatality. MR. SHERR: No, that's right, and that was the controversy at the time -- which agency should have done something about this type of thing, and there were a lot of different views on that and a lot of pointing in different directions. In 1991, there was a near criticality incident at one of the fuel fabrication facilities, and following that, there was a significant review that was conducted, and a lot of problems were identified with the way safety programs were being implemented at facilities. It wasn't that they were inadequate, but one didn't have the confidence that they were always going to be maintained.
Basically, the fuel cycle licensing approach was done on a renewal-to-renewal basis, and whatever changes happened to the safety program between renewals were outside the oversight of NRC, and how well there was control in between times varied depending on the specific circumstances, and in the particular situation at the facility where they had the near criticality incident, there was a case where controls were changed, and so, there was a clear need for better configuration management. As I said, in 1993, the Commission -- after a few other things were done, the Commission said it's time to pursue a rulemaking essentially -- they didn't use these terms in those days -- with a focus on the conduct of an integrated safety analysis and, as part of that -- as I say, they didn't use this terminology -- a risk-informed, performance-based rule. DR. APOSTOLAKIS: Now, at that time, I guess, the terminology was different, too, because today we would not say increased confidence in margin of safety; we would simply want some risk acceptance criteria, wouldn't we, Tom? That's really what it means. If you are quantifying risk, you want some risk goals. MR. SHERR: Yeah. DR. APOSTOLAKIS: Yeah. MR. SHERR: Yeah. And that's exactly what the rule does, and that will be a significant part of the presentation as you -- if you've looked ahead in any of the view-graphs, we'll be discussing those risk goals. But the integrated safety analysis is essentially a systematic review of the hazards and a means for identifying controls. I brought with me the definition. I'm not sure it's in any of the material that you have, and it's defined in the proposed rule as a systematic analysis to identify facility and external hazards and their potential for initiating accident sequences, the potential accident sequences, their likelihood and consequences, and the items relied on for safety. So, basically, it's a systematic safety analysis that simultaneously considers the radiological, nuclear criticality, fire, and chemical safety hazards. DR. GARRICK: What is it that you're defining? MR. SHERR: Integrated safety analysis. That's an inherent part of the whole rule and the focus. From the -- just to give a little -- I'm not sure how familiar you are with the fuel cycle industry as compared to the reactor part of the industry, but it's worthwhile to note two significant differences between the fuel cycle and the reactor area. One is that the fuel cycle facilities have diverse processes. The equipment varies from facility to facility. It's not a situation where, if you've gone to one reactor of one type, it's pretty much the same as the next one in terms of general equivalence. The other thing is that it's a less contained environment. There are a lot more administrative actions and, accordingly, administrative controls that are involved as compared to engineering controls, and these things affect both the database that might be available in terms of equipment reliability, as well as the ability to quantify the effectiveness of controls, the administrative controls. Next slide, please. In terms of the major elements of the rule, as we mentioned, the focal point of the rule is the requirement for licensees to conduct an integrated safety analysis, which we refer to as ISA.
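[Illustrative note: the ISA definition just quoted maps naturally onto a simple record structure -- hazards, an accident sequence, its likelihood and consequences, and the items relied on for safety. The sketch below is an editorial illustration; the field names and category labels are hypothetical, not from the proposed rule.]

```python
# Editorial illustration of what one ISA accident-sequence entry carries,
# per the definition quoted above. Field names and bins are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ISASequence:
    hazard: str          # e.g. radiological, criticality, fire, chemical
    sequence: str        # description of the accident sequence
    likelihood: str      # qualitative bin, e.g. "unlikely", "highly unlikely"
    consequence: str     # qualitative bin, e.g. "low", "intermediate", "high"
    items_relied_on_for_safety: List[str] = field(default_factory=list)

entry = ISASequence(
    hazard="criticality",
    sequence="over-batching of enriched uranium in a dissolution step",
    likelihood="highly unlikely",
    consequence="high",
    items_relied_on_for_safety=[
        "favorable-geometry vessel",               # engineered control
        "double-verification batching procedure",  # administrative control
    ],
)
print(entry.hazard, "->", entry.items_relied_on_for_safety)
```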
As I mentioned, the integrated safety analysis essentially identifies the accident sequences of concern, and for each one of those accident sequences, identifies the items relied on for safety that will either prevent that accident from happening or sufficiently mitigate its consequences so it is reduced in terms of the level of concern. DR. APOSTOLAKIS: So, how is the ISA different from a PRA? MR. SHERR: Well, I think, in overall concept, it's the same, and in fact, PRA -- a PRA is one example of a methodology that could be used, and using event trees is certainly what we would expect for complex processes, and I think the biggest difference -- and this will be an issue that Dennis will be talking about later -- has to do with the degree of quantification that we expect from the process. DR. APOSTOLAKIS: In order to avoid the proliferation of terms, why can't we just call it, then, a PRA and then define different types of scope where in some instances perhaps you don't want to go into detailed quantification and in others you do? I mean we already have level one, two, and three for reactors which are PRAs of different scope. One stops at the core damage event, the other proceeds to containment, and the third one is, you know, the full PRA with risk estimates. MR. SHERR: Yeah. DR. APOSTOLAKIS: I think it would be important to harmonize terminology, don't you think? MR. SHERR: Well, this terminology has been used in the fuel cycle area now for quite a number of years. A number of licensees already have license conditions to be conducting ISAs. They're not geared to any particular performance standards that the rule would establish, but that's an open question, I guess. DR. GARRICK: What you're really saying is that the ISA can be -- can embrace either a deterministic approach or a probabilistic approach. Is that right? MR. SHERR: Well, it can use quantifiable ways of assessing likelihood or less quantifiable ways of assessing likelihood, and as I was saying, Dennis is going to be addressing that particular aspect, and I think that's the major difficulty in dealing with the fuel cycle facilities, is that it's much more difficult to do a quantified safety analysis. DR. APOSTOLAKIS: But then if you don't do that, how can you demonstrate that you have increased confidence in the margin of safety? What metric would you use for the margin of safety, or will it be qualitative and say, gee, I have an extra barrier, so I have increased confidence? Is that really what it is? MR. SHERR: Dennis? We're skipping into the more detailed part of the presentation. MR. DAMON: Well, my own impression is that the increased confidence in the margin of safety is not really about the assessment of risk. It is about the fact that the knowledge both of the plant staff and of the NRC staff as to what you actually have in the way of a safety design was not very complete, not well documented, not analyzed systematically. Consequently, if you ask somebody the question, what is the risk of this facility, they say, well, I think it's okay, but I'm not very confident. It's the second order of uncertainty. The word "uncertainty" has been mentioned many times here. It's the uncertainty level that was high. DR. GARRICK: It sounds like what you're attempting to do here is to provide more flexibility in the rule than you think would be provided if you used the words "PRA." MR. SHERR: That's true. MR. DAMON: Right. Because the rule, as you will see, is mandating things.
It's not like PRA has been used elsewhere as a form of information for guidance. It's a requirement that they do certain things. DR. GARRICK: Clearly, a PRA would be an acceptable ISA. MR. DAMON: Yes. MR. SHERR: Clearly. DR. GARRICK: I'm with George. I don't quite understand why we went in that direction, but I'm not sure we can do much about that at this point, and I am sympathetic a little bit, because this evolved with time, before probabilistic analysis was really a part of the process. DR. APOSTOLAKIS: That reminds me of the individual plant examination situation, where, when the generic letter was published in 1988, they deliberately avoided the word "PRA," because some people were arguing that you could do this, you could identify the vulnerabilities using other ways, other methodologies. Now, six, seven years later, we got all the IPEs, and there wasn't a single one that did not use PRA. So, the reality of it was that, really, the generic letter asked for a PRA, at least a level one PRA. Is that correct, Tom? DR. KRESS: Yes. DR. APOSTOLAKIS: Your impression, too? So, why perpetuate these things? There is some fear, I guess, when it comes to PRA, that people will have to produce an 11-volume document with all the details in reactors. I mean if you don't have a system that has the highly redundant and diverse systems of a nuclear power plant, your PRA will be much simpler, but it will be a PRA. I mean the reason why those event trees go around the room is that, in reactors, you have all these, you know, redundancies and opportunities for operators to intervene and do things and so on. If you didn't have those, then maybe one page would be enough. But it's a matter of scope. You know, I appreciate the fact that, you know, these words have been used already in the regulations, but at some point we have to start creating some harmony. DR. GARRICK: What I guess you're telling us is that integrated safety analysis is more of a process than a prescriptive analytical activity. You don't -- we'd get real concerned if it was so prescriptive that it was something very different from a PRA or precluded a PRA being an acceptable form or an acceptable interpretation of an integrated safety analysis, but it sounds like you've accommodated that. DR. APOSTOLAKIS: In some instances, the way I understand it, the ISA would allow you not to calculate the consequences, just scenarios, and in other instances, it would not even ask you to produce probabilities, but in other words, you take the complete triplet, and instead of subtracting things, for some reason, which remains to be determined -- DR. GARRICK: That's why they call it safety and not risk. DR. APOSTOLAKIS: Yeah. But the idea is there. It's the same thing, really. DR. GARRICK: Yeah. MR. DAMON: Part of the reason why there's a difference in terminology is historical; the methodology evolved not from NRC groups but from the chemical industry. DR. GARRICK: Right. We understand that. DR. APOSTOLAKIS: But even when they borrowed our PRA, they called it QRA. MR. HORN: Maybe we could introduce that. DR. APOSTOLAKIS: Yeah. MR. SHERR: The ISA guidance document that has been developed leans heavily on the chemical industry guidance document for hazards analysis, which, as you say, includes a broad spectrum of specific approaches. DR. APOSTOLAKIS: But again, you know, let's not be in awe of that. The truth of the matter is we're ahead of them when it comes to safety issues. I mean that's the truth. We are actually quantifying risk.
So, the fact that the chemical industry is doing that doesn't mean anything to me. DR. GARRICK: I think we ought to hear out their story, and I have some of the same anxieties, but I believe history has put us where we are, and to try to undo it would be quite difficult. DR. APOSTOLAKIS: I appreciate your position. I'm just saying that maybe it's time we started thinking -- DR. GARRICK: I'd feel much better if, everywhere I see ISA, I can put PRA. DR. APOSTOLAKIS: Yeah, as long as we understand that the scope may be different depending on the situation. DR. GARRICK: Because it's not risk-informed if we don't deal with the triplet, as George says. DR. APOSTOLAKIS: That's right. DR. GARRICK: Okay. MR. SHERR: So, the ISA identifies the basic controls that are either going to prevent or mitigate -- prevent the accidents or mitigate their consequences, and in addition, the rule requires the facilities to maintain management measures that ensure that the items relied on for safety are, in fact -- will be, in fact, available and reliable, and essentially, the bottom line of the rule says that the results of your ISA have to demonstrate that the performance requirements of the rule are satisfied. DR. APOSTOLAKIS: So, let's see now what that means. Are you going to talk about it later? MR. SHERR: Yes. DR. APOSTOLAKIS: Okay. MR. SHERR: Very shortly. DR. APOSTOLAKIS: It's too prescriptive. DR. GARRICK: Well, one of the things I notice in the material that we were supplied, which was the rule, etcetera, it says that -- and maybe we don't need to go through all of this -- there are four major steps in performing an ISA, and to deal with one of the questions that George raised, step number three says determine the consequences of each accident that has been identified; for an accident with consequences at a high or intermediate level, as defined in the regulation, the likelihood of such an accident must be shown to be commensurate with the consequences as required in 10 CFR 70.61. So, you've got it all mixed in here, which makes even more of a case for why ISA, why not just PRA, but anyway, I think we understand where you're going. MR. SHERR: Well, I think the important thing is -- I mean without regard to what we've called the terminology -- is that the performance requirements are in terms of risk. They're saying the risk needs to be limited, and as Dennis will be going into more detail, we've identified two categories, high consequence and intermediate consequence events, and essentially the requirements for high consequence events -- they have to be highly unlikely, and intermediate consequence is unlikely, and those terms are not defined in the regulation, they're discussed in the SRP, and again, that's part of the detail. DR. KRESS: What do you do with the consequence events that are in between those two? MR. SHERR: That are below those? DR. KRESS: Fifty rems. MR. SHERR: They're mutually exclusive. DR. KRESS: Oh, I see. MR. SHERR: But it's just a question of what's below those levels, and that's treated as part of Part 20 requirements. Part 20 still comes into play. This is only dealing with the accident. DR. APOSTOLAKIS: What's the limit for workers at reactors? Is it 100 rem? It's 25, isn't it? Five rem. So, why is this 100? Let's go back to the -- MR. SHERR: Actually, Dennis skipped ahead here. DR. APOSTOLAKIS: Yeah. Dennis doesn't know us very well. Okay. So, for workers, it's 100 rem or more. I don't understand that. Must be highly unlikely. MR. DAMON: These are accident risks.
I mean the 5 rem is an occupational dose. DR. APOSTOLAKIS: Oh, so, this is accident. MR. DAMON: These are accidents. DR. APOSTOLAKIS: So, you are requiring, then, accidents that lead to this dose to be highly unlikely. Okay. DR. KRESS: Now, suppose I have an accident that's projected to cause 1,000 rems. Is my definition of highly unlikely the same as for 100 rems? MR. SHERR: Theoretically, yes. DR. KRESS: That seems a little strange to me. DR. GARRICK: It's highly, highly unlikely. DR. KRESS: I would want some gradation in that. MR. DAMON: That's a good question which I don't think could be answered outside of a court of law. What I did in the Standard Review Plan when it came to that exact point is I said, because that upper category is open-ended on the upper side, that whatever guideline -- the guidelines that were developed in the Standard Review Plan as to how to judge whether something is highly unlikely or not were addressed to the typical type of accident that would occur in that group, but it made a warning statement that if your accident is substantially above or below this typical case, then the likelihood has to be scaled accordingly. DR. KRESS: So, you did deal with that as qualitative. MR. DAMON: Yes. So, I'm claiming, as a member of the staff, that I can interpret that in a flexible way. Now, whether that would stand up, you know, I don't know. DR. KRESS: That looks like a bit of a problem to me. MR. DAMON: It was considered early on as to whether to put additional categories, but then you get into the thing and you'd end up having some kind of complementary cumulative distribution in the rule or something. DR. KRESS: Well, I'm not so sure that's a bad idea. You know, that might be a good way to cover the whole spectrum. DR. APOSTOLAKIS: So, what's the third category? Is there a category -- we have highly unlikely, unlikely, and expected? MR. DAMON: What Ted was saying is, if you look at the criteria for unlikely, you could be below that. You could have an accident that produced effects below that, and that accident is not addressed by the requirement that's stated here, it's just left, and that's actually a risk-informed aspect of this. There was a deliberate decision that accidents at that level weren't worth the effort to impose a requirement to analyze. DR. APOSTOLAKIS: What is the dose limit for routine activities? Is that five? For workers. Okay. MR. DAMON: So, there is a window there. MR. MARKLEY: George, this is that special planned exposure category that I was telling you about this morning, 25 rem or better. MR. SHERR: The last overview point was just the fact that, when this rule goes into effect, the completion of the integrated safety analyses and implementation of the controls at the facilities to satisfy the performance requirements will need to be accomplished within four years of when the rule is published, which we're hoping will be late this summer. And with that, I'm going to let Dennis then go into the more detailed parts of the performance requirements. MR. DAMON: My name is Dennis Damon. As Ted mentioned, I, up until recently, worked for him in this area and other areas, and now I'm in the NMSS risk group, addressing risk-informed regulation. DR. GARRICK: We've heard of that. MR. DAMON: Again, the purpose of this -- doing an ISA is not the same as a risk assessment. It is a regulatory mechanism, as opposed to an attempt to assess what the risk is.
It's an attempt to induce the licensees to do systematic safety analysis, identify what they're relying on for safety, and make a determination that it's adequate, and it's based on the OSHA process hazard analysis concept that's been implemented in their domain and for which they have about 100,000 licensees which they require to do this type of analysis, and it's mostly qualitative. Occasionally, people will do what is in effect a PRA, and these -- so, the attempt to bring a risk structure into this ISA is to specifically ask that consequences and likelihood be addressed separately and in the manner that it's described here, and that is to ask that all the accidents identified in the ISA -- that the consequences be calculated and determine whether they are -- in which one of these categories they are, below unlikely, unlikely or -- these are also referred to as intermediate consequence and high consequence. So, the accidents that are identified are going to be determined to be in one of these categories by quantitative calculation, but then, likelihood -- we do not expect the current licensees to, in general, quantify likelihood when they take the next step, which is to determine whether it's highly unlikely or not. DR. APOSTOLAKIS: You do not expect them to quantify it. MR. DAMON: We do not. One of the reasons the word "PRA" wasn't used -- I mean it's used in the Standard Review Plan. There's a statement in there that it is one acceptable way of doing things, of meeting the rule, but the licensees vigorously resisted. DR. APOSTOLAKIS: Do you think that people will have a common understanding of what a highly unlikely sequence is? MR. DAMON: No, and that's why, in the Standard Review Plan, I attempted to provide some guidance to our staff reviewers and, indirectly, to the industry as to what we believe. DR. APOSTOLAKIS: I don't remember that. Can you give me some idea of where that guidance is? MR. DAMON: The Standard Review Plan is -- it's not here. Along with the rule, there's a Standard Review Plan for reviewing the license application or an ISA summary when it comes in, and in there, there's a chapter on ISA. Chapter three is on ISA. In that chapter, it will have -- it has acceptance criteria for likelihood. In other words, there will be a likelihood evaluation done by the licensees as a required element of the ISA. So, they will submit what they think highly unlikely means, and in the Standard Review Plan chapter, it says what we think it means. DR. APOSTOLAKIS: So, is there a place where it says clearly highly unlikely is this? MR. DAMON: If you look -- DR. APOSTOLAKIS: Yeah. MR. DAMON: -- in the slides -- DR. APOSTOLAKIS: Oh, yeah, you are quantitative, on 3.0-28. Highly unlikely, a frequency of less than 10 to the minus 5 per accident per year. So, with tears in your eyes, you're back to PRA. MR. DAMON: Yes. Yes, that's right. DR. APOSTOLAKIS: And then unlikely is a frequency of less than 10 to the minus 2 but more frequent than 10 to the minus 5 and not unlikely -- I guess that's likely -- is more frequent than 10 to the minus 2. So, you are becoming quantitative. MR. DAMON: Yes, right.
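The cutoffs just read from the view-graph map directly onto a simple classification. A minimal sketch in Python follows; only the two cutoff frequencies come from the SRP guidance quoted above, and the function name and category strings are illustrative.

    def likelihood_category(frequency_per_year):
        # Cutoffs per the SRP guidance discussed above, per accident sequence per year:
        #   highly unlikely : f < 1e-5
        #   unlikely        : 1e-5 <= f < 1e-2
        #   not unlikely    : f >= 1e-2
        if frequency_per_year < 1e-5:
            return "highly unlikely"
        if frequency_per_year < 1e-2:
            return "unlikely"
        return "not unlikely"

    # Example: a sequence estimated at 3e-4 per year falls in "unlikely".
    assert likelihood_category(3e-4) == "unlikely"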
There's quantitative guidance in there, and in fact, in the section -- the more recent version of the Standard Review Plan is slightly different from what's there, because it was recognized that the number of accident sequences that are identified in these ISAs is under the control of the analyst, he can partition his trees more finely or more coarsely, and so, to preclude playing games with what's the frequency per accident sequence, which is the way the rule is stated -- the rule is explicitly stated; it's that each event must be highly unlikely, and so, to preclude that game-playing, the guidance says that you divide by the total number of accidents in the entire industry to figure out what's an acceptable number. DR. APOSTOLAKIS: Divide by the total number of accidents? MR. DAMON: You can imagine all the different ways you could do this, and I thought of them, and there's no easy solution to the situation of proposing a risk goal unless you do it cumulatively, and then you have to be quantitative. DR. GARRICK: It seems that we are kind of playing games here. It seems, by the time you do a good job of answering the questions that you're asking in the rule, you have built the basis for a risk curve. DR. APOSTOLAKIS: That's correct. DR. GARRICK: You have your CCDF. All you need to do now is decide what the uncertainties are. But it's kind of strange that we're in this situation where we have to dance around PRA so much because of the stigma associated with it, or for some other reason, and call it other things. There's no way you're going to be able to convincingly analyze a chemical plant and answer these questions without essentially having the critical points on a CCDF, is there? MR. DAMON: Well, I tend to agree that, in many cases, you will look at a case that's being analyzed and you won't be able to decide, you know, whether it's highly unlikely or unlikely without doing something quantitative, but -- DR. GARRICK: I also get very nervous when we start talking about separating consequences from likelihood. Most times when we do that, we get in trouble, because we get consequences out there which people pick up as if they're not unlikely, and the one advantage of a risk form for the results is that you can't do that. The risk combines the two and forces consideration of them in combination. But anyway, we're a bit late in our commentary on this. But it does have some underlying problems that indicate that there's still a long way to go in the whole arena of risk communications before we can make the transition to a clear algorithm for becoming risk-informed. MR. DAMON: They do, in fact, use PRA on fuel cycle facilities in Europe. DR. GARRICK: I know. DR. APOSTOLAKIS: Where? DR. GARRICK: In Europe. DR. APOSTOLAKIS: No, even here. I was reviewing some PRAs that were done for DOE 10 or 15 years ago. DR. GARRICK: Yeah, DOE did a PRA on the ICPP, the chemical reprocessing plant. DR. APOSTOLAKIS: It was a PRA. DR. GARRICK: Yes, it was a bona fide PRA. DR. APOSTOLAKIS: John, you said we are late. What stage is this at now? You're sending it up when? MR. SHERR: We've gone through the proposed rule process, and the final rulemaking package is due to the Commission May 15th. DR. APOSTOLAKIS: You've been through the public comment period and everything? MR. SHERR: Yes. DR. APOSTOLAKIS: Wow. Why are we involved so late? We don't exist before. Did the ACNW have a chance to review it? DR. GARRICK: No. DR. APOSTOLAKIS: No? MR.
HORN: I suppose we all had a chance to review it if it was out for public comment, but we didn't do it. DR. APOSTOLAKIS: Well, that's something for us to discuss. DR. GARRICK: Yes, I think so. Carry on. MR. DAMON: This is one point that should be made. The jurisdiction of the NRC is restricted to certain things. Not all chemical accidents are within our jurisdiction to address in this rule, so it's only certain things. The ISA -- the rule defines this term, "item relied on for safety," and it's important to recognize why the different terminology was chosen. It's because most of the things relied on for safety in these plants are procedural. They're what we call administrative controls. There is hardware involved, but it's usually hardware operated by somebody. DR. GARRICK: But Dennis, is the NRC worried about anything other than radiation risk? MR. DAMON: Yes, chemical death or health effects due to chemicals. DR. GARRICK: I know they are, but I mean really. The foundation of the regulations -- MR. DAMON: In these plants, yes. Like Ted was saying, the only person that's been killed at an NRC-regulated plant in the fuels materials area was a chemical death. DR. GARRICK: Yes. MR. DAMON: And so, we are concerned about it for that reason. There are many different things that could go wrong and kill somebody in the plant. MR. MARKLEY: Even the radioactive materials are more chemically toxic than they are radioactively in a lot of cases. MR. DAMON: So, it is a real concern. We've hired chemical engineers and chemists and we're seriously concerned about chemical safety in these plants for the things within our jurisdiction. Of course, the other radiological hazard is having a criticality event, which is -- you know, in light of Tokai-Mura, that's a real thing, too. DR. APOSTOLAKIS: Do these ISAs exist now? Have they done them? MR. DAMON: Yes. There are six or seven fuel cycle facilities. DR. APOSTOLAKIS: They have done these ISAs? MR. DAMON: What happened was the staff attempted to get -- when it was determined that they wanted the licensees to do them, the staff tried to get the licensees to do it without a rule, and some agreed to do them and some point blank refused. So, some have done them. Only one of them has really done it to near completion, and that's BWXT, which is a naval fuel fabricator. DR. APOSTOLAKIS: And that will be submitted to you or has been submitted? MR. DAMON: It has been submitted, but it's been disguised. It was submitted as what's called a Chapter 15 of their license application, which is a description of their plan, but it actually is a summary of their ISA. DR. APOSTOLAKIS: I wonder whether a way around the problem we're facing here is to recommend to the Commission to issue a -- can they issue generic letters in this area and ask for identification of vulnerabilities and let the industry then discover by itself that they really need a PRA? MR. MARKLEY: That's what they did with 88-20, George. DR. APOSTOLAKIS: For reactors. MR. MARKLEY: Yeah. DR. APOSTOLAKIS: So, what I'm saying is would it make sense to recommend something like this here, because unless you do it, you will never be convinced that you need a PRA. So, you will ask them to identify vulnerabilities and let them do it any way they like, and eventually they will all do a mini-PRA. Then we will not try to derail this. DR. GARRICK: Just as a matter of curiosity, do we have a sense of the scope of the six or seven ISAs? Are these -- MR. DAMON: Yes. I mean I'm familiar with how much they've done. DR.
GARRICK: Are these 10 man-year studies or one man-year studies or what are they? MR. DAMON: I would say that the BWXT one is easily up in the 10 man-year category, and most of the rest of them will be approaching that. DR. APOSTOLAKIS: Is it because they are doing it for the first time? MR. DAMON: I think it's a couple of reasons. One of them is the number of processes for which you have to do analysis. Instead of a single machine, you know, reactor that's basically a fairly simple machine, they have -- like a typical plant might have 100 different processes, and each one of them is a unique piece of machinery that has its own design and own safety design. So, just the sheer number of things they have to analyze is one factor. The other one is that they -- you have to do it with a team -- a chemical expert, a criticality expert, a PRA-type expert. By the time you're done, you've got four or five people sitting there working on this simultaneously. So, it tends to be expensive. DR. APOSTOLAKIS: I wonder whether this subcommittee should actually spend a day on these things. DR. GARRICK: Well, what I was thinking, George, before we go too far on this, it might be very constructive for us to get a presentation on one or two of these ISAs -- DR. APOSTOLAKIS: That's what I mean. DR. GARRICK: -- a specific presentation of an ISA and to get a more in-depth sense of just the nature of the analysis and the depth of the analysis. That's something we may want to talk about. DR. APOSTOLAKIS: Yeah. MR. DAMON: I thought about that before I came here. I thought about bringing slides that showed some of the -- extracted from some of the existing ones, but they're all -- they're classified as proprietary information, so we'd have to address that somehow. DR. GARRICK: Yeah. DR. APOSTOLAKIS: Won't be the first time. DR. GARRICK: Yeah. Right. MR. DAMON: Yeah, I agree. DR. GARRICK: What about the naval facility? MR. DAMON: It's quite easy to find stuff that's not classified. It's just they'd have to pick ones that didn't have anything that they didn't want their competitors to know about, maybe. DR. GARRICK: Right. Well, I happen to know that a few pieces of the Sequoyah Fuels facility were analyzed on a probabilistic basis. So, there's pieces and parts of probabilistic fuel cycle analysis around. MR. DAMON: Right. For example, the NFS facility in Erwin, Tennessee, a few years ago -- they wanted a license amendment to have a process for down-blending highly-enriched uranium down to reactor levels of enrichment, and that's a hazardous operation, because normally the way you assure safety is to make the geometry of the piping and vessels small enough so that the high-enriched won't go critical. Well, they wanted to do it in a way where eventually they're going to get to a geometry that would be critical if they were high-enriched. So, it was such a touchy thing -- they did a quantitative PRA of that process design, but see, a lot of these processes, the safety design is so incredibly simple that even calling it an analysis is hard to do. For example, like at BWXT, they fabricate metal reactor cores for submarines, and the fabrication processes are working with a big machine shop, they work with big pieces of metal, and the typical way they assure criticality safety is simply to have a rack that holds only so many piece parts at a certain spacing, and that's what they work out of.
They take the piece out of the rack, they work with it, they put it back in the rack, and the rack is -- it would probably take, you know, 10 times that much or six times as much as in the rack to be capable of being critical if you took it out of the rack and assembled it into a form. So, it's through sheer safety margin and the way they work with things that they control criticality. As long as people that are working there follow the rules, they are very, very far from criticality, and so, you can see it's a lot of human reliability, is what it is. DR. GARRICK: I think one of the things that would make a lot of us feel a lot better -- and I've read this to some extent but not in great detail, because it's awfully thick -- I'm talking about the rule -- is that if there was more made of the fact that a PRA is an acceptable and established approach for carrying out the integrated safety analyses, the fact that that was totally, from what I read, excluded, I think, is a missed opportunity. MR. DAMON: Well, it certainly -- in the Standard Review Plan, it's quite clear that it's not only an acceptable way, it's -- in the area of identifying accidents, there are statements made that, for a complex process, fault trees should be used. I mean our experience in reviewing the parts of ISAs that have been submitted is that some of the licensees attempt to analyze a system that clearly calls for a fault tree by a more simplistic technique that is too vague and does not really explain what accidents can occur in the process, and so, we have made it -- tried to make it clear in the guidance that that's what's called for, and in fact, NUREG 1513, which is the ISA guidance document -- it has a flow chart in it for selecting methodologies that would drive one to choose a fault tree for an appropriately complex process. DR. APOSTOLAKIS: Do we have this NUREG? MR. SHERR: It's part of the proposed rule package. DR. GARRICK: The guidance document. MR. DAMON: ISA guidance. DR. APOSTOLAKIS: The NUREG is here? MR. SORENSON: It's in that package I sent you. MR. DAMON: So, there's two guidance documents. Actually, there's three, which I was going to get to. There's the ISA guidance document, which primarily addresses the overall architecture of an ISA and how you -- and all the different methods for identifying accidents, like fault trees, event trees, what-if analysis, different things. Then there's the Standard Review Plan, with an ISA chapter. That has acceptance criteria in it and suggested format for presenting results. The other one is NUREG/CR-6410, which is the accident analysis handbook, which is consequence evaluation methods for fuel cycle facilities, both chemical and radiological. DR. APOSTOLAKIS: There is something that is not clear to me. You mentioned that there is certain guidance in the Standard Review Plan, so at least, you know, the licensees will know where the stuff is coming from. Isn't that the job of a regulatory guide? The Standard Review Plan is for internal use. Is there a regulatory guide here? MR. MARKLEY: The Standard Review Plan is publicly available, George. DR. APOSTOLAKIS: It is publicly available. MR. MARKLEY: The NUREG is available for the licensees' use. DR. APOSTOLAKIS: How come there is no regulatory guide? MR.
SHERR: Historically, we have had a standard format and content guide as a companion document to the Standard Review Plan, and the decision was made at some point that, in many respects, those two documents are redundant and that the standard format and content guide doesn't provide as much detailed information. So, basically, the Standard Review Plan is also -- serves also as a standard format and content guide. DR. APOSTOLAKIS: Does NMSS issue regulatory guides? MR. SHERR: Yes. DR. APOSTOLAKIS: But in this case, the Standard Review Plan really plays that role. MR. SHERR: Yes. MR. DAMON: Yes. That was an explicit decision. I remember when they made it. They said, you know, we're going to make the -- in fact, it's just a general policy not to try to do this type of guidance with regulatory guides but to either do it -- it's either in the Standard Review Plan or it's a NUREG, one of those two. MR. SHERR: The Standard Review Plan has been in development for about as long as the rule has been in development, since '93, and it was published as part of the proposed rule, and in fact, we received more comments on the Standard Review Plan than we received on the rule itself. DR. APOSTOLAKIS: If it's been in development since 1993, how come it hasn't been reviewed internally by the ACNW? This is a rule. MR. MARKLEY: I can't tell you how many iterations it's gone through, George, and certainly I wasn't party to the ACNW deliberations. I don't know what has happened and what hasn't and when the opportunities were and weren't. You might ask some other players where that occurred. DR. GARRICK: Are you aware of anything, Rich? MR. MAJOR: I'm not aware of anything. The ACNW has not focused on fuel cycle facilities, especially fuel fabrication plants. DR. APOSTOLAKIS: I see. DR. GARRICK: Our charter has changed. Our charter certainly includes such facilities, and so, we're not maybe keeping up with due process here. DR. APOSTOLAKIS: Okay. DR. GARRICK: But I am struck by the fact that, when I look at SECY 99-147 and I look at the background paragraph that talks about how a near criticality incident, etcetera, etcetera, has prompted the NRC to evaluate its safety regulations for licensees that possess and process large quantities of special nuclear material and so on and so forth. The staff concluded that, to increase confidence in the margin of safety at a facility possessing this type and amount of material, a licensee should perform an integrated safety analysis, and then it goes on to say what an ISA is, and no reference really to risk assessment in the traditional sense. So, this obviously has a chemical heritage to it, and is quite separate from picking up on the NRC legacy of advancing towards a risk-informed regulatory practice that has been largely influenced by, to be sure, reactor applications and more recently, performance assessment applications of nuclear waste. So, this middle ground here of large fuel cycle facilities, fuel facilities and processing facilities and UF-6 conversion facilities has kind of been in a vacuum as far as getting lots of attention from either the ACRS on the one hand or the ACNW on the other hand, and given that it's rooted in the chemical field and that the chemical industry has been driving it for the most part, it's gone in a different direction, and I don't think it's any more complicated than that. But I think this has been very constructive. I think this has helped us focus a little bit on an area that maybe, you know, the advisory committees have -- DR.
APOSTOLAKIS: So, you are responding to an SRM dated December 1, '98. Is that really what this is? MR. SHERR: No, I think it was July 8, '99. The package you have is the proposed rule package. An SRM was issued as a result of that on, I think, July 8th. DR. APOSTOLAKIS: Of last year. But the whole history started a long time ago. MR. SHERR: It started in '91. DR. APOSTOLAKIS: So, an extra two years wouldn't hurt, would it? Okay. DR. GARRICK: All right. Let's go ahead, if we haven't completely disrupted. MR. DAMON: I'll go through this very fast, and then just stop me when you want to talk about something. This is what an ISA does. It uses systematic methods to identify hazards, namely just where in the plant are there things that are hazardous, because that's -- again, remembering the roots of this, the problem was they weren't addressing chemical safety at all at these plants. They did not have any kind of documented chemical safety analysis, controls, the NRC was not regulating it, and so, we're trying to get a documented safety analysis on the record here, first identify where are the hazardous chemicals and the radiological materials, then do what we'd be more familiar with, fault trees or whatever, identify actual specific accident sequences, and identify the consequences and the likelihood of those accident sequences, and specifically, here's another item that we're interested in, identifying items relied on for safety. It may sound sort of strange to make a statement like this, but if you go to a facility -- even if you look at the old-style documentation -- the nuclear criticality safety analyses, which they've done for years -- and you ask yourself what things in this process are they relying on for safety, very often, when I've gone and looked at them, they are not documented. They are relying on certain characteristics of the process which are not in the documentation, and without that characteristic -- in other words, you take the GE event. It was a solvent extraction process, and the output from the process went to a holding tank that was safe geometry, it would be sub-critical under any conditions, and the tank was relatively small, and what had happened is they increased the throughput in the plant such that the operator was having to empty that tank about every hour or so because it was too small and would get filled up from the process. Well, there's a big difference in the safety of that -- and so, what they didn't understand was that the safety of that process depends on the demand rate. The guy was under a higher and higher demand rate until finally they had a process upset -- there was a control valve malfunction in the process, and while that control valve was malfunctioning, he still had to transfer the contents of this tank every hour, and so, the process design -- they didn't realize it depended on the demand rate, and they needed to lower the demand rate by increasing the tank capacity, and that's what they did. After the fact, since '91, they've built a lot of very, very, very large safe geometry tanks. So, now they don't have to transfer every hour. They can sit there and wait for days until they do a transfer and make sure it's done right. So, they don't understand what they're relying on for safety, and it's the major thing that I think ISA will accomplish here. They will put it down in writing and they will send us that list of what they're relying on for safety. We've never had that before.
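The demand-rate dependence in the GE example just described can be made concrete with a small sketch. The numbers below are assumptions for illustration only, not data from the event; the point is that the frequency of challenges scales linearly with how often the operator must act.

    # Assumed, illustrative numbers; the frequency of unsafe transfers
    # scales with the demand rate placed on the operator.
    p_error_per_transfer = 1.0e-4             # assumed chance of mishandling one transfer

    transfers_per_year_small_tank = 24 * 365  # roughly one transfer per hour
    challenges_small_tank = transfers_per_year_small_tank * p_error_per_transfer
    # ~0.9 challenges per year

    transfers_per_year_large_tank = 365 / 3   # one transfer every few days
    challenges_large_tank = transfers_per_year_large_tank * p_error_per_transfer
    # ~0.01 challenges per year: the same process, made roughly 70 times
    # safer simply by increasing the tank capacity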
We have never had here in this agency a list of what they're relying on for safety in their plants. DR. APOSTOLAKIS: Which is really similar to what the Commission did when the power industry was resisting PRA. They asked them to find the vulnerabilities. Isn't that another way of putting it? And then the industry realized that the only way to do it was to use PRA. I mean it was a hugely successful program from that point of view. It spread the technology, really. I mean the individual IPEs -- some of them are really not very good, but you know, the learning process was tremendous, and that's maybe what we need here. MR. DAMON: Again, it's a -- like you say, it's a process of them learning what they don't know by doing it. DR. APOSTOLAKIS: Exactly. That's very useful background, yes. There is an interesting view-graph a little later that says likelihood -- is that before or after where you are -- likelihood evaluation, acceptance criteria. I don't know where we are now. MR. DAMON: There's three different things that -- when the staff receives an ISA summary to review, what we're going to be looking for in there is the completeness of identifying all the accidents, the correctness of the consequence evaluations, and the adequacy of the method they use and the criteria they use for judging that things are highly unlikely or unlikely. That guidance is in the Standard Review Plan chapter. The completeness is addressed by the -- for one thing, the methodologies. Do they use these methodologies that we've told them to use in selecting it with a flow chart in NUREG 1513 and have they applied it each time to each process in a correct manner, and of course, then, we'll have the staff review the results and see if they think that they've picked up on all the accidents. NUREG 1513, like I say, is primarily focused on identifying hazards and accidents. It's got methodologies. It's based on the AIChE red book on complying with the OSHA rule and has a flow-chart for selecting the correct process, and it has a long list and includes actually quantitative PRA as one method in there. DR. APOSTOLAKIS: Yeah, but you know, I've seen the what-if stuff. I mean it was presented to me as if it was a big deal. I think it depends very, very much on the hazards you are talking about. In some instances, you know that the hazards are not very large. Maybe a quick what-if analysis is good enough. It's qualitative. You ask people, knowledgeable people, what can happen here and there. But in light of what this agency has done to promote risk assessments and so on, this is really a trivial Mickey Mouse kind of thing. So, I am not sure that it should be one of the acceptable methods, and the recommendations that come out of it really are, again -- I mean if you're talking about a gas station or something like that and you want to avoid accidents, maybe it makes sense, but not for a facility that has nuclear materials in it, and the hazards -- you know, I mean they have been glorified by the chemical industry, and for us, it's a starting point of a PRA. I mean no PRA analyst will start doing PRAs without doing some form of hazard analysis first, you know, what if this fails, what's going to happen, let me understand the system, structuring the scenarios that we talked about earlier. MR. DAMON: Well, I think that the flow-chart in NUREG 1513 -- that's what it's intended to address. It's intended to prevent them from doing that, and I'm not sure it's going to be successful. We're going to have to arm-twist them into it.
We're going to have to force them. The staff is going to have to tell them you cannot do what-if on a complex process, you know. That's only appropriate for -- in fact, my own view is it's really just a front-end, it's a screening, brainstorming thing you do on the front-end, and then you go ahead and you do your fault trees and event trees. DR. APOSTOLAKIS: That's right. But this industry is dominated by chemical engineers, isn't it? MR. DAMON: Yeah, I would say mechanical and chemical. DR. APOSTOLAKIS: Yeah. MR. DAMON: There are criticality engineers who do the criticality safety, but they're primarily, you know, reactor physics calculators. They're not -- the discipline that is missing in these plants, in my own opinion, is reliability engineers. They do not have reliability engineering as a discipline in any of these plants. They don't know the subject. So, like I say, if you look at the rule, the bottom line is consequences have to be understood quantitatively. So, they do have to do calculations for that. There's a guidance document on it, this NUREG that was generated a few years ago. It's a summary of the current state of the art in methodologies for calculating consequences, both chemical and radiological. Really, when you look at what the consequence levels are, the intermediate and high levels defined in the rule, you only need a gross estimate in order to figure out where you're at, and then to know whether highly unlikely or unlikely is the appropriate category. One of the things about this is -- about these methods is a lot -- I would say most of the accident sequences will be nuclear criticalities for which consequence evaluation really doesn't need to be done. It's clearly going to be a high consequence event by definition in the sense that you can't preclude the fact that someone might be standing there and might get killed. Therefore, the process has to be protected at the level of highly unlikely. DR. APOSTOLAKIS: Now, let me understand what this means. The applicant defines -- you mean the applicant will assess the frequency, qualitatively perhaps, of sequences and declare them as unlikely or highly unlikely. The applicant will not define what is unlikely the way you do in the SRP. That's your job. MR. DAMON: Right. Well, the way we refer to it is he can establish a method which has criteria in it for what constitutes highly unlikely in his view, and the staff will then have acceptance criteria in the Standard Review Plan as to whether we accept those criteria. DR. APOSTOLAKIS: You mentioned in the SRP that highly unlikely means 10 to the minus 5 or less. Now, what if an applicant comes and argues that 10 to the minus 4 is still highly unlikely? Is that what this means, defines? MR. DAMON: Yes, it does. DR. APOSTOLAKIS: This is kind of unusual, isn't it? I mean this should be the job of the regulator, what is acceptable. MR. DAMON: Yes. That's what I say. The Standard Review Plan says what we regard as acceptable. So, there's no point to them coming to us and telling us 10 to the minus 4. We've already said -- DR. APOSTOLAKIS: So, the applicant really doesn't define it. DR. GARRICK: Likelihood has no relevance except when associated with a particular consequence. MR. MARKLEY: George, I don't think it's appropriately defining; it's categorizing. DR. APOSTOLAKIS: But that's my concern.
Does the word "define" mean categorize or actually define what's acceptable, what's unlikely, high unlikely, because if it's the second, I think it's the job of the NRC to do that, not of the applicant. In other words, the applicant should be assessing. In other words, they should be saying yes/no. MR. MARKLEY: I don't know that I agree with that, George, because I mean I still think the licensee has the obligation to determine it and the NRC the obligation to confirm or determine whether they would agree or not. DR. APOSTOLAKIS: But the applicant will never tell you what's acceptable. MR. DAMON: Let me clarify why you get in this dilemma. If we were doing it quantitatively, there would be no -- if we were going to insist that things be done quantitatively, there would be no question about, it would be very simple. DR. APOSTOLAKIS: Yes. MR. DAMON: But since what we anticipate is that they won't do it quantitatively, what they'll do is do it qualitatively, if you try and define a scheme by which they're going to categorize controls by their qualities, they're going to have to explain to us what this scheme is by which they're going to evaluate something and say, yes, this is highly unlikely because it has this characteristic, this characteristic, this characteristic, therefore it's highly unlikely. DR. APOSTOLAKIS: But that's an assessment, that's a categorization. MR. DAMON: Yeah, it's a categorization, right. DR. APOSTOLAKIS: That is okay. MR. DAMON: And we're going to ask them to do this categorization and then tell us which ones of these do you consider to be highly unlikely. It's a misleading term. The natural interpretation is it should be a number. So, they're going to develop some kind of method. I've divided the potential methods into three. They could do it quantitatively. He could do a PRA, and they could, therefore, define a likelihood by a frequency per year or some analogous quantitative measure. The other extreme is purely qualitative by the characteristics of whatever the process design is. The one in between is the BWXT method, the method they used, which one way of describing it is to say it's an order of magnitude, a quantitative, or you could say it's a qualitative method that assigns index values to likelihood. DR. GARRICK: They do a lot of this in the marine field, as well, particularly in the offshore. They use exactly the same language, and they have relied heavily on what they call indexing methods, and the UK rules that have come out with respect to safety case requirements for offshore platforms have adopted a kind of a similar set of descriptors and terms, and all of that does have its roots in the chemical industry. MR. DAMON: As I said, this index method is the method that BWXT used, and it's what we would like the others to use. It remains to be seen if they will do it. One of the problems has been, as I said, BWXT's submittal is proprietary. They, in fact, wouldn't even submit their methods document. So, I don't even know the scheme that they used, all I have is the results, and so, when they -- of course, that will change when the rule goes into effect, I can demand the methods document, but they were very tight-fisted about this, and fortunately, it seems to be getting away from that. They seem to be working through NEI, working together more, or maybe they're going to iron this out and they will come to use BWXT's method. Like I said, a quantitative method is not required, but it's permitted, and that's the way it's stated in the Standard Review Plan. 
This index method is the one that's actually been used. DR. GARRICK: It would be much better there if you said that it's encouraged, because saying permitted makes it come across as an inferior method to the other methods. MR. SHERR: It is sort of encouraged in the sense that what the Standard Review Plan identifies as an example is this semi-quantitative approach, and so, in that way, that's the only thing that we've identified so far. The industry has been saying they would like us to develop another example that's purely qualitative, and we've been struggling with that. DR. GARRICK: Well, what they're going to discover is the same thing that the NRC discovered when they started out with their IPE program thinking that -- and justifying it principally on the basis that it was less cost, and they have ended up with models that are at greater cost than the kind of full-scope PRAs that were being advocated in the late '70s and early '80s. They'll just make the same discovery eventually, that the best way to do this is, in fact, to build a quantitative model and to be creative about how they build that model in order to achieve the cost control that they want and that all of this other stuff will hopefully go by the wayside, because it's just a way of dancing around and avoiding hitting it on all fronts. NASA has gone through the same thing. They have resisted probabilistic methods from 1959, from the time that GE presented a calculation that found its way into Congress that the probability of getting a man to the moon and back was something like 5 percent, and they got embarrassed by it, and the then-administrator said we will never use probabilistic methods again. Well, they're using probabilistic methods, and they're using them increasingly extensively, and eventually they'll come around. There is now a risk model for the shuttle, and so, unfortunately, it seems to be the way that it has to go, that everybody has to satisfy themselves that they have another way, and I think that, until you answer the risk question, you know, you've not answered the margin question. MR. DAMON: I agree, and my own view -- this resistance to PRA is not the staff. It's not the NRC staff. It is the licensees. They insist, oh, you couldn't possibly do this stuff, and they just don't understand. Like I say, part of the problem is they don't have people that have ever done this, so they don't understand how they could do it. You know, if they had on their staff people who had done it, they would understand, yes, you could do it. DR. GARRICK: Yeah. So, it will come, and maybe this is just as well, that it has to come in this fashion. The marine industry is going through the same thing. When they finally got pushed to the wall, they did the Prince William Sound study, which was pretty close to being a full-scope probabilistic risk assessment, and they've learned a tremendous amount about risk from that study, much more than from all the rest of the studies they've done put together. So, you know, it seems to be a pattern that is unavoidable, and we're going through it now in the chemical field. MR. DAMON: I agree. What I see when I see the analyses, unless the system is extremely simple, what will happen when they use these non-quantitative methods is they will simply get the system model wrong.
In other words, they're not -- because they're not used to formulating what I would call a quantitative reliability model, they don't understand the equation, and if you don't write the equation down, you don't know what you're relying on for safety in the system. So, they get everything wrong. They don't succeed in identifying -- for one thing, when a redundant system -- they don't properly understand the virtue of limiting the down time by surveillance methods or having fail-safe equipment or self-announcing failure. You know, the idea that when -- if you have two -- a redundant system and you get the first failure, you've got to be aware that that system is in a vulnerable state so that you can render it safe. You've got to limit the down-time of that initial failure. They don't understand that, so they don't do anything about it in some cases, and in some cases they do, they've figured it out by just sort of experience, that, you know, you need to have something to recognize when the machine breaks, but they don't understand the concept, they don't understand the math behind it. DR. GARRICK: How are we doing? According to the agenda, we're due for a break in about 10 minutes. Are we within 10 minutes? MR. DAMON: Yes. DR. GARRICK: Okay. Good. MR. DAMON: The trouble with qualitative -- DR. GARRICK: We say this like it's your fault. DR. APOSTOLAKIS: Well, he has been so slow. DR. GARRICK: And we apologize for our interruptions, but we're learning a lot. MR. DAMON: The idea here is how is the staff going to judge when the applicants send in their version of a likelihood evaluation whether this thing all makes sense? My own view is, if you have any doubts, you're going to do your own quantitative analysis of whatever the situation is, but one way of looking at things is to categorize things by the various qualities that you rely on to achieve a high reliability, high availability type of system, and this is a list of the different factors that need to be evaluated, and if the applicant hasn't considered these things, they don't know how to put the whole thing together, then they're going to get the wrong -- an inadequate answer. For example, the thing I was mentioning about limiting down-time is what I'm calling here an availability measure. That's certainly something they need to address. They need to tell you in the analysis what surveillance they're doing on the equipment to detect that it's in a failed state -- functional test, monitoring, operator observation, or what, and of course, independence is related to, you know, things like common cause and diversity, and there's guidance in the Standard Review Plan telling the reviewer to look and see, have they looked at support system failures, you know, power supply to the system and stuff like that. If they haven't done that, then they probably haven't got a system that is going to meet the standard for highly unlikely. The reason the number 10 to the minus 5 and that quantitative stuff was put in there is we asked somebody -- we asked one of the applicants, one of the licensees, what they would consider to be highly unlikely for a single accident to occur, and they said, well, less than once in the life of a plant, and at that point, I realized they didn't understand the concept that we're talking about each accident here, that there's thousands of accidents, and that if it was once in the life of a plant times 1,000, they would be having them every month, you know, and they don't understand things in that way.
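The arithmetic behind that last exchange is worth writing out. A short sketch follows; the plant life is a nominal round number assumed for illustration, and the sequence count is the order of magnitude cited in the discussion above.

    plant_life_years = 40          # nominal plant life, assumed for illustration
    sequences_per_plant = 1000     # order of magnitude of accident sequences cited above

    # "Once in the life of the plant" taken as a per-sequence criterion:
    f_per_sequence = 1.0 / plant_life_years               # 0.025 per year
    f_plant_total = f_per_sequence * sequences_per_plant  # 25 accidents per year,
                                                          # i.e. roughly one every
                                                          # couple of weeks

    # Working backward from the SRP's 1e-5 per sequence per year instead:
    f_plant_total_srp = 1e-5 * sequences_per_plant        # ~0.01 per year plant-wide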
So, that's why that kind of guidance is in there, if they come in with a definition like that, but what we expect is these qualitative things. These graded management measures are all the things like QA, maintenance, configuration management. We expect them to specify what they're going to do to these items relied on for safety that's going to make them reliable and available and to commit to them in writing and submit that list of items relied on for safety that specifies all these qualities that they're going to maintain, and admittedly, this doesn't look very elegant compared to doing a risk assessment, but compared to where they were, you know, in 1991, this is a big step forward to just get them to list all the items relied on for safety and send it to us. This is the thing that they're accustomed to, and one of the reasons that they practice this concept of qualitative evaluation is it goes back to the fact that the only kind of safety analysis they did was criticality safety analysis, and in that field, the people that write the standards on that, many years ago, came up with the idea of a thing they call double-contingency, and there's the double-contingency statement. It's just redundancy. They're just saying don't rely on any single control, and they learned this the hard way. Back in the '50s and '60s they had a criticality accident about every two years until finally somebody said, you know, maybe we should upgrade the safety of these things, and they came up with this. DR. APOSTOLAKIS: Single failure criteria, right? MR. DAMON: Single failure criterion. So, this is what they have been working to for the last 30 years, and they haven't had a criticality in an NRC-regulated facility in the entire history of the agency and they haven't had a criticality in DOE since the mid-'70s, I believe. That was a shielded criticality. No one has died in a criticality since the '60s. So, there was a sudden improvement when they adopted this concept. So, what we're trying to do is get them to take one more step here, do a little -- DR. APOSTOLAKIS: Do you think it would be a good idea to ask everyone to try to identify plant vulnerabilities? MR. DAMON: That is part of what is specified. In fact, it's in the rule. They are required -- DR. APOSTOLAKIS: They will have to do it. MR. DAMON: They are required to identify vulnerabilities and to correct them. DR. APOSTOLAKIS: How much time do they have to do that? MR. SHERR: Four years. DR. APOSTOLAKIS: Four years. And they will submit the ISA to you for review? MR. SHERR: ISA summary, right. DR. APOSTOLAKIS: Summary. It's IPE all over again, guys. It's really all over again. It's IPE. Somebody must love it very much. MR. DAMON: Because of the number of processes involved, these analyses actually are quite voluminous. It takes about two bookshelves to three bookshelves full of stuff or more to cover one of these plants. It's quite bulky. So the ISA summary, we might get down to half a bookshelf. DR. GARRICK: While you were out, I was pointing out the IPE started out as a simple solution that was increased in scope with such things as external events and later with large early releases, et cetera, et cetera, until you clearly see that, yes, you could have done a very competent, full-scope PRA for what's been spent. DR. APOSTOLAKIS: We understand all this. DR. GARRICK: It's a little unfair for a George Apostolakis and John Garrick to be ganging up on you on this discipline. We clearly have our prejudices with respect to the use of PRA.
Is that it? MR. DENNIS: That's it. DR. GARRICK: That's very good. Well, we are interested in this and in seeing how it evolves, because it does have a very familiar ring to it in the IPE world. So I'm sure there's lots of lessons to be learned there. But I would sure be a lot happier if the rule were not so set against the use of PRA as a preferred option for complying with the integrated safety analysis. Very good. Any other -- DR. APOSTOLAKIS: You say that you do have one of the ISAs, the BWXT? MR. DENNIS: BWXT used to be B&W, Babcock & Wilcox, only they changed their name and they put an X in it, for some reason. DR. APOSTOLAKIS: It's like LAX. Do you think we can have a subcommittee meeting, a joint subcommittee meeting reasonably soon, where we can discuss the ISA and the details? Because what you gave us today is really a presentation that one would make before -- they will be invited, too. DR. GARRICK: But I think if we heard a presentation of an ISA of a specific facility, then we'd develop a much greater sense of what it's all about and also may be able to make some constructive ties with what is required to upgrade it to a PRA. STAFF: If this is a question directed at the staff, the answer is yes. DR. GARRICK: Yes. DR. APOSTOLAKIS: Would the industry be willing to come? STAFF: That I don't know. We would certainly work with them to see if we could get them to participate. DR. APOSTOLAKIS: If you ask them today, they will come. DR. GARRICK: Thank you very much. We're going to take a 15-minute break. [Recess.] DR. GARRICK: We'll come to order. We are now going to hear about why they sent us this big book on byproduct material risk analysis, in three volumes, and I think you ought to tell us about it. Would you introduce yourself and tell us what you do, et cetera, first? MS. ULLRICH: Sure. My name is Betsy Ullrich. I'm a Senior Health Physicist in Region I. I do both licensing and inspection of materials activities. A few years ago, I was asked to take part on the -- let's see, what was it called at the time -- nuclear materials byproduct risk review group, which we shortened to RRG, and you may see that in the handout occasionally. That's what I'm here to talk about today, the risk assessment that was done of byproduct material activities. There are three things that I would like everyone to remember about this study. This particular slide is not in your handout, but everything that's on this slide is. The first is that in this NUREG and in our report, you will see numbers, and the use of numbers implies or may imply that there is great accuracy. In fact, the numbers that resulted from the risk assessment for radiological risk have uncertainties that are on the order of -- orders of magnitude. So just because we have numbers doesn't mean great certainty in this case. DR. APOSTOLAKIS: Let me understand that. MS. ULLRICH: Yes, sir. DR. APOSTOLAKIS: The way I read the book is that there is variability over several orders of magnitude if I look at the numbers for one activity from facility to facility or maybe across activities. But if I specify one activity, is the uncertainty orders of magnitude? MS. ULLRICH: Yes, it is. DR. APOSTOLAKIS: And why is that? MS. ULLRICH: Because even among the systems, as we define them, there is great variability, and I will talk about that in the presentation. So if you want to hold that, I think we'll get to it.
The second point that I want to make is one that has confused many of the people in the materials community that I've presented some of this information to, and that is that our risk values have units of millirem per year. That's because when we defined risk, we said this is the consequence in terms of dose, millirem per year, multiplied by a probability of some event happening. The resulting units are millirem per year. People look at the tables, don't read the word risk, they see the millirem per year, and they say you don't get doses like these in materials activities. And so it's something that I have to stress to a number of people. DR. APOSTOLAKIS: So it's the expected dose. MS. ULLRICH: Yes. And the third is that we did not make any judgment in the study as to what is acceptable risk. That is something that probably needs to be done, but it was not done here. We simply assessed what we believed the risk values to be. These are the topics that I will move through, some more quickly than others, and certainly we can spend whatever time you want in the different areas: who we are, what the scope of the risk assessment included, what categories of radiological risk we assessed quantitatively, how we did the risk assessment or how the contractor did the risk assessment, since they did the real guts of that work, where our uncertainties are, how we used the consequence information, what evaluations the risk review group did of the NUREG/CR information, and what we conclude the results are. The risk review group had two goals: to identify and document a technical basis for a risk-informed approach for regulating byproduct materials -- that included all Part 30 activities, so those are activities covered by the regulations in Parts 30, 31, 32, 33, 34, 35, 36, and 39 -- and to develop a graded approach for regulating them using this risk information. We had five persons on the group: two health physicists, one from the NRC, that was myself, and one from the State of Colorado. We also had persons with experience in risk assessment, engineering and human factors. So that was the core group, and our job was to figure out how to cope with this project, to find a contractor to assist us with it, oversee what they did, work with them, and try to make some conclusions out of this as to how we can use this information for materials. DR. APOSTOLAKIS: Now, the risk assessment person was from the reactor arena? MS. ULLRICH: John Randall, actually. So yes. As I said before, Parts 30 through 36 and 39. Again, I have given this presentation to some other people, so I find myself riding my little hobby horse about how you multiply dose times the probability and you get units of dose, but it's a risk value in this study. We defined what we considered discrete systems for materials activities based on similar uses, quantities and forms. This is the obligatory slide that's difficult to read. This pretty much covers the range of materials activities. We did have it pointed out to us in the comment period that we missed a couple of items, like nuclear laundries and decontamination services, but I think overall, we've got a good scope there. As to why some of the uncertainties are still high: even with a system like fixed gauges, there's a wide range of gauges, there's a wide range of activities used in the gauges, how they're used, where they're used. So there's uncertainty when you come up with one number for that category. One correction to this, and that is system four.
It should be nuclear medicine, generator only, and this is because there are some hospitals that use generators, molybdenum/technetium generators, and more and more now do not. So if you wanted to assess the risk for a group that uses a generator, you would add systems four and five. So system four is generator only and then five is all the nuclear medicine involved with administering those dosages to patients. Scope of the risk assessment. For the radiological risk, we looked at what is the risk to workers, what is the risk to the public, what is the risk under normal conditions and off-normal conditions. There's actually another set of categories, which is risk to individuals and an industry-wide risk. It did not include doses to patients. That's real important for people in the medical community to understand. It's also important for the people in NRC to understand, because a lot of our regulations involve misadministration to patients. That was considered out of bounds, out of the scope of this project. So that's not included. We didn't look at transportation, covered under Part 71 or DOT, and we didn't look at developing future technologies. The real nice thing, and this will come up again later, is that along with the NUREG/CR, the contractor developed a database that could be used for revising systems or adding new systems, as we have more information about new technologies. DR. GARRICK: Having done this study now, I suspect you'd be able to reduce these 40 systems down to a much smaller number. MS. ULLRICH: I don't think so. DR. GARRICK: You don't think so? MS. ULLRICH: It was real difficult even coping with the variations within these systems. It was an extraordinary amount -- DR. GARRICK: Well, if you were to do it by risk, if you were to categorize it by risk, would you be able to sort out a smaller grouping? MS. ULLRICH: I certainly think that you could sort out systems that you wanted to look at in more detail and more rigorously. I think you could select out the systems with higher risks that you might want to find more information about and make a more certain assessment. DR. GARRICK: Well, if you wanted to dig deeper into those that were most important. MS. ULLRICH: Yes. DR. GARRICK: John, you seemed to have a reaction to the question. MR. RANDALL: Yes. I was just thinking. I didn't stay with the group long enough to get to that point, to get a chance to think about sorting the risk numbers. The group functioned for a while as a group and then suddenly we weren't a group anymore. It was just interesting. MS. ULLRICH: There was some attrition, yes. DR. GARRICK: I found the document to be very useful, and the partial answer to the question that this joint subcommittee asked early on is that you should tell us what you think is important from a risk perspective, and let's start from that point. You've kind of done that now for the byproduct side of the problem. MS. ULLRICH: I think so. These two bullets are really probably the meat of what we had assistance from the contractor to do: getting a quantitative and qualitative assessment of the radiological risk. We also attempted to get a qualitative evaluation of some of the other risks, and all of that information is summarized on the matrix handout that some of you may have picked up. So you will see at the bottom a list of some of the other issues we looked at: regulatory burden, risk of contamination, the cost of decontaminating, non-rad health risk.
Those we simply didn't have the time or other resources to pursue in any kind of quantitative way. So those were qualitative and based mainly on literature research, if I recall correctly. The matrix summarizes all of the numbers in all of the risk categories. These are the eight radiological risk categories that we looked at. The first four are all the individual risk categories. The second four are industry-wide risk. It's not truly a collective dose. It's looking at the population of the industry only, not the persons that would be affected by that industry, and I'm not really sure what to do with those numbers or how best to use that information. It's simply clear that if you look at industries that are large, such as portable gauges, having many thousands of them out in use, they rise to a higher level of risk, if you're comparing industry-wide risk, than they do in individual risk. Industries that are very small -- the use of pool irradiators, megacurie pool irradiators, you can probably count on a couple of hands and maybe a foot how many we have of those in the United States -- are a very small industry. So as an industry risk, they don't come near the top. So I'm not really sure how we should use that information, but we do have it at this point. The other thing is there's probably larger uncertainties with the industry risk, because now we're guessing how big the industries are, and that's also a difficult thing to do. The method that we ended up choosing for the radiological risk assessment is probably best described as a modified hazard-barrier-target analysis, where the hazard is the radioactive source. There are some barriers, which may be administrative or may be physical, which keep the hazard from reaching a target, who is either a worker or a member of the public. The other thing to remember with this study is that a member of the public could be a collocated worker, somebody who is not assigned to work with radioactive material, but may be at the next laboratory bench or may be the flag person on the construction site near where somebody is using a portable gauge. So that would constitute, in this study, a member of the public, not a worker. DR. GARRICK: That almost lines up with level one, two and three. In the old reactor days, we used to talk about the plant model, the containment model and the site model. MS. ULLRICH: Okay. DR. GARRICK: It's the same kind of breakdown. MS. ULLRICH: Yes. Well, what did we have to do? Well, we had to adequately describe the systems, characterize the systems in a way that made some sense, that would include the main range of radionuclides, the range of activities that were used in each system, and then look at what are the barriers: what kind of shielding is there, what kind of confinement is there, what kind of restrictions to access are there, what's required by regulation, what's a good practice. That took a while to develop, again, because the industries, even within the systems, vary, and people use radioisotopes for different things or in slightly different ways. So it's not terribly consistent from user to user, and it's less consistent as you go from sealed to unsealed. The unsealed users are all over the board with how they handle stuff. We developed scenarios, or sequences, of what kinds of things happen when people handle radionuclides and what can happen. Can they pick it up? Can they drop it? If they drop it, does it break? If they drop it, does it spill?
All those sorts of things went into the sequence development. Then those sequences were, in an effort to simplify, characterized as event trees, but only in terms of success or failure of shielding or confinement or access. The contractor did not attempt to do that for all the individual events that would comprise shielding or comprise confinement. DR. GARRICK: Betsy, in connection with the development of the scenarios, one of the things I was trying to figure out as I read the document is whether the database, such as the NMED or whatever it is, drove the structuring of the scenarios or whether the physical system and an engineering analysis thereof drove the development of the scenarios. MS. ULLRICH: Where there is a system that would be considered engineered, like a pool irradiator or radiography unit, that certainly drove it. I think work habits drove it, the process people go through in using it, looking at it from how do they get the material, how do they receive it, when do they handle it, what are they using it for, how do they store it. NMED did not drive that at all. DR. GARRICK: Okay. MS. ULLRICH: Okay. NMED was actually a very limited resource for us, in many ways. DR. GARRICK: I know that, and that's why if you were going to tell me that it was all NMED, then you were in for some real questioning. MS. ULLRICH: Okay. Now we start getting into some of the interesting stuff. Determining the frequency of sequences, that is where NMED did come in. Where we had incidents that were required to be reported and we had NMED information, some of that was used to develop probabilities for the frequency of those events happening. One of the sources of error in here is that we don't have a good handle on denominators. NMED would give you the numerators, the number of reported events that happened, but we don't always have good denominators for these numbers. NMED is the Nuclear Materials Events Database. DR. APOSTOLAKIS: NRC? MS. ULLRICH: Yes. DR. APOSTOLAKIS: The equivalent of LERs? MS. ULLRICH: Yes. DR. GARRICK: Yes, sort of, but a little different. It's a little more compact than the LERs. MS. ULLRICH: It also gathers information from the agreement states, as well, for the materials activities. Once a frequency was determined, we could also calculate the doses and then calculate the risk value by multiplying those. All that information is on computer disk. It's the byproduct material system risk database. It's got a file and a user's guide, and it is available to NMSS. They have a couple copies of that, and it contains all the information that is the basis for the dose calculations and the risk calculations. It wasn't a trivial effort. I don't know how these numbers compare to a PRA or to a reactor risk assessment, but to me these numbers were mind-boggling: fifty-six different nuclides, 518 tasks, 4,000 normal and off-normal sequences and over 27,000 individual calculations, and none of that is in the NUREG. DR. GARRICK: That's a small fraction of one PRA. MS. ULLRICH: Yes. To me, that's a big deal. It was a lot of effort, and I wasn't the one who had to do it. So kudos to the people who did it. DR. APOSTOLAKIS: Now, you're going to talk about the uncertainties at some point. MS. ULLRICH: Yes. DR. APOSTOLAKIS: In the assessment process, when you calculate the consequences, you say somewhere here that you were uncertain about the -- I made a note on it. The risk results are based on an average consequence for the conditions evaluated.
There can be significant variability of risk results around the average due to the variations from one user to another, and from one day to another. Would you care to explain? MS. ULLRICH: Sure. Let's take a hospital situation, since many people are familiar with that, and a hospital situation is actually more consistent than many of our other material systems that handle unsealed material. But on any given day, they may have six patients, they may have 16 patients. If they have 16 patients, they're going to be handling more activity than they would on a six-patient day. They may have iodine therapy just one day a week or one day a month. Iodine therapies drove the dose consequences in these risk analyses. So if you have a hospital that doesn't use iodines at all, their risk is going to be much lower than the hospitals that are frequent iodine users. DR. APOSTOLAKIS: So there is an uncertainty, then. MS. ULLRICH: Yes. DR. APOSTOLAKIS: Regarding the number of activities. MS. ULLRICH: Yes. DR. APOSTOLAKIS: Taking place within a period of time. MS. ULLRICH: Yes. DR. APOSTOLAKIS: And the reason why we're interested in that is because if you have N activities, then there is maybe a higher probability of human error or something like that, because people have to do many more things. MS. ULLRICH: Yes. DR. APOSTOLAKIS: This is not the typical uncertainty that you find in PRAs. MS. ULLRICH: No. DR. APOSTOLAKIS: This is different. This is, in the parlance of some of us, aleatory. The uncertainty regarding the frequency of the initiator, for example, that's different. That's epistemic. So I'm really wondering how you've coupled those things in your event trees. Do we have the event trees and I didn't look at them? MS. ULLRICH: No. DR. APOSTOLAKIS: Is it possible to get an example of a calculation that will walk me through this kind of thing? Because I'm really curious to see how you combined the two. It's something that is not done normally. MS. ULLRICH: I've got two victims in the back that I can pick on for a response to that. DR. APOSTOLAKIS: Yes. Who are they? Mr. Youngblood. MS. ULLRICH: Yes. MR. YOUNGBLOOD: The right choice is actually Mr. Schmidt. DR. GARRICK: You have to say who you are. MR. SCHMIDT: Bob Schmidt, with Scientech. I was pretty much the lead on the risk analysis. We did not do any uncertainty analysis. Everything was calculated basically as a point estimate of numbers to get the frequency and to get the consequences. There was no uncertainty analysis. All of the statements about uncertainty are judgments as to what the variability might be, what the uncertainty is -- there's all kinds of variation; I'm trying to avoid getting into trouble with the right terms. When you analyze a system, the variation in source strengths can be very large within a system. One user can use it one way, at one frequency. Another one can stand further away. You have to determine how far away from it he stands, how often does he drop it, does it open up, what are the airborne release fractions. There are so many parameters that we did not do an uncertainty analysis. It was just a point estimate, a best estimate that we could make based on the available information. Some of the data, if you look at the normal risk, that's pretty good, because we have data on normal risk. The radiographers get a risk or an exposure per year which is recorded and which is reported to NRC. On the other hand, for some people it's not reported.
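To make this concrete, here is a minimal sketch, in Python, of a single hazard-barrier-target sequence, first as the point estimate just described (frequency times probability times consequence, in millirem per year), then with the two uncertainty layers being distinguished in this exchange. Every number and distribution is hypothetical; this is not the contractor's actual model or data.

    import random

    random.seed(0)

    WORK_DAYS = 250
    PATIENTS_PER_DAY = (6, 16)   # aleatory: the day-to-day load varies
    P_DROP = 1e-4                # per handling, best estimate (poor denominator)
    P_BREACH = 0.2               # given a drop, confinement fails
    DOSE_MREM = 300.0            # worker dose from one breach (hypothetical)

    # Point estimate, as in the study: frequency x probability x consequence.
    mean_handlings = sum(PATIENTS_PER_DAY) / 2 * WORK_DAYS
    risk = mean_handlings * P_DROP * P_BREACH * DOSE_MREM
    print(f"point-estimate risk value: {risk:.1f} mrem/yr")

    # Two-layer version: the outer draw is epistemic (the true drop
    # probability is not well known), the inner draws are aleatory
    # (patient load and drop occurrences vary randomly year to year).
    annual = []
    for _ in range(5000):
        p_drop = random.lognormvariate(-9.2, 1.0)   # spread around ~1e-4
        handlings = sum(random.randint(*PATIENTS_PER_DAY) for _ in range(WORK_DAYS))
        breaches = sum(random.random() < p_drop * P_BREACH for _ in range(handlings))
        annual.append(breaches * DOSE_MREM)
    annual.sort()
    print(f"median year: {annual[2500]:.0f} mrem; 95th percentile: {annual[4750]:.0f} mrem")

In most sampled years the breach count is zero, which is the earlier warning about units in miniature: the risk value is an expected dose, not a dose anyone receives in a typical year.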
Accident risks are obviously -- where we knew there was an accident and we got a consequence for that accident, we'd kind of go look at what was reported in NMED -- are they reasonable or not -- but we didn't do any uncertainty analysis. DR. APOSTOLAKIS: Now, when you say you didn't do an uncertainty analysis, I guess you mean you didn't do an uncertainty analysis in the PRA tradition, where, again, you consider the uncertainty in the frequencies of the initiators, the unavailabilities of systems, and so on, the epistemic part. But since here you have aleatory uncertainty and it makes a difference in the model, I wonder how you would do it. I mean, you could still do it on a point estimate basis, but you still have to consider cases like do they have one patient or six. Is that in the event trees, the number of patients and the consequences? MR. SCHMIDT: There was an assumption there -- we gathered data and tried to determine the frequency of use; generally, it was kind of a maximum facility. It was a facility that did this activity, from talking, in a limited sense, to a few licensees, and we had restrictions there: how many times would they do it in kind of a maximum case in a day, how many times a year, what was the source strength, and that's the way it was combined. DR. APOSTOLAKIS: Increase in probability of human error because of the simultaneous or concurrent activities was not included. MR. SCHMIDT: No. We didn't get into that. If we had data on how many drops -- the issue about there being no denominator for getting a probability of a spill would kind of enter in and be taken out, because how many spills did you have in a year, we knew, but how many activities in a year, we didn't know, and it goes into the numerator and into the denominator. The numbers for the industry, though, are better, because we had all the -- if the reporting is accurate -- how many occurrences occurred through the whole industry; how many people are out there using them, that's an uncertainty. DR. GARRICK: But it does sound like, A, you considered the use of the facility, the frequency of the use, and, B, you considered the nature of incidents per use. MR. SCHMIDT: Yes. DR. GARRICK: And that frequency was probably influenced by the use frequency, was it not? That is to say, if it was extensively used, you might make an adjustment for there being more incidents per use than if it's used infrequently. Did you make any of those kinds of adjustments? MR. SCHMIDT: I can't recall any case where we made a judgment that a facility that did a lot of this work had a greater or a lower frequency of occurrences than one that did just a few. We did not get that sophisticated. We had enough trouble getting one number, much less a -- DR. GARRICK: But that would be one place where you would come close to an initiating event analog, if you considered those kinds of things. DR. APOSTOLAKIS: It still would be very illuminating to see a detailed analysis for two or three cases, different cases, if we can get that. MS. ULLRICH: We have the system 24 walk-through, which is probably about as close as that would get, Bob. DR. GARRICK: Okay. Go ahead. MS. ULLRICH: Okay. DR. GARRICK: I found this study very interesting, by the way.
It was very informative and illuminated one of the issues that is very important in the materials field, and that is the contribution to risk of operations versus accidents, and it rather dramatically indicates the great differences here between this problem and, say, a reactor problem, and the presentation of that, even though it was very qualitative and without uncertainty treatment, was very helpful in a risk communication sense. MS. ULLRICH: Okay. That's good. Since we've talked so much about uncertainties, I don't think we need to spend too much time on this, but I would say there are uncertainties, or perhaps variabilities or variations is the better term here, in what we know about the systems. DR. APOSTOLAKIS: This is a very important point, because there are two kinds here that appear to be very important. One is this randomness in the number of uses or other things, and then this other one is of the state-of-knowledge type, where you really don't know the numbers. In a PRA, when we say uncertainty analysis, we mean the second. MS. ULLRICH: Yes. DR. APOSTOLAKIS: Because everything else is frequencies, exponential distributions, straightforward. So I would really be curious to see how you guys handled this, or in the future perhaps how you will handle it. MS. ULLRICH: There are variabilities in the information we had about the systems and variabilities in the dose calculation. Probably the external dose calculation is of better certainty than internal dose, but there is still the choice of how far away the person is, how long they spend there, that sort of information. We used the ALIs from Part 20, which is ICRP-30 based. We've had people argue with me that that shouldn't be the way to do it. As far as internal dose goes, one of the conversations I had with Charlie Meinhold was how good are internal dose models, and he said, well, for tritium, you're probably good to a factor of two or three; for everything else, it's up in the air. So there are uncertainties in the dose modeling. Just because we can crank numbers in a computer doesn't make them particularly good numbers all the time. The third category is uncertainties in the frequency or probability, likelihood, whatever you want to call that term, and that's what you were talking about before: how often do things happen, how well do we know that. So those were the -- DR. GARRICK: You talk a lot about uncertainty, but you don't do anything about it. MS. ULLRICH: It was really -- I'm certain that we didn't know enough about it. There was nothing we could do rigorously for this. DR. GARRICK: I understand. MS. ULLRICH: There were too many unknowns, given the time we had. DR. APOSTOLAKIS: We should move on. MS. ULLRICH: Well, then, I'll do that. One of the things that I do want to emphasize is that we have the dose calculations first, the consequence, and that information, looking at where you get the big doses, was the information that was used to say what things we have to regulate and make sure are in place: what barriers have to be there with very high assurance, with high assurance or with moderate assurance. So when the NUREG document talks about where our regulatory options should be, what areas we should regulate, it's based on what things gave you a big dose. And that, I think, is important for people to understand if they're going to use this document. The risk numbers alone don't tell you everything. Where the doses are tells you what barriers need to be in place.
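A minimal sketch of the grading logic just described, assuming hypothetical dose thresholds and barrier examples; none of these values are taken from the NUREG.

    # The assurance demanded of a barrier is driven by the dose you would
    # see if it failed. Thresholds and barrier/dose pairs are hypothetical.

    def required_assurance(unmitigated_dose_mrem: float) -> str:
        if unmitigated_dose_mrem >= 100_000:   # potentially life-threatening
            return "very high assurance"
        if unmitigated_dose_mrem >= 5_000:
            return "high assurance"
        return "moderate assurance"

    barriers = [
        ("source encapsulation (sealed sources)", 500_000),
        ("prevention of loss or abandonment",      50_000),
        ("posting and access control",              1_000),
    ]
    for name, dose in barriers:
        print(f"{name}: {required_assurance(dose)}")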
Not a big surprise for the sealed sources: one of the barriers that has to be there with very high assurance is the integrity of the source encapsulation. That's something that's a function of the manufacturer, not the person who uses the source. It's got to be manufactured right to begin with, and then people can begin to abuse them and run them over with steamrollers and the other interesting things that happen with some of our sources. For most of the unsealed materials, the major barriers were prevention of loss or abandonment -- and that's a big one for the sealed sources, as well, as anybody who has been in any of the steel industry meetings will hear about -- and preventing accidents to the material, and both of those involve a lot of administrative barriers, as well as physical barriers. The risk review group did also do a survey of inspector and licensing personnel. We had hoped originally to get some information that would help the contractor with the risk assessments. It didn't get done quite in time, and they ended up not being able to use any of that information, but then I used it as a sort of test: were the opinions of inspectors and license reviewers about the safety of these different systems, and how frequently they have accidents and what kind of doses they get, in line with what the NUREG calculations showed? On the whole, it was pretty good. There were some glaring differences, though. These were the kinds of evaluations that were done with the material, and all of this is reflected in some of the tables you have, selected tables at the end of the handout. Let me deal with that first issue, and that is the issue of do you look at two significant figures, do you look at one significant figure, or, if the uncertainties are really on orders of magnitude, do you look at these numbers in powers of ten. I ended up choosing to use the one-significant-figure column, because it was the most comfortable one for me to use. The powers of ten made the range of categories so broad that I really didn't feel like I got a handle on it. So in all my other evaluations, I used the column of one significant figure. This is the risk to individual workers under normal conditions. This is the one in which you see the highest risk. The nice thing here is that you can compare this to real doses that people get in a typical year, because you would pretty much expect them to use material properly all the time and get those kinds of doses. We don't typically see doses of 3,900 or 2,000 rem in a year. Even for the radiographers, it's not terribly normal to be that high anymore. DR. GARRICK: Thirty millirem, right? MS. ULLRICH: Yes. I'm sorry. What's a factor of a thousand between friends, right? No. It's in millirem. I also ranked the systems just to see how they came out, and that has its interesting aspects, as well. Here, the numbers are simply the ranking from one to 47, because, as you see, we ended up breaking some of these systems into smaller portions. You had asked earlier if we could consolidate it down more, and what we found is we were breaking it into smaller pieces. It jumps out right away that we got a lot of low numbers here for radiography at field sites. That's overall a risky activity. But for some of the other systems, you can see it really jumps around, depending on what risk you're looking at. So I didn't see any point in coming down to a single number of risk for a system, because it hides where you might want to make some regulatory changes.
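A small illustration of the three presentation choices just mentioned, applied to one hypothetical risk value: two significant figures, one, or the nearest power of ten.

    import math

    def sig_figs(x: float, n: int) -> float:
        # Round x to n significant figures.
        return round(x, -int(math.floor(math.log10(abs(x)))) + (n - 1))

    def nearest_power_of_ten(x: float) -> float:
        return 10.0 ** round(math.log10(abs(x)))

    risk = 0.0437  # mrem/yr, hypothetical
    print(sig_figs(risk, 2), sig_figs(risk, 1), nearest_power_of_ten(risk))
    # -> 0.044 0.04 0.1  (the power-of-ten bins give the broadest categories)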
DR. GARRICK: I think that's excellent. I think that was a very wise decision. MS. ULLRICH: Let's see. One of the other evaluations we did was to look at how the risk of the systems stacked up against the way we inspect them. Materials is a little different from reactors in that we have categories, priorities of inspections. Priority one gets inspected once a year, priority three every three years, priority five every five years, and they're unannounced inspections. I'm not sure I really see a pattern in there. On the other hand, it's also clear that we have looked at the consequences in a lot of the systems in determining who needs a lot of attention. Radiographers, pharmacies, these people get looked at every year. DR. APOSTOLAKIS: Would it make sense now to inspect teletherapy every three years and nuclear pharmacy once a year, when you have 50 millirem versus 800? MS. ULLRICH: Well, that's something that needs to be looked at. It plays into how good a dose number that is. This is only in order by risk to the individual workers. It's not looking at any of the other risk categories here in this comparison. So there may be other things driving it, and one of the other things that I have to say is that the nuclear medicine people are just up in arms about how high that number is. It can't possibly be that high and we've got to go back and look at that again, and, in fact, that may be what we need to do. DR. APOSTOLAKIS: Just like the reactor people. MS. ULLRICH: Yes. It's interesting to look at. This is something that we need to check, and it may be a place to use some of this information. The numbers in parentheses are for broad-scope programs, which may have multiple of these activities. Under a broad-scope program, they would get inspected every year. What are the overall results? I don't believe that the risk value alone is sufficient to regulate by. There are some byproduct material systems, like the pool irradiators, that can produce lethal doses if people get past the barriers, and so you have to make sure that you've got regulations in place for those kinds of things. There may be other things that have to be considered, as well. But I think the risk value does have its place. One of the good things: we did not identify, in the consequences, in those doses, anything unexpected, anything that we didn't have regulations in place to handle already. Lots of general trends, and I think you've already talked about some of these: the risk is to the worker, the public risks are much lower, the accident risks are much lower. So I don't think we need to go over that in a lot of detail. To address that issue of comparing the risk value to the inspection priority, I probably graphed that three or four different ways just for the worker risk and really didn't see a particular pattern. But I'm not sure that it's the fault of either system right now. The inspection priorities are based on program types, rather than systems. The systems are different from the program types that we use in NMSS. It may indicate, where there's big differences, that some additional study needs to be made to see why what we think we need to inspect frequently is either not justified by the risk value or may need to be changed. And the most important thing, and this comes up with the strontium-90 eye applicator, which came in at the bottom of the risk list in just about every category: the risk assessment, as we said earlier, didn't include risk to patients.
The strontium-90 eye applicators really don't pose much harm to workers or the public, but there have been some whopping doses to patients when they were misused or, in one case, there was a series of misadministrations because they had the calibration incorrect and the decay curve incorrect for the use of that device, and that's something for the Part 35 people to think about, I guess. Systems with low risk values we can probably look at to consider for reduction in regulatory burden. I think these other points are discussed already. I think it's real important, and I'm kind of glad to hear some of the favorable comments on this report, because the materials community is picking it apart, the ones who do see it. It's a first attempt to do this. As far as we could tell, we couldn't find anything of this scope before. There has certainly been some risk assessment done of individual material systems; I think there's a PRA for the gamma knife that was done. But this scope is a new thing. I think it's got a good solid basis. It could be extended, if we want to extend it. It could be used for new systems, if we need to use it for new systems. It's a consistent way of comparing the different activities. So I was really pleased with the NUREG and the risk database that we received. Where do we go from here? I think some of the information from the NUREG, the risk database, the risk review report and the matrix summary can be used in developing the safety goals for the NMSS activities. It may be some resource information to help decide what pilot activities would be selected: what systems, what kinds of byproduct material activities do we want to look at for more rigorous risk assessments. It can be reference information to risk-inform changes we might need to make in regulations, current guidance and new rule-making. I believe we've already got a commitment for the licensing and inspection guidance documents, the NUREG-1556 series. Those start under revision this year, and those teams should be using this information as a reference to make those revisions. And I already talked about how I think that risk database is a useful tool. So that's what I have for you today. Are there any other questions at this point? DR. GARRICK: I'm sure there are. What do you see as the next move? MS. ULLRICH: Well, we're always working on something, and the NUREG-1556 -- I know those revisions are starting, so I would certainly like to see this involved with them, and I would personally hope that this information can be used by the folks working on the SECY-99-100 activities. DR. GARRICK: Have you been tracking at all the changes that have been made on the reactor side in NRC oversight and the way the maintenance rule and the IPE work have influenced the overhaul of NRC oversight for reactor inspection? MS. ULLRICH: I would say that I've seen some of the trickle-down effect from that. DR. GARRICK: Well, I see a similar opportunity here. You've touched on it, with trying to matrix the results with the inspection frequency and duration, I guess. But I would think this would be an important starting point for really examining the efficiency and effectiveness of inspection. Meanwhile, this analysis would also come under some increased examination and criticism, and that would be constructive, as well. So I hope that there is some consideration given to that. Hornberger? George? We can just encourage you to keep up the good work.
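A minimal sketch of the comparison being suggested here: sort the systems by worker risk value and flag mismatches with inspection priority. The systems and numbers are hypothetical stand-ins, except that the teletherapy/pharmacy contrast echoes the 50 versus 800 millirem figures discussed earlier.

    systems = [
        # (system, worker risk value in mrem/yr, inspection priority:
        #  1 = inspected yearly, 3 = every three years, 5 = every five)
        ("radiography at field sites", 900.0, 1),
        ("nuclear pharmacy",           800.0, 1),
        ("teletherapy",                 50.0, 1),
        ("portable gauges",            300.0, 5),
        ("fixed gauges",                 5.0, 5),
    ]

    for name, risk, priority in sorted(systems, key=lambda s: -s[1]):
        # Flag high risk inspected infrequently, or low risk inspected yearly.
        flag = ""
        if risk >= 300 and priority >= 3:
            flag = "  <-- risk suggests more frequent inspection?"
        elif risk < 100 and priority == 1:
            flag = "  <-- candidate for reduced burden?"
        print(f"{name:27s} {risk:7.1f} mrem/yr   priority {priority}{flag}")

Even this crude side-by-side makes the kind of mismatch visible that several graphs of the real data apparently did not resolve into a pattern.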
There is no question that in the details you can find problems with the analysis, but from a top-down perspective, it's a giant step forward in understanding what is going on in the byproduct arena. If we could do something similar for the other major categories of the materials side, and there is movement in that direction, then I think we would begin to see a risk perspective of the entire materials side of the business. The performance assessments are certainly doing that as far as the repository issue is concerned. The dry storage work is doing some of that as far as fuel storage is concerned. Then, of course, we heard about fuel fabrication facilities and fuel cycle facilities. So it does appear that there is movement, and I guess the only anxiety that some of us have is we would like to see more commonality, if you wish, of the methods, because there does seem to be -- DR. APOSTOLAKIS: And terminology. DR. GARRICK: Yes, and terminology. There does seem to be a determination on the part of each of these four or five groups to at least have a certain set of tools that are unique to that particular section, and I'm not sure that's necessary. But the progress that's been made is still very encouraging. Any questions from staff? Comments? Thanks, Betsy. That was very helpful. Well, what I think we ought to do now is spend a little time asking what we ought to do with what we've heard today. I suppose that, to be orderly about it, we ought to just go to the top of the agenda and work our way through. DR. APOSTOLAKIS: I propose we don't write a letter on the opening remarks. DR. GARRICK: I'm glad to hear that. All right. The first presentation, from Virgilio, was an overview, risk-informing NMSS activities. That kind of had a lot of pieces and parts that maybe we should discuss individually. But what are some of the thoughts of the committee about what we heard this morning? The committee is only three-fourths here. DR. APOSTOLAKIS: I think a general message is that things are happening in various activities, and that's good, but there is a need, as you just said, for coordination, perhaps common terminology, methods and so on. I think that's a general conclusion that deserves to be documented somewhere. There's certainly a lot of work that's going on. The only one -- if we decide to write a letter that gives an overview of everything we heard, I would limit it to a short letter. Now, the special nuclear materials, I think, deserves a separate meeting with all the details on the ISA and so on. Maybe we can just put that in this letter. And I would propose that this become an ACNW letter with enhanced membership, as we agreed earlier. That would take forever. The special nuclear materials is of concern to me because they are sending the rule up next week, and clearly we don't have time to have another meeting and everything. So I was wondering whether there is a mechanism for alerting the Commission to the fact that maybe more detailed comments would be forthcoming. DR. GARRICK: One of the things we've already observed, it was kind of interesting, that's come out of these joint committee meetings, is that the energies of the advisory committees have been, for the most part, consumed in two areas, reactors and high-level waste repositories. And in the meantime, there's a lot of other activities going on, and it doesn't seem that they have had the same kind of review and advice on the risk-informed movement that these two extremes -- DR. APOSTOLAKIS: And I would put that also in the general comments.
DR. GARRICK: Right, right. So that may make some suggestions about the planning and the activities of the committees. Have you got anything to say about that, John? MR. RANDALL: Yes. One of the things we discussed at the last ACNW meeting was the division of work in certain areas, like decommissioning. DR. GARRICK: Right. MR. RANDALL: And there are some other areas, like Parts 70, 71 and 72, which were originally included in an MOU. There is a draft MOU between the ACNW and the EDO, and the ACRS and the EDO, and it lays out areas of coverage for the particular committees. But there is some overlap, and there's some overlap in Part 70, and we need to handle those, I think, on a case-by-case basis. DR. APOSTOLAKIS: That's a different issue. What we're saying here is that there were some activities that were not reviewed by -- MR. RANDALL: I know. DR. GARRICK: Yes. MR. RANDALL: There's some history to that, and I don't want to go into all that right now. DR. GARRICK: Yes. Maybe it was by design, but anyway. We have some real -- DR. APOSTOLAKIS: We're looking for more work. DR. GARRICK: -- catching up to do. MR. RANDALL: There are some activities which are picking up in some areas in NMSS, and with fuel fabrication facilities, there's talk about handling an application for a MOX facility; at the same time, NRR will be handling the licensing and utilization of that material. So there will be a division of responsibilities, which is already outlined in this MOU, to look at some of these things. Of course, they're unbudgeted, so that's a problem I have to deal with. Now, on the sub-issue you raised: if you have concerns about the staff going forward with this Part 70 rulemaking activity, I think it's something that we need to let the staff know, either through the EDO or through the Commission. Usually, these things will go up to the EDO and then at some point get to the Commission, and the Commission may not act very quickly on this package. But we certainly can let the Commission know that the committee is interested in providing some comments. But I think the staff would probably like to get a heads-up on those types of comments, so that in case they want to make any changes in the document, they have an opportunity to do that. DR. APOSTOLAKIS: I think we should have another meeting just on the special nuclear materials. MR. RANDALL: Marty, do you have a schedule for the Commission? Is it on a tracking system? DR. VIRGILIO: Yes. The paper is going up to the Commission, as Ted said, within the next week or so. It's due up to the EDO, I believe, on Monday, and then the EDO gets a couple of days' review, and then it's due up to the Commission. The Commission meeting I think has been scheduled for the week of June 20, somewhere right around there. It's up to them how long it takes to decide on moving forward. One thing that we might want to consider is whether you can separate the concerns between the rule and the standard review plan, the two categories. One way to proceed -- and I think the rule is written at a fairly high level, and you might find that that's an acceptable approach; I believe it is, listening to the concerns that were raised today -- is to allow us to continue to work together on the standard review plan by working through examples. That's an approach that I think would be a good approach. If it works for you, you would have to look at the rule and make sure you see the rule the way I do, in light of the issues that we've discussed today. DR. GARRICK: That's a good point.
DR. APOSTOLAKIS: In any case, I don't think this committee -- do we need the reporter? DR. VIRGILIO: I don't think so, no. DR. GARRICK: No. [Whereupon, at 3:45 p.m., the meeting was concluded.]