
109th ACNW Meeting U.S. Nuclear Regulatory Commission, May 11, 1999

                       UNITED STATES OF AMERICA
                     NUCLEAR REGULATORY COMMISSION
                                  ***
                  ADVISORY COMMITTEE ON NUCLEAR WASTE
                                  ***
                 MEETING:  109th ADVISORY COMMITTEE ON
                         NUCLEAR WASTE (ACNW)
     
     
                        U.S. Nuclear Regulatory Commission
                        Two White Flint North
                        Room T-2B3
                        11545 Rockville Pike
                        Rockville, Maryland
     
                        Tuesday, May 11, 1999
     
         The committee met, pursuant to notice, at 3:50 p.m.
     
     MEMBERS PRESENT:
         B. JOHN GARRICK, Chairman, ACNW
         GEORGE HORNBERGER, Member, ACNW
         CHARLES FAIRHURST, Member, ACNW
         RAY WYMER, Member, ACNW
     STAFF PRESENT:
         ANDREW C. CAMPBELL, ACNW 
         LYNN DEERING, ACNW 
         HOWARD J. LARSON, ACNW 
         RICHARD K. MAJOR, ACNW 
         JOHN SORENSEN, ACNW Fellow
     
     PARTICIPANTS:
         CHRISTIANA H. LUI, NMSS
          KEITH MCCONNELL, NMSS
      
                          P R O C E E D I N G S
                                                       [3:50 p.m.]
         CHAIRMAN GARRICK:  The meeting will now come to order.
         This is the first day of the 109th meeting of the Advisory
     Committee on Nuclear Waste.  My name is John Garrick, Chairman of the
     ACNW.
         Other members of the Committee include George Hornberger,
     Ray Wymer, and Charles Fairhurst.
         The entire meeting is open to the public.
          During today's meeting the committee has already met with
      the ACRS-ACNW Working Group on Risk-Informed Regulation, where we
      discussed a framework for risk-informed regulation in NRC's Office of
      Nuclear Material Safety and Safeguards, and we have also had some
      prior discussion on committee activities and future agenda items.
         What we want to do now is listen to a description by the
     Staff of the strategy for converting the issue resolution status reports
     for the proposed high level repository at Yucca Mountain into a review
     plan for the repository license application.
         Howard Larson is the Designated Federal Official for today's
     session.
         We are conducting the meeting in accordance with the
     provisions of the Federal Advisory Committee Act.
     We have received no written statements or requests to make oral
     statements from members of the public regarding today's session.
         Should anyone wish to address the Committee, please make
     your wishes known to one of the Committee Staff.  As usual, it is
     requested that each speaker use one of the microphones, identify
     themselves and speak with clarity and volume.
          Before proceeding with the first agenda item, there are a few
     items of current interest that we want to mention.  A couple of Staff
     issues -- Michelle Kelton and Ethel Barnard of the ACNW-ACRS Office
     received an Achievement Award for their contribution to the Y2K
     application renovation efforts at a March 12, 1999 awards ceremony.  We
     want to thank them for their efforts, and the others in the agency.
         We are told now that our computers are ready and that we may
     not have to unplug them after all.
         Mary Thomas's six-month rotational assignment has recently
     ended, and she has returned to the Office of Nuclear Regulatory
     Research.  Mary was a major player in our recent working group session
     on the effects of low-level ionizing radiation.
         The Nuclear Waste Technical Review Board issued its report
     on the viability assessment.  "Moving Beyond the Yucca Mountain
     Viability Assessment" was the title of the report.  The Board notes
     that, quote, "So far it has not identified any features or processes
     that would automatically disqualify the site but that DOE should give
     serious attention to alternatives to the VA reference design including
     changing from a high temperature design to a ventilated low temperature
     design below the boiling point of water."
          The Board also notes that DOE's plan to determine the
      suitability of the proposed repository by 2001 is, quote, "very
      ambitious and much work remains to be done."
         The House Commerce Committee approved the Nuclear Waste
     bill, H.R. 45, that will provide for interim storage of spent commercial
     power reactor fuel at Yucca Mountain, Nevada.  The bill passed on a 39
     to 6 vote and now moves to the House floor.
         In an April 16 order, a Federal Judge sided with the utility
     low-level waste generators and site developer, U.S. Ecology, in their
     lawsuit claiming political bias caused Nebraska regulators to deny a
     license for a disposal facility last year.  The Judge noted in the order
     that, quote, "There is good reason to think that a license denial was
      politically preordained."  The utilities, U.S. Ecology, and the Central
     Interstate Low Level Waste Commission sued the state and its regulators
     last year, blaming politics for delays in the licensing process.  The
     licensing denial will be repealed, and in that connection --
         DR. HORNBERGER:  Appealed.
         DR. GARRICK:  I'm sorry, appealed -- appealed, yes.  Thank
     you.
         In connection with that, Nebraska's legislature has approved
     L.B. 530, a bill to remove the state from the Central Interstate Low
     Level Radioactive Waste Compact.  On May 6th the bill passed the third
     and final round of debate by a vote of 33 to 11 with 5 Senators excused
     and not voting, and it now proceeds to the Governor.  The Governor is
     expected to sign it into law, and if enacted, the legislation will take
     effect on August 29th.  As provided in the legislation, the Governor may
      then write to the Governors of the compact's other member states to
     notify them of Nebraska's withdrawal.
         Of course, under the terms of the compact agreement
     withdrawals generally do not take effect until five years from the date
     of such notification.
         One other item of interest, maybe two others.  County
     Commissioners in the Las Vegas area have made it very clear that they
     plan to fight the transportation routes chosen for moving radioactive
     waste through the Las Vegas area from DOE's Fernald site in Ohio.  This
     is of interest given the discussions we had earlier in the day on
     transportation.
         One other perhaps note of interest is that New Mexico
      Attorney General Patricia Madrid late last month pulled out of
     litigation challenging the Environmental Protection Agency certification
     of the Waste Isolation Pilot Plant's long-term disposal standards.  The
     standards are aimed at protecting public health and safety for 10,000
     years.  Madrid's motion to withdraw from the case was granted Wednesday
     by the U.S. Court of Appeals for the District of Columbia, which also
     decided to terminate oral arguments between the State of New Mexico and
     EPA that were scheduled for Thursday.  I think this is -- I guess this
     is dated May 10th, so I guess they are talking about this week.
         I think that is all the items of interest that we want to
     cover.
         Now we are going to turn to the Staff for a discussion of
     Yucca Mountain review plan and I guess Keith McConnell is going to kick
     it off and then introduce Christiana.
         DR. McCONNELL:  Thank you, Dr. Garrick, members of the
     committee.
         We are here today for the first of several interactions we
     intend to have with the committee as we develop the Yucca Mountain
     Review Plan.  Today's briefing is basically at the concept level,
     defining how we intend to approach the development of the Yucca Mountain
     Review Plan.  It doesn't get into specific details, although there is an
     example that Christiana has as backup.
         As we get into the details of specific issues such as
     defense-in-depth, how we intend to implement defense-in-depth in the
     Yucca Mountain Review Plan, we will be back to the committee as well as
     other details as we go along in this process and part of Christiana's
     presentation is a schedule, at least a proposed schedule, for all of the
     elements of the work.
         In fact, I think we are right now scheduled to talk to you
     briefly about defense-in-depth at the June meeting you intend to have in
     San Antonio.
          So with that I will turn it over to Christiana Lui, who is
     the lead for the development of the Yucca Mountain Review Plan.
         MS. LUI:  Thanks, Keith.  The title of my presentation today
     is "The Framework for the Yucca Mountain Review Plan."  This is a work
     in progress and we thought that as we proceed along the development of
     the review plan, we will come back to the committee to talk to you as we
      develop more and more details.  Today is part one of the series.
          I am Christiana Lui, and I work for Keith McConnell in the
     High Level Waste and Performance Assessment Branch in the Division of
     Waste Management in the Office of NMSS.
          Basically, there will be pretty much three parts to my
      presentation today.  The first four bullets cover part one, in which
      I will give you a brief introduction to the framework for the Yucca
      Mountain Review Plan before we jump into the more detailed part of
      how we are intending to integrate the material we have published in
      the IRSRs into the Yucca Mountain Review Plan.
          And in the last part of the presentation, I will conclude with
     the advantages of the approach that we intend to implement and, also,
     like Keith has mentioned, a schedule for the proposed work.
          There are basically four very high level principles before
      we jump into the framework.  The staff is principally responsible for
      defending the conclusions of our review of any potential license
      application for the Yucca Mountain site, and DOE, who is the licensee,
      is responsible for making sure that an adequate case is made in the
      license application.
          We have recently published a proposed performance-based,
      site-specific rule, Part 63, which is now out for public comment.
      Basically, we thought that a performance-based, site-specific rule
      should be accompanied by a performance-based, site-specific review
      plan, and the focus of this review plan is the NRC staff's evaluation
      of DOE's safety case, including how the site characterization and
      experimental work has been conducted to support that case.  You are
      going to see later on, in more detail, what we mean by a
      performance-based and site-specific approach, which is basically a
      top-down approach to include all the work that has been conducted.
         The strategy for licensing the Yucca Mountain site has been
     published in SECY-97-300.  That was the strategy paper prior to the
     staff's work on Part 63, and in that particular SECY paper, the staff
     talks about how we intend to develop the Yucca Mountain Review Plan and
     the work here basically reflects that particular strategy.
          And the last point is that the review should really be done
      in an integrated fashion, and that integration should take place at
      the technical staff level.  Again, this is speaking to a top-down
      approach, recognizing that there will be a natural tension between
      implementation of a performance-based rule and the need to prepare and
      guide the staff in performing the review and writing the final Safety
      Evaluation Report.
         The Yucca Mountain Review Plan should be formulated based on
     the staff's current understanding of DOE's approach and all the
     iterative performance assessment work that the staff has been doing in
     the past decade to build up our own capability.
          Given that, the framework should be sufficiently flexible
      to accommodate changes in DOE's approach.  As in all work under a
      performance-based approach, the licensee is given the flexibility in
      designing how they will address their safety case, and the staff
      should be prepared to be flexible in accommodating whatever approach
      DOE has decided to use in their license application.
          Next, I will turn quickly to the features of the review
      plan.  As in all the other review plans, there are five standard
      components.  The areas of review basically describe the scope of the
      review, in other words, what is being reviewed.
          Acceptance criteria delineate the criteria that can be
      applied by the reviewer to determine the acceptability of the
      compliance demonstration.
         Review procedure discusses the appropriate review technique
     to determine whether the acceptance criteria have been met.
          Evaluation findings basically present the general
      conclusions and findings resulting from the staff's review, and that
      will be the material that we will make into the Safety Evaluation
      Report.
          Finally, references will list any applicable references
      that the staff has used in its review and the review plan.
          The next part is a very high level outline of what the
      final Yucca Mountain Review Plan will most likely look like:
      abstract, executive summary, and introduction.  There are basically
      three elements in the introduction that we would like to cover.
         Part I is basically the principles in formulating this
     performance-based review plan that will give the background that I have
     presented to you at the beginning of this talk.
          And the second part is the structure and progression of the
      NRC high level waste program.  Basically, this will describe how we
      have used the KTIs for pre-licensing consultation and issue
      resolution and how we are transitioning from the KTI approach into an
      integrated team approach for reviewing the license application.
         And the third part, we will attempt to provide a clear
     relationship between how the Yucca Mountain Review -- the Yucca Mountain
     license application is going to be reviewed and in what context the
     requirements under paragraph 63.21, that is the content of license
     application, are to be reviewed.
         Chapter 1, review of general information, which is basically
     a requirement in the proposed Part 63, paragraph 63.21(b).
         What I am going to spend more time in the subsequent part of
     this presentation is really to talk to you about Chapter 2, which is the
     Safety Analysis Report.  And there are three components to that Safety
     Analysis Report Review, the pre-closure part, the post-closure part and
     the administrative and programmatic requirements.
          The IRSR work has been done mostly on the post-closure
      part, which is basically Chapter II.B on this outline.  We may come
      back to
     this outline later on as we go on with this particular presentation.
          The next three pages basically give you more information
      about what Chapters II.A, II.B and II.C will look like.  Chapters
      II.A and II.B
     are organized by performance objectives and the associated technical
     criteria.
          Under areas of review for II.A are the compliance
      demonstration to meet the pre-closure performance objectives and the
      requirements for an ISA in Subpart F, and the review chapters are
      basically formulated based on the pre-closure performance objectives.
         Basically, we have required DOE to use an integrated safety
     analysis to demonstrate the pre-closure safety during operation and for
     design basis events categories 1 and 2.
         Another performance objective for pre-closure is
     retrievability plan and alternate storage.
         And the last objective for pre-closure performance -- for
     pre-closure portion is the performance confirmation program.
         For each of the review chapters, we will clearly identify
     what part of the license application is to be reviewed.  For example,
     taking Chapter II.A.2, the retrievability plan and alternate storage,
     the content of Yucca Mountain license application to be reviewed will be
     in paragraph 63.21(c)(19), it is the retrieval and alternate storage
     plans.
         And evaluation findings basically is to conclude, after
     reviewing all the parts in the license application as they have been
     specified here, whether DOE has met the pre-closure performance
     objectives and the technical requirements associated with it.  If they
     have, then the staff's -- the conclusion that will go into the Safety
     Evaluation Report is that DOE has successfully demonstrated they have
     met the pre-closure performance objective.
          Because of the parallelism that has been built into the
      proposed Part 63, the pre-closure and post-closure have similar
      approaches.  On the next page, page 6, II.B, repository safety after
      permanent closure, again, the areas of review here are the compliance
      demonstration to meet the post-closure performance objectives, which
      are delineated in paragraph 63.113, and the technical requirements for
      doing a performance assessment, the technical requirements for the
      critical group and, again, the performance confirmation program.
          And the review chapters are performance assessment and
      performance confirmation, because we fully expect DOE to use
      performance assessment to demonstrate that they have met the
      post-closure performance objectives, namely, that the mean peak dose
      will not be greater than 25 millirem per year, and also the multiple
      barrier requirement.  The parts of the license application that will
      be reviewed to determine if DOE has successfully demonstrated
      post-closure performance in the performance assessment are identified
      under Chapter II.B.1.
          We will review the site description, the materials that
      have been used in construction, especially for the underground
      tunnels, the EBS design,
     of course, performance assessment itself, the stylized human intrusion
     analysis, use of expert elicitations, and there are probably also other
     parts that I have not listed here.  But as we work on the development of
     this review plan, all the different parts and detail will be fleshed
     out.
          Again, the evaluation findings are to determine if DOE has
     successfully demonstrated they have met the post-closure objectives
     outlined in paragraph 63.113 and met the technical requirements in
     63.114 and 63.115, and also the performance confirmation requirement.
         Administrative and programmatic requirements.  Right now,
     these are -- these contain Subpart D, that is the records, reports,
     tests and inspections; Subpart G, quality assurance; and Subpart H,
     training and certification of personnel.  For each of those subparts,
      there will be a review chapter associated with it, and the evaluation
      findings are to determine if DOE has demonstrated they have met the
     requirements under these subparts.
          One thing that I would like to mention is that existing
      guidance, such as Regulatory Guides, NUREGs and the other Standard
      Review Plans, will be used or modified to the extent applicable as we
      develop the Yucca Mountain Review Plan.  In other words, if there is
      no need to reinvent the wheel, we will not do that, especially for
      the administrative and programmatic requirements and also some of the
      preclosure portions.
          Pages 8 and 9 basically give you a preliminary idea of
      what part of the license application is going to be reviewed where.
     Since each of the chapters will contain the acceptance criteria and
     review methods, these will clearly be for guidance to the Staff and
     indirectly provide information for the DOE on what we expect to see, how
     we are going to determine what is acceptable or what is not acceptable.
         This particular linkage may change as we finalize Part 63
     and we further develop the Yucca Mountain Review Plan.
          One thing I would like to mention is that as we are going
      through this more or less systematic process we found that some of
      the entries currently under 63.21 will probably be modified.  For
      example, the use of expert elicitation.  Part 63 right now only
      requires DOE to supply information on its use in the post-closure
      portion, and we know that DOE is using or is planning on using expert
      elicitation for some of their design work for the preclosure part,
      such as probabilistic seismic hazard analysis, so we will modify
      63.21 to require DOE to supply information on expert elicitation use
      for both the pre-closure and post-closure parts.
          There may be some redundancy in 63.21 and we will most
      likely consolidate some of the requirements without changing the
      essence of what is currently under 63.21.  Some of the requirements
      under 63.21 are really technical requirements rather than
      content-related requirements, so those technical requirements will be
      moved to the appropriate technical requirement portion under Subpart
      E, leaving 63.21 strictly content.
         Finally, the sequence may be rearranged to reflect a more
     logical structure in the final rule.
         This basically concludes Part 1 of the presentation on the
     framework of the Yucca Mountain Review Plan.  Now I would like to
     provide you more information on how we are approaching it in terms of
     integrating the IRSRs into the Yucca Mountain Review Plan.  You may want
     to keep in mind that for the post-closure part, the outline is on page
     6.
         DR. GARRICK:  Maybe the committee would like to ask some
     questions before you make the transition.
         MS. LUI:  Okay.
          DR. GARRICK:  One of the things that I was trying to track
      here, in the framework for the Review Plan, was how you are going to
      use some of your tools and what those tools are, and this table kind
      of gets into it, but it seems to have apples and oranges, as you kind
      of explained.
          It has methods of analysis and it has physical features.
         MS. LUI:  Right.
         DR. GARRICK:  For example, the site is a physical feature
     and the performance assessment and integrated safety analysis are
     methods of analysis.  I guess site description could be considered.  I
     was just having a little trouble trying to figure out what you were
     attempting to do here.
         MS. LUI:  Okay.  I believe we are on Slide Number 8, is that
     correct?
         DR. GARRICK:  Yes.
         MS. LUI:  Okay.  The left-hand side of the first column is
     really the very abbreviated description of what is currently in
     paragraph 63.21, the content of license application.
          Again, what is currently under 63.21 is more or less a flat
      structure, so yes, you are correct, Dr. Garrick.  For performance
      assessment, in order to really evaluate a complete performance
      assessment there are a lot of other pieces that will need to be used
      to support that performance assessment, but they are currently laid
      out in a linear structure rather than in a hierarchical structure.
         One of the possible things that might happen when we go
     towards finalizing Part 63 is to organize the content of application in
     a more logical fashion that would correspond to, more or less correspond
     to our review process.
          DR. GARRICK:  Okay.  The other thing I was looking for was
      how your TPA, the NRC's TPA, really enters into the framework, into
      the review process.  You do talk about performance assessment with
      respect to permanent closure but again I was trying to figure out how
      the tools you have are going to be employed in the review process.
         MS. LUI:  Okay.  You will hear more on that during Part 2 of
     this presentation.
         DR. GARRICK:  Oh, okay.
          MS. LUI:  But one of the immediate ways we have used the
      TPA code is to help us focus on what we need to focus on in terms of
      reviewing the performance assessment.
          Basically I think I mentioned this when I talked about the
      principles, that the Staff's IPA effort has been used in formulating
      this particular performance-based approach, but if you are looking
      for a specific use of the TPA code, we'll talk about it a little bit
      later.
         DR. GARRICK:  Okay, well, yes.  I think sooner or later we
      want to learn a little more about the Integrated Safety Analysis
     process too.  I don't know whether we are going to get into that here or
     not but that has come up in other issues that we have discussed today
     and we have got input coming on that, so we can assume we will learn
     more about that later.
          DR. McCONNELL:  Yes.  The preclosure -- this is Keith
      McConnell -- the preclosure safety case review plan is lagging behind
      the postclosure because we have over the past couple of years focused
      on the postclosure capability within the Staff, and also that is the
      focus of the KTIs, but we are now developing our concept towards
      developing an ISA and what would be in a review plan that we would
      expect for DOE, what we would use to gauge an acceptable ISA that DOE
      would submit, so I guess what I am trying to say is we're a little
      bit earlier in the process with that part of the Review Plan than we
      are with the postclosure.
         DR. GARRICK:  Right.
         DR. McCONNELL:  So this briefing is just going to focus on
     the postclosure.
         DR. GARRICK:  Okay.  Any other comments?
         [No response.]
          MS. LUI:  Okay.  Now I'll move on to the second part, to
      focus on how we are integrating all the IRSRs into the Review Plan.
      Again, just to reiterate the strategy, SECY-97-300 describes the
      staff's strategy in developing the proposed Part 63 and the Yucca
      Mountain Review Plan.
         You will find on your next page, that's page 11, the
     flowdown diagram that we have included with the strategy paper.  And
     integration of all the IRSRs for the postclosure work is really taking
     place at the third tier of this flowdown diagram.
         Up until this point the total system performance assessment
     integration IRSR uses this particular framework that you see on page 11
     here to basically set up the IRSR material, and all the KTIs have done
     the crosswalk to the lowest tier of this particular flowdown diagram to
     identify and indicate how the subissues are contributing to the total
     system performance assessment.
          From this point on, to avoid duplication and keep a
      consistent set of acceptance criteria and review methods, starting
      from FY2000, all the acceptance criteria and review methods will be
      in the Yucca Mountain Review Plan, and we intend to use this lowest
      tier as the integration tool to help us systematically integrate all
      the information from the various key technical issues into the plan.
         The status of issue resolution will continue to be
     documented in the IRSR.
          DR. McCONNELL:  If I could, Christiana, this is where we
      are implementing the TPA effort into the Yucca Mountain Review Plan.
      This lower tier on this diagram, which you should be pretty familiar
      with, is where we used the information from our reviews of DOE
      documents as well as our own IPA efforts to identify those elements
      of the performance assessment that are most important.  So here's how
      we're integrating the TPA effort into this process.  And you'll see
      more of it as Christiana goes on.
         MS. LUI:  Well, actually, Keith, you have jumped ahead --
         DR. McCONNELL:  Sorry.
         MS. LUI:  To Slide Number 12.
         DR. McCONNELL:  Okay.
          MS. LUI:  Okay.  The bottom tier here, we have renamed as
      integrated subissues.  It seems everybody's a lot more comfortable in
      terms of talking about subissues.  You may also know the lowest tier
      as key elements of subsystem abstractions, or KESAs.  But in the
      review plan we will not name them KESAs; rather, we would like to
      call them integrated subissues, because here we are really
      integrating all the efforts that have been taking place by the staff
      up until this point.
          And these integrated subissues are really developed from a
      top-down approach, based on, as Keith has indicated, review of DOE's
      TSPAs, knowledge of current design options and site characteristics,
      and the staff's IPA work, which basically relies upon the TPA code a
      lot.
          And they are the integrated processes, features, and events
      that could impact system performance.  This particular framework
      provides the KTIs an integration tool to describe their contributions
      in the context of total system performance assessment, and it also
      helps us facilitate integration at the technical staff level.
         For example, many KTIs require interaction with other KTIs
     in evaluating repository performance such as for waste package
     corrosion, which is basically the very first box on the lowest tier on
     page 11 of the flowdown diagram.
          Under the current KTI division, the waste package corrosion
      issue cannot just be addressed by any one single KTI.  It's a
      concerted effort between thermal effects on flow, the near-field
      environment, and container lifetime and source term, and of course we
      use the TPA code under the TSPA KTI to help us evaluate the waste
      package corrosion issue.
          On the next page, after giving you some background
      information, now I would like to give you some more detail in terms
      of the performance assessment review, which is basically Chapter
      II.B.1 under the outline for the Yucca Mountain Review Plan.
          In this particular chapter we have basically four
      subgroups:  description and demonstration of multiple barriers,
      scenario analysis, model abstraction, and demonstration of the
      overall performance objectives.
          We fully expect that DOE will do analyses to show how they
      are taking credit for the various features, events and processes, or
      a combination of them, to satisfy the multiple barriers requirement;
      therefore, up front we would like them to summarize what they have
      done, what they are taking credit for in terms of demonstrating the
      multiple barriers requirement.  That will also help the Staff focus
      our review in terms of what DOE is relying upon.
         The next step is scenario analysis.  Basically here we will
     expect DOE to clearly identify what they have taken into consideration
     in their post-closure safety case and what they have excluded and what
     are the technical bases for inclusion or exclusion.  In other words,
     this part will set up the boundary condition for the performance
     assessment.
          After that part is done, then we look at the model
      abstraction, what has really gone into the calculations, and here are
      the 14 integrated subissues that we have shown on the lowest tier of
      the flow-down diagram.  These basically help the Staff in focusing
      our effort and in focusing and integrating the information that we
      have published in the IRSRs.
         Lastly, after reviewing all the previous three parts, we
     will make a determination to see if DOE has truly demonstrated they have
     met the overall performance objectives in terms of dose requirement and
     the multiple barriers requirement.
         Here is also the place where we will be developing
     acceptance criteria review methods in terms of the transparency and
     traceability of DOE's analysis.
         I know this is a very, very busy slide but I just wanted to
     keep everything together rather than have you flip back and forth.
         DR. GARRICK:  I understand.  I like to do that once in
     awhile myself.  I am just trying to interpret it in terms of, say, the
     multiple barriers and where you really address those.
         MS. LUI:  The multiple barriers would be the very first
     group.  There are three technical requirements associated with multiple
     barriers.  The first part is to require DOE to identify what barriers
     they are taking credit for, and the second part is to provide the
     technical basis, and then -- sorry, I forgot what's the third part.
     Tim?
         MR. McCARTIN:  The middle one is the capability and the last
     one is the basis for the capability.
         MS. LUI:  So basically the technical requirements are laid
     out in the proposed Part 63.  What we will do in the Review Plan is to
     develop acceptance criteria and review procedures and most likely we
     will also provide guidance in terms of what DOE can do to demonstrate
     multiple barriers.
         DR. GARRICK:  Okay.
         MS. LUI:  Okay?  The next three pages basically highlight
     what I talked about briefly five or 10 minutes ago:  the crosswalk
     that all the KTIs have done to relate their subissues to the integrated
     subissues.  These are the existing relevant KTI subissues.
         As we go through this process ourselves, we may find that
     there will be knowledge and expertise needed to bridge the gaps for
     integrated subissues that currently do not have a KTI subissue
     associated with them.  This will help us to focus our efforts by using
     a top-down approach.
         You probably have seen this particular crosswalk in one form
     or another when you received the IRSRs from us, because all the KTIs
     have basically done this particular crosswalk, and here it is just a
     summary of all the information that has been published previously.
         The next two pages, pages 17 and 18:  as I have mentioned
     previously, because we are trying to implement this performance-based
     approach, we need to provide the licensees the flexibility in terms of
     how they intend to address their safety case.
         In the viability assessment, in TSPA VA DOE basically has
     identified 19 principal factors that are of greatest importance to
     post-closure performance.  I just want to demonstrate that the
     integrated subissue approach will allow us to review DOE's safety case
     that's built on these 19 principal factors.
         Even though there is not necessarily one-to-one
     correspondence, we do have every single principal factor covered in the
     proposed approach for the Yucca Mountain Review Plan.
         DR. McCONNELL:  Can I just add that we intend to work with
     the Department of Energy over the next couple of months in trying to
     come closer together perhaps on matching their elements with our
     integrated subissues.  In fact, we have talked to them briefly and they
     seem to be willing to do that, so there may be more coalescence as we
     move on.
         DR. GARRICK:  But for now they have all been accounted for.
         MS. LUI:  Yes.  I would like to point out that the 19
     principal factors do not address disruptive events, but in DOE's safety
     strategy, they do have attributes that address disruptive events, and
     we do have integrated subissues that will review the disruptive events
     processes, such as the mechanical disruption of waste packages and
     igneous activity integrated subissues.  So even though you don't see
     the disruptive events here, they will be included.
         And, finally, the third part:  advantages of the approach. 
     Review of both the pre-closure and post-closure safety cases is
     performance-based, because from the very beginning we set out to
     determine whether DOE has met the performance objectives, and we use
     the top-down approach to evaluate the license application.  It will
     encompass all the related activities, from site characterization and
     experimental work all the way up to performance assessment.
         By using the top-down approach, the iterative cycle of
     performance assessment and data collection is clearly and closely
     maintained.  And in this particular framework we can clearly indicate
     why DOE's supporting data are acceptable or deficient in the context of
     how that piece of information has been used to support DOE's safety
     case.
         And as we go through this particular process, as we go
     through the integration of all the acceptance criteria and review
     methods currently in the IRSRs, we will attempt to minimize any
     duplication and also modify or eliminate, possibly -- I would like to
     emphasize the word possibly -- overly prescriptive acceptance criteria
     currently out in the IRSRs.  And the requirements under the content of
     license application, which is paragraph 63.21, and the requests for
     additional information, RAIs, are clearly justified in this particular
     context.
         And we certainly hope that by implementing this particular
     framework, it will lead to a streamlined, transparent and integrated
     review plan.
         And on the last page are the seven activities that we have
     identified related to the development of the Yucca Mountain Review Plan
     that will be of interest to the committee.  We are planning a technical
     exchange in the last week of May to talk about the Yucca Mountain
     Review Plan with DOE.  Our attempt is to reach an understanding with
     them, and also to work on possibly having a consistent approach between
     the review plan and the license application, leading also to a
     consistent Safety Evaluation Report at the end.
         Currently, the staff is working on Revision 2 of the IRSRs,
     and all of them are scheduled to go out of the NRC by September 30th,
     1999.  The materials in the IRSRs will be appropriately integrated into
     the Yucca Mountain Review Plan or be referenced by the review plan. 
     The acceptance criteria and the review methods will certainly be
     integrated into the review plan.  However, some of the technical basis
     sections will probably be left in the IRSRs and will be referenced by
     the Yucca Mountain Review Plan.
         Activities 3, 4 and 5 are interrelated.  We are currently
     scheduled to provide to the Commission the final Part 63 package and
     also an annotated outline of the Yucca Mountain Review Plan by November
     30th, 1999; however, a possible postponement to February 2000 is
     currently being discussed.
         But in any case, once we have submitted to the Commission
     the final rule package and after the Commission has approved that, we
     intend to have public meetings with the state and county and also
     interactions with DOE to present and clarify the final Part 63 and also
     the accompanying Yucca Mountain Review Plan.
         The Rev. 0 of Yucca Mountain Review Plan is currently
     scheduled to go out on March 31st, 2000, but be aware that, as Keith has
     mentioned, the pre-closure part is not as mature as the post-closure
     part at this point.  So in the Rev. 0 Yucca Mountain Review Plan, it
     will contain some "to be determined," or "to be developed" sections. 
     But as we proceed along the line, those sections will be filled in.
         And we will publish future revisions of the Yucca Mountain
     Review Plan before the key DOE milestones such as site recommendation
     and license application.  And during this entire process, we will come
     back to the committee to brief you on our progress and seek advice as
     appropriate.
         This basically concludes the formal part of my presentation.
         DR. McCONNELL:  Okay.  What Christiana does have at the back
     are some backup slides that kind of go through one of the integrated
     subissues and define how we are starting to bring the acceptance
     criteria from the various KTIs into the integrated subissues, and I'm
     not sure if you want to go through that in this forum or just have it
     for your reference.
         DR. GARRICK:  Well, I think for now we'll use it as
     reference.
         One of the things that the Committee has been anxious to
     resolve, and this begins to address that, is the interrelationship of
     the KTIs and the TPA, and the ability for us to be convinced of what's
     really driving what.  And I suppose that we're going to have to, you
     know, see some detail in order to understand that.
         But the one concern we have had is that the KTIs would
     assume a sort of a sacredness of level with respect to being an issue,
     and whether or not the iterations of the TPA would indeed be able to map
     to the KTIs in a way that would importance rank them, for example, and
     keep their importance in some sort of order in relation to a risk
     perspective, which kind of brings me to some overarching observation. 
     You do put a lot of effort in this presentation with respect to it being
     performance-based.  Not much is said about it being risk-informed except
     indirectly as it relates to the performance assessment.
         And I guess I'm just raising the question, is the risk
     perspective really in the review plan?
         MS. LUI:  I believe it is, because we have put a lot of
     focus in terms of what's the expected evolution of the repository.
         DR. GARRICK:  Um-hum.
         MS. LUI:  Basically staff's position is this is what we
     anticipate to happen sooner or later.  So in that perspective the
     probability is close to 1, except for the disruptive events which we do
     have screening criteria set up in Part 63 to talk about the probability
     part.
         DR. GARRICK:  Um-hum.
         DR. McCONNELL:  I'd just add that implicitly -- I think what
     Christiana's pointed out is implicitly it is in what you've seen today,
     and it's explicitly -- implicitly in the sense that in the integrated
     subissues basically the approach we've been taking in evaluating DOE
     submittals like the VA was to look at the integrated subissues, look at
     their contribution to the risk in the sense of the PA, in the PA
     calculation, and use that as a guide for where we would concentrate our
     review.  So in our own sensitivity studies where we look at the risk
     sensitivity of the various integrated subissues, we've used that to
     focus it so implicitly it's in this diagram and in the review plan.
         We are I think now embarking -- and the KTIs are embarking
     on a path where they are using risk-informed to look at their own
     subissues within a KTI, and that then will flow up.  This is one case
     where things will flow up into the PA where we will concentrate our
     effort when we look at our own code and also when we look at perhaps the
     identification of other integrated subissues.  So I guess the message
     I'm trying to say is it's implicitly in here, and as we go along this
     process we'll probably do more to explicitly bring it into the review
     plan.  And I think you'll see that.
         DR. GARRICK:  Well, maybe the other Committee Members can
     see it more clearly than I.  I'm still struggling with trying to
     understand how the risk perspective really is a management tool for the
     review process.  You talk a lot about a top-down approach.  My idea of a
     top-down approach would be a series of scenarios that would characterize
     the risk of the repository and then a ranking of those scenarios in
     terms of their contribution to the overall risk, and then the fallout
     from that of these items in these lower boxes in terms of their
     contribution to first the scenario, the risk of that particular
     scenario, and, second, the risk of the aggregation of scenarios.
         I'm still looking for something that gives me a comfortable
     feeling that there's a real genuineness here in the implementation of a
     risk approach to determining the performance, the safety performance of
     the repository.  And I hear and see a lot of things that look like
     you're trying to do that.
         But as I say, it is not sufficiently transparent to me yet
     to really be convinced that that kind of thought process prevails, and
     I yield to the other members to comment.
         DR. HORNBERGER:  Well, I am just curious whether or not
     except for disruptive events we have a scenario driven approach.
         DR. McCONNELL:  Well, I think we do, but maybe Tim can --
         MR. McCARTIN:  Well, I don't know if this will scratch the
     itch or not, but let me try.
         I think, as Keith and Christiana have both said, the KESAs
     or those subissues are the areas that we believe, based on our analyses
     of the site with TPA as well as all the KTIs and the information they
     have had, need to be addressed.  The risk part of it gets tricky
     because DOE is the one that is going to demonstrate performance, and
     they have a variety of ways to deal with those subject areas.
         For some of those subject areas they could say, we are going
     to do a bounding analysis here or use a bounding parameter because it
     doesn't have much impact on the repository performance -- but that is
     their demonstration.  They have that flexibility in looking at the
     risk, in the rigor that they will attach and the depth of detail. 
     Other than having to address those issues, how they address them I
     think is getting to your risk part, and it is their demonstration.  Our
     review will have to be flexible enough that if it is a bounding
     analysis there is one way we will look at the information, versus
     detailed information, et cetera, and hopefully we will try to capture
     that in the acceptance criteria.
         DR. McCONNELL:  But we do try to use a risk basis for
     judging where we place the emphasis in our reviews and that is based on
     our own sensitivity studies, notwithstanding the fact that DOE does have
     the responsibility for making the safety case.
         DR. HORNBERGER:  No, I mean I think that is -- that much is
     clear to me and I think that you are doing exactly what you said.
         I was referring to -- I was trying to get my hands around
     what John suggested in a risk approach, which was a top-down approach
     where you started by defining a whole suite of scenarios that would
     lead to a failure, if you like, and I am not sure that we have that.
         I think that, as Christiana said, this is going to happen,
     and we have sort of one scenario, and the scenario is that it is going
     to happen, that we are going to get dissolution and migration and
     transport and everything else.  Well, correct me if I am wrong.
         MR. McCARTIN:  Well, DOE will have the job of pulling
     together all that.  We will review that for completeness but I don't
     know --
         DR. HORNBERGER:  I'll rephrase it.  From what you have seen
     DOE do so far, do you anticipate having a scenario-driven approach to
     review?
         MR. McCARTIN:  Yes.
         DR. HORNBERGER:  Okay, and give me some examples of the
     suites of scenarios that you anticipate, excluding disruptive events. 
     I understand that one.
         MR. McCARTIN:  Excluding disruptive events?  Oh, you mean a
     base case that primarily -- has more than one scenario?  Well, to
     date, I would say the base case that is being analyzed has the
     uncertainty involved, be it seepage, be it the amount of fracture
     flow, somewhat encompassed in the uncertainty of that base case.  Now
     I guess if you want to call that one scenario, I suspect that --
         DR. HORNBERGER:  I'm just trying to understand.  I don't
     mean to argue with you.
         MR. McCARTIN:  Right, right.
         DR. HORNBERGER:  I am just trying to see if I understood
     John's question and your response in terms of it and I understand what
     you are saying.
         MS. LUI:  I think the closest that we have come in terms of
     separating out scenarios probably would be in the area of alternative
     modelling approaches, because there are various approaches to looking
     at, for example, how a waste package could fail.
         In our Review Plan we will require DOE to look at all the
     credible alternative approaches, but I don't believe we will ever get
     to the point of assigning probabilities to those models.  The best we
     can do is to look at the technical basis that has been provided with
     those alternative approaches and look at which particular approach DOE
     has decided to use in supporting their safety case, and the Staff's
     evaluation will look at how defensible that is when we do our safety
     evaluation.
         I think that's probably what Dr. Hornberger was driving at.
         DOE at one time was going to give us everything together. 
     In other words, they were not going to separate out whether they had
     two or three different model approaches.  They were just going to
     combine everything and give us the final results altogether, and we
     have stressed to them over and over that we want to see the results
     presented separately, along with the technical bases that have been
     used to support them, so we will see that in the Review Plan.
         DR. McCONNELL:  Where do we stand on answering your
     question, Dr. Garrick?
         DR. GARRICK:  Well, all of it is somewhat helpful.  I am
     sure it is a question I am going to continue to struggle with for awhile
     and we will discuss it in future sessions.
         I think that one of the problems is that I have a certain
     notion of what constitutes a risk approach and I am having to back away
     from that considerably to appreciate fully what you are doing here, and
     it is going to take a little time to be convinced that the mapping that
     you have done is indeed done from the point of view of the right logic
     engine being in control, and the logic engine that I am most interested
     in is the one that is driven by what is most important with respect to
     the performance parameter.
         In this case it will be a dose standard of some sort, so I
     think that what you have attempted to do here in decomposing the KTIs
     into subissues and matching them up with both your own and DOE's
     descriptions is an important step.
         And it's part of the overall puzzle.  Whether or not it's
     enough to really let the important contributors drive the process, I
     don't know, I'm going to have to convince myself of that.
         DR. McCONNELL:  Okay.  And in succeeding briefings we'll try
     to emphasize that.
         DR. GARRICK:  Charles?
         DR. FAIRHURST:  Yes, it's quite a lot to swallow at one
     time.  Let me ask almost a separate question.  I'm particularly
     interested in what factors control, and what uncertainties there are
     in, seepage into the drift.  Nothing going beyond -- but if I wanted to
     know, for example, infiltration rates, your model for fracture flow,
     matrix flow, et cetera, thermal-mechanical effects around the
     excavation.  It would help an awful lot in understanding the approach
     to the engineered barriers if one could first understand that subset of
     the problem.
         Would you have, through this, if I were to ask you, an
     independent way of presenting, you know, a response to that question,
     without waiting for DOE to give you sort of its read on it and then
     saying it looks reasonable to us?
         I mean, how independent is your assessment going to be? 
     I've got a lot of other questions besides, but I don't know whether I've
     indicated clearly.  It would help me if NRC rather than just DOE could
     give some input to that earlier on rather than later.
         MS. LUI:  Okay.  I think I've mentioned that.  We all
     recognize that there's a natural tension in terms of who's leading
     whom, in a way, in the performance-based -- I mean risk-informed
     approach.  Our job is really not to do the licensee's work for them. 
     Rather, our job right now is basically to develop the staff's
     capability to review what DOE might come in with.  So our TPA code is
     sort of our attempt to utilize what we know, and the staff's relatively
     independent thinking on what's required to be in the assessment so that
     it will be defensible, to help us down that path.
         DR. McCONNELL:  So I think the bottom line is we have the
     capability right now, in the existing TPA code, to look at that
     particular factor.
         DR. FAIRHURST:  That's right.
         DR. McCONNELL:  But we're waiting for DOE to demonstrate
     what they intend to do, and then we'll review that using this capability
     that we have.  If that isn't too jargonistic or bureaucratic.
         MS. LUI:  And all the interactions that take place between
     the staff and DOE are our attempt to learn as quickly as possible what
     DOE is thinking and what DOE's approach is, and to develop our own
     toolbox in order to be ready.
         DR. FAIRHURST:  Yes.
         MS. LUI:  For that job.
         DR. McCONNELL:  To give you another example, I think we have
     the capability -- and Tim probably could correct me if I'm wrong -- to
     mimic things like drip shields and other factors that might be
     important in water contacting the waste or the waste package or things
     like that.  So we have the capability, and if DOE intends to go that
     way, then we will probably move our program in that direction, so we'll
     be in a position to review it.  But we're probably not going to be out
     ahead of DOE on these particular issues.
         DR. FAIRHURST:  No, I agree, not out ahead, but at least
     giving some insights into weights or various -- I don't want to call
     them risks, but just where the potential is.  For example, to reduce
     uncertainties through engineered barriers or through similar means --
     just to get an idea of what level of infiltration, for example, would
     be such that from that point on it wouldn't likely violate, you know,
     the regulation or standard.
         DR. McCONNELL:  I think we have that capability.  Maybe we
     need to demonstrate what we have.
         DR. FAIRHURST:  I would like to see something indicating
     what that is, for that specific part of it, because I think that from
     that point on if, for example, you can show that you can avoid a
     critical rate of inflow, then things beyond that become a little less --
     I'm not saying that they're not important, but that becomes a particular
     cutoff.
         Let me ask another couple questions.  At some point -- right
     now in a probably very constructive way you're having a dialogue with
     DOE about how they're moving forward and so on, but at what point, and I
     also see that there's a mention of 2001, I thought it was 2002 that
     you're going to get the license --
         MS. LUI:  Okay.  Are you talking about the last entry on the
     schedule?
         DR. FAIRHURST:  I'm talking about safety -- I'm talking
     about license application as you'd get it in 2002, right?
         MS. LUI:  Right.  Right.
         DR. FAIRHURST:  You didn't put 2001.  I heard it earlier as
     a --
         MS. LUI:  Okay, 2000 -- okay, September 30, 2001, that's
     when we plan to publish Rev. 2 of the Yucca Mountain Review Plan. 
     That's about five or six months ahead of the expected license
     application.
         DR. FAIRHURST:  Okay.  So it's the review plan that you're
     going to concentrate on.
         MS. LUI:  Right.
         DR. FAIRHURST:  Okay.
         MS. LUI:  Right.
         DR. FAIRHURST:  And at what point before that do you feel
     that you have to sort of cut off dialogue with them so that -- at some
     point they have to move independently from you, present something that
     you are going to then independently review; right?
         DR. McCONNELL:  Yes, I think that -- I think our position is
     we're doing that now, that they have to do their work to demonstrate
     safety.  We're doing our work to build our capability to have the
     independent review there.  But this can't proceed without some
     interaction.
         DR. FAIRHURST:  Right.  That's what I'm getting at.
         DR. McCONNELL:  And that's where we are.  And I don't think
     at least, I could be corrected, I don't think we anticipate cutting off
     the prelicensing consultations at any point right now.
         DR. FAIRHURST:  Okay.
         DR. GARRICK:  Ray.
         DR. WYMER:  I've been waiting to see some of the criteria
     that you use to evaluate the license application drawn sharply enough
     that anybody who looks at them can decide exactly what they have to do
     in order to satisfy the criteria.  I presume they'll come, in a sort of
     general sense, from the KTIs and the related types of things.
         MS. LUI:  Okay.  I'm not sure if you are familiar with the
     work that has already been done by KTIs such as thermal effects on flow
     and container lifetime and source term.
         DR. WYMER:  In a general sense.
         MS. LUI:  The attempt is to integrate the acceptance
     criteria and review methods currently in those IRSRs as the first step
     towards what's going to be in the review plan.  In fact, if you spend a
     little bit of time looking at the example in the backup slides to this
     particular package, you will probably get a flavor of the path that the
     staff is taking.
         DR. WYMER:  Yes.  It seems to me those criteria are
     fundamentally important.
         MS. LUI:  Right.  Those acceptance criteria are basically
     formulated to address the technical requirements in the rule.  You will
     find a lot more detail in the review methods or review procedures that
     are associated with each of the acceptance criteria.
         DR. FAIRHURST:  With the intrusive events, we'd surely be
     talking about vulcanism and things of this kind, but how are you going
     to deal with -- I heard expert group, but with human intrusion?
         MS. LUI:  Human intrusion is a stylized analysis --
         DR. FAIRHURST:  Of a single hole, right?
         MS. LUI:  Right, a single borehole, right.  And in the -- I
     have a place in mind -- I mean, that's me speaking -- for where it is
     going to appear.  In fact, you can probably see that on page -- I
     forgot which page -- it's page 13 to B-1.
         It did not say human intrusion, but 63.113(d) is a
     post-closure performance objective.
         DR. FAIRHURST:  63.113(d)?
         MS. LUI:  (d) as in "David" is performance objective for
     evaluation of human intrusion.
         DR. FAIRHURST:  And does that stylizing involve penetrating
     a waste package or something like that?
         MS. LUI:  Yes.
         DR. FAIRHURST:  Okay.
         MS. LUI:  So it is incorporated but again this is a work in
     progress so we don't have the detail for you at this point but as we
     develop more detail we will come back and talk to you about it.
         DR. FAIRHURST:  But presumably it would involve bringing
     cuttings to the surface that are radioactive?  It didn't involve any
     flow?
         DR. McCONNELL:  Tim?
         MR. McCARTIN:  Well, to an extent but the critical group is
     still the same as what is done for post-closure and we would not expect
     any drill cuttings at the surface to have any significant impact at 20
     kilometers away.
         DR. FAIRHURST:  Right, okay.
         MR. McCARTIN:  There isn't -- the stylized calculation uses
     the biosphere critical group assumptions as is done in the PA, so there
     isn't another -- there isn't an attempt to look at for example the well
     drilling crew.  There is not a dose to the well drilling crew.
         DR. FAIRHURST:  Maybe I am naive, but what impact could
     human intrusion have --?
         MR. McCARTIN:  Well, as the NAS recommended, the desire of
     this calculation was to see the resiliency of the repository for the
     critical group and not necessarily trying to look at, say, what would
     happen to the well drilling crew, whatever, so it is looking to see
     would a single penetration of the repository so degrade the performance
     that the critical group would be adversely affected and --
         DR. FAIRHURST:  I see, okay.
         MR. McCARTIN:  My understanding is that part of that was a
     reaction to the experience at WIPP, where a well penetration was the
     only significant source of consequences, and that is pretty much where
     the recommendation came from; but it was looking at the overall
     repository performance in the context of the overall performance
     assessment.
         DR. FAIRHURST:  As I recall, in the TYMS report, there was a
     suggestion that one consider a second well if the effects were not
     additive.
         In other words, if some mechanism for release were generated
     which would not be there with one well -- which is again coming back
     from WIPP.
         MR. McCARTIN:  Yes.
         DR. FAIRHURST:  The E-1, E-2 thing, where two wells cause --
         MR. McCARTIN:  Right.
         DR. FAIRHURST:  But it seems like it is a very different
     kettle of fish.
         DR. GARRICK:  When you talk about a stylized scenario, how
     far can you take it?  Drilling a hole into a canister is one thing,
     drilling a hole into a canister and bringing cuttings up to the surface
     is another thing and drilling a hole and bringing cuttings up to the
     surface and having a rain-out and a stream is a third thing and so on
     and so forth.
         MR. McCARTIN:  Right.  Well, the way we have proposed it in
     Part 63 -- and we will be interested to see what kinds of comments we
     get on the stylized calculation, though we have not gotten any to
     date -- we are assuming a somewhat typical drilling event where the
     cuttings would go up to the surface.
         We would not suggest that the cuttings, say, go down to the
     water table, but we would expect a penetration through the waste
     canister all the way to the water table so you now have a relatively
     fast path of potential ingress of some water from the surface to the
     waste package and to the water table, and you would be examining that
     particular event.
         DR. GARRICK:  So in principle you could have a couple of
     pathways.
         You could have a pathway -- I am thinking of to the critical
     group -- you could have a surface pathway and you could have a
     groundwater pathway.
         DR. HORNBERGER:  A surface pathway airborne?
         DR. GARRICK:  Well, either airborne or liquid.
         DR. FAIRHURST:  Where, on the surface?
         DR. GARRICK:  Well, what happens when it rains?
         DR. HORNBERGER:  In the Amargosa Valley there is no stream.
         MR. McCARTIN:  We are not anticipating any impact from the
     surface cuttings but you're right.  I mean there is a potential.
         DR. FAIRHURST:  There is another big difference between WIPP
     and Yucca Mountain and that is what you put the waste in.
         We're having what? -- 10 to 20 centimeters of stainless steel
     and then C-22 alloy, so for a long period of time the probability of a
     drill ever getting through that is extremely small.
         If you take standard drilling practice, the moment it hits
     anything metallic it wrecks, so --
         DR. HORNBERGER:  Titanium shields.
         DR. FAIRHURST:  There's titanium shield, the lot, you know?
         DR. HORNBERGER:  Steel, thick steel.
         DR. FAIRHURST:  These are 55-gallon drums at WIPP --
         DR. GARRICK:  Yes, but the concentrations between the two
     are widely different.
         DR. FAIRHURST:  I agree.
         MR. McCARTIN:  It certainly is conservative to assume at an
     early time someone can drill through that container.  It would take a
     fairly --
         DR. FAIRHURST:  Aggressive?
         MR. McCARTIN:  -- dedicated well drill team.
         [Laughter.]
         DR. HORNBERGER:  A laser drill in the next century.
         DR. GARRICK:  All right.  Any other comments, questions --
     any questions from the Staff?
         DR. CAMPBELL:  Yes.  One of the areas, when the committee
     reviewed the viability assessment, was that we also looked at Rev. 1
     of the IRSRs, and you mentioned the alternative models that are part
     of the review methods -- whether DOE considered alternative models.
         When I read that in -- there is some version of it in each
     of the IRSRs -- it appeared not to be the intent of the Staff to say
     that they had to use the most conservative conceptual model, so in
     reviewing whether or not DOE has considered appropriately the
     alternative conceptual models obviously you have in your mind some
     concept of probability for the different conceptual models, and I was
     wondering how that is factored in and where it may come from.
         MS. LUI:  That probability or the nonexistence of that
     probability basically depends on our current understanding, our current
     belief of what is most likely going to happen.
         You can always assume the worst case scenario --
         DR. GARRICK:  You've already got it --
         MS. LUI:  -- like, for example, take, say, wind direction,
     the wind rose.  It is not likely that the wind always blows toward the
     critical group, but there are aspects in DOE's analysis where they can
     decide, okay, rather than trying to defend what is or is not going to
     happen, let's just take what we believe at this point is going to be
     the most conservative approach, so there is going to be a knowledge
     dependence there.
         DR. GARRICK:  It isn't in here?
         DR. McCONNELL:  Yes, I think DOE, if they want to, can take a
     bounding approach to particular issues or factors, including conceptual
     models, and so we are not telling them they have to do that, but we are
     also not limiting their flexibility in doing that in particular areas
     where they may not want to narrow the uncertainty.  They may just want
     to bound the problem.  I don't know whether we are addressing your
     question or not.
         DR. HORNBERGER:  I was just going to say it ties in with the
     question that Andy asked, so I can ask it in this context.
         Of course there are similar questions that one can ask about
     all the acceptance criteria, a lot of which are: you are going to
     judge that DOE has presented sufficient data to support the conceptual
     model, or the model for seepage into a drift, et cetera, et cetera.
     It is one of these things where I don't know how to define it, but I
     will know it when I see it, you know?  I will know it is sufficient
     when I see it.  And the question is, as Andy said, with different
     conceptual models, somewhere back there you either have a notion that,
     yes, an equivalent continuum is reasonable for this place, and if they
     have that as their primary model it is okay, or you say, boy, an
     equivalent continuum model just won't do, and if DOE comes forward
     with that, that is not going to be sufficient.  So you have in your
     mind already -- you have to have a subjective probability associated
     with all these things.
         I think this is probably okay as long as the license
     application does in fact come through in 2002 because there will be
     corporate memory here and you people have interacted.  If for some
     reason the license application doesn't come through until 2012, which we
     may laugh at now but it is not all that inconceivable given the history
     of the program, shall I say --
         There will be limited corporate memory and some of these
     ideas may change.  Maybe that's okay, too.
         MS. LUI:  Well, I hear what you're saying, Dr. Hornberger,
     but we all have to understand why the performance assessment is not
     as transparent to outsiders: because there is so much interdependency
     among the various parts.  DOE may elect to do a conservative analysis
     for one part because they are being, quote unquote, as realistic as
     possible in other parts, or they may change their approach.  So there
     will be billions of permutations out there.  It will be extremely
     difficult for us to write down every single thing.
         DR. McCONNELL:  Yes.  And I think we're trying to use the
     issue-resolution status report too as a mechanism for documenting some
     of the thought process that's going along.  So hopefully if it does --
         DR. HORNBERGER:  Which is more important, because, as
     Christiana said -- I agree; I wasn't suggesting that you could write all
     this down.  You can't.  You're doing what you can.  But it's still this
     philosophical question you're left with.  And I think that that's right. 
     If you write down the thought process, that's the best record you can
     have.
         DR. CAMPBELL:  Are you guys going to still brief the
     Committee on sensitivity studies from TPA 3.2 in the near future?  Is
     that still on the books for San Antonio?
         DR. McCONNELL:  I believe it is; yes.  We intend to do that. 
     We have an interchange with DOE at the end of the month, and then
     depending on I think the Committee's agenda, I think we'll try to fit in
     as much as we can.  But I think we'll have to talk to you and find out
     what your schedule's like.
         DR. CAMPBELL:  Because it might be very useful.
         DR. WYMER:  We have been after that for a long time.
         DR. McCONNELL:  Okay.
         DR. GARRICK:  Any other questions or comments from Committee
     staff?  Anybody from the audience?
         If not, we want to thank you.  I don't think at this point
     you're asking for a letter.  Unless we in our discussions of it find
     some issues that we feel very concerned about and want to call to the
     Commission's attention, we'll take this as information for now, and
     expect to hear from you again.
         Okay.
         MS. LUI:  Okay.
         DR. GARRICK:  Okay.  Thank you very much.
         MS. LUI:  Thank you.
         DR. GARRICK:  Let's see what's on our agenda now.
         DR. FAIRHURST:  That's it.
         DR. GARRICK:  It's up to Charles.  Do you want to do your
     reports as we had planned tomorrow when you do the rest?
         DR. FAIRHURST:  Sure.  Yes, because I'll have copies of the
     white paper too, but we can talk about that later.
         DR. GARRICK:  In view of that, and unless there's further
     comment, questions, we will adjourn.
         [Whereupon, at 5:20 p.m., the meeting was concluded.]