
                Official Transcript of Proceedings

                  NUCLEAR REGULATORY COMMISSION



Title:                    Advisory Committee on Reactor Safeguards
                               482nd Meeting



Docket Number:  (not applicable)



Location:                 Rockville, Maryland



Date:                     Thursday, May 10, 2001







Work Order No.: NRC-206                               Pages 1-172





                   NEAL R. GROSS AND CO., INC.
                 Court Reporters and Transcribers
                  1323 Rhode Island Avenue, N.W.
                     Washington, D.C.  20005
                          (202) 234-4433

              UNITED STATES OF AMERICA
           NUCLEAR REGULATORY COMMISSION
                     + + + + +
                   482nd MEETING
     ADVISORY COMMITTEE ON REACTOR SAFEGUARDS
                      (ACRS)
                     + + + + +
                     THURSDAY,
                   MAY 10, 2001
                     + + + + +
                ROCKVILLE, MARYLAND
                     + + + + +
                 The Advisory Committee met at the Nuclear
           Regulatory Commission, Two White Flint North, Room
           T2B3, 11545 Rockville Pike, at 8:30 a.m., Dr. George
           Apostolakis, Chairman, presiding.
           PRESENT:
           GEORGE E. APOSTOLAKIS      Chairman
           MARIO V. BONACA            Vice Chairman
           F. PETER FORD              Member
           THOMAS S. KRESS            Member-at-Large
           GRAHAM M. LEITCH           Member
           DANA A. POWERS             Member
           WILLIAM J. SHACK           Member
           JOHN D. SIEBER             Member
           ROBERT E. UHRIG            Member
           GRAHAM B. WALLIS           Member
           
           STAFF PRESENT:
           JOHN T. LARKINS            Executive Director
                                      ACRS/ACNW
           SAM DURAISWAMY             ACRS
           ROB ELLIOTT                ACRS
           CAROL A. HARRIS            ACRS/ACNW
           HOWARD J. LARSON           ACNW
           JAMES E. LYONS             Associate Director for
                                      Technical Support
           MICHAEL T. MARKLEY         ACRS
           
           ALSO PRESENT:
           RAJ AULUCK                 NRR
           PATRICK BARANOWSKY         NRR
           TOM BOYCE                  NRR
           BENNETT BRADY              RES
           J.E. CARRASCO              NRR
           BOB CHRISTIE               Performance Technology
           EUGENE COBEY               NRR
           JIM DAVIS                  NRR
           BARRY ELLIOT               NRR/DE/EMCB
           JOHN FAIR                  NRR
           HOSSEIN G. HAMZEHEE        NRR
           STEVE HOFFMAN              NRR
           TOM HOUGHTON               NEI
           RANDY HUTCHINSON           Entergy Nuclear
           PT KUO                     NRR
           STEVEN E. MAYS             NRR
           HO NIGH                    OCM/RAM
           DUC NGUYEN                 NRR
           ROBERT PRATO               NRR
           DEANN RALEIGH              LIS, Scientech
           MARK RINCKEL               Framatome-ANP
           MARK SATORIUS              NRR
           PAUL SHEMANSKI             NRR
           JENNY WEIL                 McGraw-Hill
           PETER WILSON               NRR
           STEVEN WEST                NRR
           TOM WOLF                   RES
           GARRY G. YOUNG             Entergy Services
           BOB YOUNGBLOOD             ISL
           
           
           
                                 I-N-D-E-X
           AGENDA                                          PAGE
           Opening Remarks by the ACRS Chairman
                 Opening Statement. . . . . . . . . . . . . . 5
                 Items of Current Interest. . . . . . . . . . 6
           Final Review of the License Renewal Application
            for Arkansas Nuclear One, Unit 1
                 Briefing by and Discussion with. . . . . . . 7
                 Representatives of the NRC Staff and
                 Entergy Operations, Inc. Regarding the
                  License Renewal Application for ANO,
                 Unit 1 and the Associated Staff's Safety
                 Evaluation Report
           Risk-Based Performance Indicators
                 Briefing by and Discussion with. . . . . . .69
                 Representatives of the NRC Staff Regarding
                 the Staff's Draft Document Entitled, "Risk-
                 Based Performance Indicators:  Results of
                 Phase I Development," and Related Matters
           
           
           
           
           
                           P-R-O-C-E-E-D-I-N-G-S
                                                    (8:30 a.m.)
                       CHAIRMAN APOSTOLAKIS:  The meeting will 
           now come to order.  This is the first day of the 482nd
           meeting of the Advisory Committee on Reactor
           Safeguards.  During today's meeting, the Committee
           will consider the following:  Final review of the
           license renewal application for Arkansas Nuclear One,
           Unit 1, risk-based performance indicators, discussion
           of South Texas Project Nuclear Operating Company
           exemption request, and proposed ACRS reports.
                       In addition, the Committee members will
           attend the Commission meeting on the Office of Nuclear
           Regulatory Research Programs and Performance, which
           will be held at the Commissioners' Conference Room
           between 10:30 and 12:30 this morning.
                       This meeting is being conducted in
           accordance with the provisions of the Federal Advisory
           Committee Act.  Dr. John T. Larkins is a Designated
           Federal Official for the initial portion of this
           meeting.  We have received no written comments or
           requests for time to make oral statements from members
           of the public regarding today's sessions.
                       A transcript of portions of the meeting is
           being kept, and it is requested that the speakers use
           one of the microphones, identify themselves, and speak
           with sufficient clarity and volume so that they can be
           readily heard.
                       I will begin with some items of current
           interest.  I'm very pleased to announce that the Board
           of Directors of the American Nuclear Society has
            elected Dr. Tom Kress a Fellow of the Society.  This
           honor recognizes Tom's outstanding efforts in the area
           of nuclear health, safety, and regulation.  It is
           certainly a well-deserved honor, and our Committee is
           fortunate to have members of this caliber.
                       (Applause.)
                       CHAIRMAN APOSTOLAKIS:  Because of the
           unavailability of staff documents, Committee review of
           the South Texas Project exemption request and spent
            fuel pool accident risk at decommissioning plants,
           which was scheduled for this meeting, has been
           postponed to future meetings.  As a result, there will
           be no Saturday meeting this month, and the meeting
           will be adjourned around 4 p.m. on Friday, May 11.
                       I hope the staff recognizes the impact on
           ACRS resources of dropping items from the ACRS meeting
           agenda at the last minute.  The ACRS Executive
           Director has been discussing this concern with EDO.
                       I'd like to draw the members' attention to
            the items of interest, the pink cover.  There are three
            speeches, or comments, by commissioners:  comments by
           Commissioner Dicus at the Texas Women's University
           Honors Convocation on April 19 where she was honored
           as a distinguished alumna of the University; the
           opening statement of Chairman Meserve at the press
           conference that he held on April 26, and remarks, or
           a paper, that Commissioner Diaz gave at the meeting in
            Germany of the Internationale Länderkommission
            Kerntechnik on April 26.
                       Finally, a fourth item of interest is the
           testimony by Mr. Lochbaum of the Union of Concerned
           Scientists on Nuclear Power before the Clean Air,
           Wetlands, Private Property, and Nuclear Safety
           Subcommittee of the U.S. Senate Committee on
           Environment and Public Works.
                       And the first item on our agenda is the
           final review of the license renewal application for
           Arkansas Nuclear One, Unit 1.  Dr. Bonaca is a member. 
           Mario, it's yours.
                       DR. BONACA:  Thank you, Mr. Chairman.  Our
           Subcommittee on Plant License Renewal met with the
           applicant and the staff on February 22, 2001 to review
           the license renewal application.  At the time, we
            noted two things:  One is that the application was
            quite clear and easy to follow, which, on the part of
            the members, facilitated that review.  The second
            issue was that there were only a few open items
            remaining between the staff and the applicant to be
            closed.
                       Because of those two circumstances, we
           recommended to the Committee that we would not have an
           interim meeting, and therefore we did not have that. 
            We are here now to discuss the review of the final SER
           with open items closed.  Therefore, this is really the
           final report regarding license renewal.  And with
           that, I will let the staff and -- actually, I would
            like to let the staff, first of all, initiate the
            meeting.
                       MR. KUO:  Thank you, Dr. Bonaca, and good
           morning to the Committee.  My name is PT Kuo, the
           Chief of Engineering Section of the License Renewal &
           Standardization Branch.  The staff is ready to report
           to the Commission its review of Arkansas Unit One
           license renewal application.  The presentation will be
           made by Mr. Robert Prato this morning.  He will first
           give you an overview of the project, followed by the
           applicant's presentation on its license renewal
           application.  And then Mr. Prato will summarize the
           results of staff's detailed technical review.
                       I would like to just make one observation
           since Dr. Bonaca already mentioned that from this
           review I know of no open items left unresolved.  And
           the remark I would like to mention is that this review
           is about eight months ahead of schedule.  It's
           remarkable; it's very impressive.  I also was told
           that Mr. Hutchinson, the Senior Vice President for
           Entergy Nuclear, would like to make a few remarks
           after I finish my remarks.  And after Mr. Hutchinson's
           remarks, I will then turn the presentation to Mr.
           Prato.
                       MR. HUTCHINSON:  I'm Randy Hutchinson,
           Senior Vice President for Entergy Nuclear.  We're
           pleased to be here today and to be a part of this
           review of ANO Nuclear Unit One's license renewal
           process.
                       We, as you know, followed just behind the
           Oconee application, which is a sister plant, and we
            incorporated a number of lessons learned.  Between
            incorporating those lessons learned from the Oconee
            process, what's been done in the industry, and the
            guidance provided by the Nuclear Regulatory Commission
            in terms of license application format and that sort
            of thing, we were able to put together a license
            application, and as a result of that, one that had
            very few open items and a substantially reduced number
            of requests for additional information.
                       So, to us, this was really a pretty
           pleasant experience.  We found the license renewal
           process to be stable and predictable, and it worked
           very well for us.  And Mr. Garry Young, the Project
           Manager for our ANO project, will be making our part
           of the presentation when we get to that.  Thank you.
                       MR. KUO:  And with that, Mr. Prato?
                       MR. PRATO:  Good morning.  I'm Bob Prato. 
            I work in the License Renewal Branch in NRR.  Before I
            get into the overview, I'd like to inform the Committee
           that we used Oconee as a benchmark for our
           presentation, as we did for the Subcommittee.  We did
           that for a number of reasons.  First of all, Oconee
           and Arkansas Nuclear One are sister plants on the NSSS
           side.  The other reason is they used the same topical
            reports that were used in the review for the Oconee
            license renewal application.  And the third reason is
            that ANO incorporated a lot of the lessons learned
           from the Oconee application.  All of the open items
           that Oconee had, most of them at least, were resolved
           in the application for ANO 1.
                       So as I go through my presentation, I'm
           going to be identifying some items.  Those items are
           the items that ANO 1 captured in their application
           without any concerns as there were for Oconee.  It's
           not an intent to comment on Oconee's application. 
           Oconee did a great job.  They were the first one up at
           bat -- one of the first ones up at bat.  And I just
           wanted you to know they took advantage of the lessons
           learned from the Oconee application.
                        To begin with the overview, the site
            description for ANO is that ANO is a two-unit site
           consisting of a Babcock and Wilcox Pressurized Water
           Reactor and a Combustion Engineering Pressurized Water
           Reactor.  And it's located in Pope County in Central
           Arkansas on Lake Dardanelle.
                       On January 31, 2000, the applicant
           submitted a license renewal application for Arkansas
           Nuclear One, Unit 1, the 2,568 megawatt thermal
           Babcock and Wilcox Pressurized Water Reactor.  Unit 1
           construction began in 1968, and it went commercial in
           1974.  The current facility operating license expires
           in May of 2014.  The facility is similar to Oconee in
           NSSS design.
                       ANO 1 site compared to the Oconee site,
           Oconee is a three-unit Babcock and Wilcox facility. 
           It has a standby shutdown facility, which is unique to
           the industry, which ANO 1 does not have.  And they use
            the Keowee Hydroelectric Dam as their emergency source
           of power.  ANO 1 uses diesel generators as their
           emergency source of power, and they have an emergency
           cooling pond as an alternate source for the ultimate
           heat sink.
                       Comparing the two applications, Oconee's
           application was developed before the standard review
           plan was issued.  Therefore, it was broken down
           basically into five sections.  There was an
           introduction, a scoping, an aging effects section, an
           aging management review section, and a time-limited
           aging analysis section.  The ANO 1 application is more
           consistent with the standard review plan in which they
            combined Sections 3 and 4 of the Oconee application, so there's only
           an introduction, scoping, aging management review, and
           time-limited aging analysis.
                       In addition, they added an Appendix C. 
           And Appendix C are the aging effect tools.  One of the
           concerns with the Oconee application was applying
           consistently the aging effects for the different
           components that were inside containment and outside
           containment.  In the Appendix C, the tools that they
           used resolved that concern.
                       As far as the safety evaluation report,
            ANO 1 only had six open items.  They included a sodium
            hydroxide orifice scoping question; a fire protection
            scoping question; additional information needed in the
            FSAR supplement for, I believe it was, a total of 11
            different items.  There was some concern with the
            Medium-Voltage Buried Cable Aging Management Program;
            there was some concern with the Boraflex, and there
            were some concerns with the trending of the tendon
            pre-stress forces.  We will get
           into all of those specifically as we go through the
           aging management review presentation.
                       At this time, I'm going to turn this over
           to Garry Young, of Entergy, who will cover the
           application.
                       MR. YOUNG:  Thank you, Bob.  My name is
            Garry Young, and I was the Project Manager for the ANO
           1 license renewal application for Entergy.
                       The first thing I'd like to go over with
           you is on slide 4, which is what we call the document
           hierarchy for our application.  The top item on this
           slide shows the actual application itself, which was
           the package that we submitted to the NRC for review. 
           Below that you'll see a list of several documents
           here, which are what we call our on-site documentation
           that was a backup, or supporting documentation, that
           supported the statements that were made in the
           application.  And at the very bottom of that slide
           you'll see the basic breakdown of the different types
           of aging management reviews -- scoping and aging
           management reviews that were done.
                       We broke them into categories.  We had the
           class 1 mechanical reviews, which were based on the
           B&W topical reports.  These are the same reports that
           Oconee used in preparing their application.  The
           second grouping is the non-class 1 mechanical.  These
           are the systems that were not covered generically by
           the topical reports, and we had to review those on a
           site-specific basis.  The third grouping is the
           structural aging management reviews.  Those were based
           on some industry guidelines that were prepared by the
           B&W Owners Group.  And then the next one is the
           electrical grouping, and these were based on Sandia
           aging management guideline documents that were made
           available to the industry.  And those are the major
           categories.
                       In addition to that, we did a TLAA review,
           which was separate from the aging management reviews,
           although closely related.  We also did an
           environmental review, which was part of the Part 51
           review requirements for license renewal.  And then we
           summarized in one document all of the aging management
           programs that were identified in all of these various
           aging management review reports.
                       So, total, there were probably around 50
           engineering reports, individual reports that were
           generated to support the application that was
           submitted to the NRC for review.
                       Okay.  Then on the next slide, on page 5,
           I'd like to go into the -- I'm going to talk through
           each one of the areas of the application, a little
           quick summary on how we did the review that went --
           the results that were documented in the application. 
           And the first part of that is the scoping.  And the
           scoping is based on the rule requirements that
           identify what is to be in scope for license renewal. 
           We used the guidelines from NEI 95-10 to prepare this
           portion of our application.
                       There are three major categories of
           scoping.  The first category is safety-related
           equipment, which is in Part 54.4(a)(1).  There's a
           definition there of what is safety related.  For ANO
            1, we had a site-specific component level Q-list.  And
           this Q-list uses a definition for safety related that
           matches the definition in the 54.4(a)(1).  So we were
           able to go right to our component level Q-list at the
           site and basically print out a list of the equipment
           that was in scope that met the (a)(1) requirement.
                       DR. BONACA:  I have a question.  I would
           like a clarification.  During the Subcommittee
           meeting, you indicated that the scoping and screening
           for mechanical class 1 components was done using the
           B&W --
                       MR. YOUNG:  Yes.
                       DR. BONACA:  -- Owners Group topical
           reports.  Could you expand on that?  Is it the whole
           class 1 components, the mechanical ones were done from
           those topical reports?  Or did you have to use the Q-
           list really to include also the Bechtel components?
                       MR. YOUNG:  We did use the topical report
           as the core of our review, and then we did a site-
           specific comparison against the topical to ensure that
           we were enveloped.  We did have some areas where we
           were different, and so we documented that in our site-
           specific documentation.
                       DR. BONACA:  Because you had a number of
           Bechtel components.
                       MR. YOUNG:  Yes.
                       DR. BONACA:  I believe that they would not
           be identified by the -- or would they be identified in
           the topicals?
                       MR. YOUNG:  No.  The B&W components --
           now, Mark Rinckel is here from Framatome, and he
           helped us with that.  Go ahead, Mark.
                       MR. RINCKEL:  Yes.  This is Mark Rinckel
           of Framatome.  We did include in the RCS piping report
           Bechtel-supplied or AE-supplied piping.  And so what
           we had to do for Arkansas was to show how we're
           bounded, and so we had to reference site-specific
           information.  It was included in the topical.
                       DR. BONACA:  So also the Bechtel
           component.
                       MR. RINCKEL:  That's correct.
                       DR. BONACA:  Thank you.
                       MR. YOUNG:  Okay.  The second category is
           the non-safety-related structure systems and
           components that are part of the 54.4(a)(2).  These are
           non-safety-related components that could prevent the
           accomplishment of a safety function.  For ANO 1, we
           had very few components that fall in this category,
            because our definition of Q, or safety related, would
           include most of these support type systems that are
           sometimes classified as non-safety related.
                       We did have a few, though, that did fall
           in this category.  For example, our category two over
           one seismic supports were in this category and a few
            others.  So we identified some additional equipment
           fell into this (a)(2) category.
                       And then (a)(3) was the final category for
           scoping, which included what we sometimes refer to as
           the regulated events -- fire protection, EQ,
           pressurized thermal shock, ATWS, and station blackout. 
           And here we used our site-specific documentation for
           each one of these reviews and identified the
           structures and components that were relied upon to
           accommodate these regulated events.
                       DR. BONACA:  Just a question:  On the
           seismic two over one, you included not only the
           supports but also the piping segments.
                       MR. YOUNG:  Yes.  Yes.  When we did our
           aging management review for the structural, we
           included -- the way we did the program for evaluating
           the aging effects on the supports and the piping is
           the Maintenance Rule Walkdown Program.  So when they
           do that walkdown, they include both the hangers and
           the piping that the hangers support.
                       DR. BONACA:  Yes.  Because I know it's an
           issue that is being disputed on a different
           application, and I just wonder -- in fact, I don't
           know where the industry is on this.  I mean is it --
           you didn't have any objection to -- just your program
           actually included the segments.
                       MR. YOUNG:  Right.
                       DR. BONACA:  So you didn't have to --
                       MR. YOUNG:  The existing program.
                       DR. BONACA:  -- make an exception.
                       MR. YOUNG:  Yes, right.
                       DR. BONACA:  Okay.  Thank you.
                       MR. YOUNG:  Okay.  And that's the summary
           of the scoping section of the application.
                       On the next slide, on page 6, is the
           screening activities.  Once we had completed the
           scoping, we went through the screening process to
           determine which components in those systems and
           structures that were in scope required an aging
           management review.  Again, we used the material in the
           rule itself and the guidance document that was
           provided by NEI in 95-10.
                       The first effort was to identify the
           passive structures and components that had an intended
            function that required an aging management review. 
           And the definitions for passive and the intended
           functions are covered in the rule.  We applied those
           definitions.  We also then identified those passive
           structures and components that were not subject to
           periodic replacement.  In other words, they were long-
           lived and passive.
                       The screening for the mechanical
           components was, again, done for -- the class 1 was
           done using the B&W topical reports, the same reports
           that Oconee used.  And then for the non-class 1
           mechanical, we did a site-specific review using the
           guidance in NEI 95-10.  For the electrical and the
           structural components, these were also performed on a
           site-specific basis using the guidance of NEI 95-10. 
           Okay.  And that's a summary of the screening process.
                       Then on the next slide, on page 7, we go
           into the actual aging effects identification.  At this
           point, again, it's all split up by discipline.  We use
           the guidance of NEI 95-10.  The aging effects were
           identified for the class 1 components using the B&W
           topical reports.  The non-class 1 was done on a site-
           specific basis.  At this point, we did rely on another
           B&W guidance document, which is sometimes referred to
           as the mechanical tools.  This is the information that
           we summarized in Appendix C of our application.
                       The mechanical tools was a document that
           was created to help ensure consistency when we did our
           aging effects review.  And this document was used for
           -- we had about 25 non-class 1 mechanical systems that
           we had to perform an aging management review on.  So
           we relied upon this B&W guidance document to help us
           go through that process and make sure that we
           consistently identified the same aging effects for
           each system.  That document --
                       DR. BONACA:  I'd like to ask a question. 
           This must have been a pretty time-consuming portion of
           the effort.
                       MR. YOUNG:  Yes.
                       DR. BONACA:  If the GALL report had been
           finalized by the time you were preparing the
           application, would it have been much more efficient to
           use that or would you have used that?
                       MR. YOUNG:  Yes.
                       DR. BONACA:  I'm trying to understand how
           much the process would have been helped by the
           existence of a generic document like the GALL report.
                       MR. YOUNG:  This portion of the process I
           don't think would be any shorter with the GALL report,
           but we would definitely have used it.  It would have
           helped validate the conclusions that we came to.  I
           think the overall intent of the GALL report is for the
           utilities to use to validate the work that is done,
           and then be able to then, with some confidence, go
           forward and say, "Well, this already has been reviewed
           by NRC, so we don't have to worry about new issues
           coming up."
                       DR. BONACA:  It would certainly minimize
           a number of questions.
                       MR. YOUNG:  Yes.  That's the area where I
           think the benefit is, is that once you go through the
           process and use the GALL to validate what you've done,
           then you have some confidence going in with your
           application that that's essentially been pre-reviewed,
           and you know what the questions might be for that.
                       DR. BONACA:  Thank you.
                       MR. YOUNG:  Okay.  The next area was the
           electrical review.  Here we used the Sandia aging
            management guidelines and what's called the spaces approach. 
           This was, again, to help ensure consistency.  The
           Sandia guideline was a basis for that.  This is the
           same type of review that Oconee did and Calvert
           Cliffs, so we were following the examples that had
           already been set for the electrical.
                       And then for the structural and structural
           components, there was another B&W guidance document
           that's sometimes referred to as the structural tools. 
           And that document was used, again, to help us ensure
           consistency as we went through all the various reviews
           of the buildings that were in the scope of license
           renewal.
                       There were several -- at this portion of
           the review, and in our application, there were several
           lessons learned from both Calvert Cliffs and Oconee
            that we applied in our application, and we feel that
            a big part of the reason for the reduction in the
            number of requests for additional information was our
            effort in this particular area to deal, in our
            application, with issues that had come up previously
            on the first two applications.
                       And in addition to that, we got some
           guidance from the NRC in the form of a standard
           format.  And in that standard format for the
           application, we also got some guidance on how to
           present the material based on the review results that
            came out of Calvert Cliffs and Oconee.  And we tried to
            apply those lessons learned from the NRC staff, and we
           think that, too, was a big benefit in reducing the
           number of RAIs.
                       Okay.  Then on page 8 of the slides is the
           aging management programs.  Once we had identified the
           aging effects, we looked at the aging management
           programs that were needed to manage those effects.  We
           identified a total of 29 aging management programs, or
           actually major groupings of programs.  Some of these
           program titles you see here are actually a collection
           of programs.  Out of that, only seven of the 29 are
           new programs.  There were 22 that were existing
            programs that were already in place at ANO.
                       The new programs included such programs as
           buried piping inspection, electrical component
           inspection, some pressurizer examinations, the vessel
           internals program, spent fuel pool monitoring and some
           others.  And they're listed in a later slide.
                       For the existing programs, there were 22
           of those, and those included such things as our ASME
            Section XI In-Service Inspection Program, our Boric
           Acid Corrosion Prevention Program, chemistry control,
           which included primary and secondary chemistry, our
           Preventive Maintenance Program, and this is one that
           included a large number of preventive maintenance
           activities.  So even though it's only listed as one
           program, it includes a large number of individual
           preventive maintenance activities.  And, again, there
           were a total of 22 of those.
                       These 22 programs probably represent aging
           management programs for 95 to 99 percent of the
           components that were in scope.  The seven new programs
           are actually very limited in scope as far as the
           number of components that they cover.  So the existing
           programs cover the majority of the equipment.
                       DR. BONACA:  Mr. Young, one of those
           existing programs is this CRDM Nozzle and Other Vessel
           Closure Penetration Inspection Program.
                       MR. YOUNG:  Yes.
                       DR. BONACA:  I'm sure this is a question
           you were expecting to come today.  And it clearly
            gives you an opportunity to see how effective the
            program that you reference in the application would be
            in light of the recent findings at Oconee and
            those at Arkansas, I believe.  And I would ask that
            you comment on that, and also that you comment
            on possible changes you may have to make to the
           program --
                       MR. YOUNG:  Okay.
                       DR. BONACA:  -- to deal with the findings.
                       MR. YOUNG:  Okay.  Yes, the cracking
           that's been identified at Oconee and at Arkansas in
           the CRDM nozzles was found using our existing aging
           management programs.  The Boric Acid Corrosion
           Prevention Program was probably the lead indicator, at
           least at Arkansas, when we went into the inspection --
           the beginning of the refueling outage, we found the
           boric acid crystals on the head of the vessel, and
           that led to subsequent investigations that identified
           the cracking that had occurred in the CRDM nozzles or
           in the weld.
                       From that, we initiated our corrective
           action program, which is also part of our aging
           management programs to do a root cause evaluation and
           to look at the extent of condition, and to look at any
            needs for modifications to existing programs.  And the
           two programs that may be affected -- or actually will
           be affected by those findings are the Alloy 600
           Program and the CRDM Nozzle Inspection Program.
                 So those activities are currently ongoing. 
           They're part of our existing aging management
           programs, and we expect some modifications to those
           existing programs based on the operating experience
           that we've gained recently.
                       DR. BONACA:  Is this the first indication
           of cracking that you have seen at Arkansas One?
                       MR. YOUNG:  Now, Mark Rinckel is here from
           Framatome.  He's our expert.  I'll let him answer that
           question.
                       MR. RINCKEL:  Yes.  Mark Rinckel, from
            Framatome.  Actually, it's the second; it's the first CRDM
            issue, but the first Alloy 600 issue was in the pressurizer
           nozzle.  It's a partial penetration nozzle that I
           think failed back in 1991.  So it's actually the
           second occurrence at Arkansas.
                       DR. BONACA:  Now, your program, if I
           remember, was referencing inspections of Oconee, and
           then you would perform the inspection based on the
           findings from the Oconee inspection, correct?
                       MR. YOUNG:  Well, I think on the CRDM
           nozzle, we are doing inspections in addition to the
           inspections at Oconee.  We're sharing information --
                       DR. BONACA:  Okay.
                       MR. YOUNG:  -- but we're not dependent on
           Oconee in this particular case.  There are some other
           programs where we are dependent, but in this case we
           are doing our own inspections and then comparing those
           results with Oconee to see if either one of us need to
           change our programs.
                       DR. BONACA:  So you have a commitment to
           inspection at every shutdown for refueling?
                       MR. YOUNG:  Mark, do you know the
           frequency on the inspections for the CRDM nozzle?
                       MR. RINCKEL:  I think that's still being
           determined, but initially, and what's stated in the
           application was that ANO was amongst the least
           susceptible and was not predicted to see any cracking
            until after 48 EFPY.  Once the incident at Oconee Unit
           1 happened, that changed everything, and, as Garry
           said, the program has now changed.  But I think that's
           still being determined what the inspection frequency
           will be.  That hasn't been determined.
                       DR. BONACA:  Because I have an exhibit
           from some presentation from Framatome that shows
           Arkansas to be the one with an inspection at every
           cycle.  That's what I thought.  That's why I asked the
           question.
                       MR. YOUNG:  Yes.  There have been some
           very recent changes, and all of this is being
           coordinated through B&W Owners Group.  So the ultimate
           solution for the inspection frequency, both at
           Arkansas and at the other B&W plants is coordinated. 
           There have been meetings with the staff on that
           specific issue.  The long-term resolution will be the
           findings from the B&W Owners Group effort, and we'll
           incorporate those into our aging management programs.
                       DR. BONACA:  How difficult are these
           nozzles to access for inspections at Arkansas?
                       MR. YOUNG:  They're fairly difficult, yes. 
           You have to get --
                       DR. BONACA:  You do not -- I mean so many
            of the other PWRs have difficulty because they have
            insulation, and it makes it nearly impossible to see
            from outside unless the full insulation is removed.
                       MR. YOUNG:  Well, I believe these
           inspections are on the inside -- the welds themselves
           are on the inside of the head, so I know the
           inspections that we did and the weld repair were done,
           obviously, with the head off the vessel on the
           headstand, and they had to work on the inside of the
           head.  Mark?
                       MR. RINCKEL:  Again, Mark Rinckel from
            Framatome.  With the control rod drive service structure
            that we have, the insulation is not an issue.  We're able
           to see really all the CRDM penetrations with a visual
           inspection.  And I think we differ from Westinghouse
           and CE in that regard.  So being able to see the boric
           acid from the outside is not an issue for us.  And
           we've done safety assessments to show that the cracks
           are predominantly axially-oriented; this is not a
           safety concern for the B&W design plants.  So we
           should be able to see these.
                       DR. BONACA:  Now, just a question I have
           is regarding Oconee 3 since --
                       MR. RINCKEL:  Yes.
                       DR. BONACA:  Oconee 3, when was the last
           inspection they had prior to the February 2001
           inspection?
                       MR. RINCKEL:  As far as visual from the
           outside, I can't answer that.  The initial integrated
           program had Oconee Unit 2 as the lead indicator or
           would be the lead plant, and that inspection included
           a volumetric from the underside of the vessel.  And I
           believe that was somewhere around 1996.
                       DR. BONACA:  I'm trying to understand. 
           This inspection comes and there are up to nine nozzles
           --
                       MR. RINCKEL:  That's correct, yes.  At
           Oconee Unit 3 there are nine.
                       DR. BONACA:  What is the rate of
           development of these cracks?  That's what I'm trying
           to understand.  And to understand that rate of
           development I have to understand the period that went
           between the two inspections.
                       MR. RINCKEL:  Yes.  I think the EPRI model
            that was used to rank the CRDM penetrations is being
           re-looked at and has been completely revised.  And
           they're really looking at Oconee Unit 3.  Everything
           is being normalized now to ONS 3, and I think all of
            the B&W plants will be inspected -- TMI and Crystal
           River 3 as well.
                       MR. ELLIOT:  This is Barry Elliot, NRC. 
           There are two issues here:  CRDM nozzle cracking, and
           there's a susceptibility model which was used to pick
           the worst plants.  There's a new issue that has just
           occurred, which is Alloy 600 weld cracking.  That is
           a problem we're having now.  And that's a separate
           issue.  It is being addressed by the staff currently.
                       As far as the susceptibility model that
            selects the Alloy 600 nozzles for inspection, that model
           was used by this plant in an expanded scope beyond the
           CRDMs, used for other components.  And they have
           identified other components that need inspection.  The
           susceptibility is in question because, as Mark said,
           ANO 1 was not one of the limiting plants, and yet it
           had the cracking.
                       The cracking is probably also related to
           the weld problem, and that weld problem -- the problem
           is that once the crack goes through the weld, the
           reactor coolant now is not -- it is no longer under
            primary chemistry control.  It is now outside the
           confines of the reactor coolant pressure boundary, and
           it doesn't have the same chemistry anymore.  So the
           rate of crack growth is going to change from what we
            -- from what all the models predict.  This is a today issue. 
           It is being evaluated today, and we recently put out
           an information notice on this.
                       DR. SHACK:  But I think Mario's question
           was in the context of the license renewal application. 
            When you have a new phenomenon here where you do have
           the weld cracking, you now have the potential for
           cracking from the OD of the nozzle, the
           circumferential cracking, which is really different
           than what people -- the safety evaluation was looking
           at axial cracking and the conclusions.  But that's
           incorporated into the license renewal process in the
           sense that you're doing this experience update, and
           that's why the staff feels that it can go ahead with
           the approval, even though you really don't know what
           the answer really is going to be at this point.
                       MR. ELLIOT:  Yes.  Our work is through the
           current license and whatever occurs during the current
           license, whatever inspections are going to be
           required, will be carried forward into the license
           renewal period.
                       MR. PRATO:  Part 54 requires that.
                       DR. BONACA:  Although if an issue of this
           nature would come during the extended license period,
           you would have the same ability of working with the
           licensee to develop changes to the program.  So I mean
           this is a -- okay.
                       MR. KUO:  Yes.  That is exactly right, Dr.
           Bonaca.  The regulatory process carries forward into
           the license renewal period.  Whatever the resolution
            is here in today's space will be carried into the license
           renewal space.
                       DR. SHACK:  It just seems a little strange
           at the moment that you're approving an aging
           management program for the drive nozzles when at the
           moment you don't have an acceptable, or you don't know
           whether you have an acceptable aging management
           program.
                       MR. ELLIOT:  Well, I don't think we do
           have an acceptable aging management program simply
           because the cracks went right through.  But we will,
           and that's -- you know, over the long-term that's what
           the goal is, and that's where we're headed.
                       DR. SHACK:  Okay.  But that's a today
           issue, and it will be addressed and will just carry
           over.
                       MR. ELLIOT:  Yes.
                       DR. BONACA:  I'm not sure whether I would
           characterize it as not an acceptable aging management
           program for license renewal, I mean.  Today it is.
                       MR. ELLIOT:  Yes.
                       DR. BONACA:  For license renewal, all I
           need to see is you're flexible enough to incorporate
           promptly changes that result from the findings that
           you have.  I mean we cannot expect that there will be
           no issues arising over the next 40 years of operation
           or whatever.  The important thing is that there is a
           program in place, and it is flexible enough to
           accommodate and to incorporate changes.
                       MR. ELLIOT:  Yes.
                       DR. BONACA:  So you would conclude, too,
           that --
                       DR. SHACK:  I conclude that you're right.
                       (Laughter.)
                       DR. BONACA:  -- for license renewal that's
           an important issue.
                       MR. KUO:  I might also use this
           opportunity to mention that there are other technical
           reviewers here sitting in the audience that they are
           ready to answer any questions you might have later on.
                       DR. BONACA:  Yes.  No, I think it would
            be inappropriate for us to expect a solution to this
           issue right this minute.  We are not expecting that. 
            But, certainly, we want an understanding, from a
            perspective of license renewal, of the extent to which
            the programs which are committed to in the LRA are
            able to accommodate the findings.  And that's really
           proof to us that the programs are effective.
                       MR. YOUNG:  You know, we have the
           enveloping aging management program of our corrective
           action process, our Non-Conformance Program, and that
           applies to all of our individual aging management
           programs, including the CRDM Nozzle Program and the
           Alloy 600 Program.  But if we were to have some
           problem with one of our other programs in the future,
           they too would be subject to that Non-Conformance
           Program, which would include an evaluation of the root
           cause of the problem and corrective action, which
           would possibly include changes to those programs,
           either in frequency or inspection methods or scope. 
           So all of our aging management programs are subject to
           that adjustment as we get additional operating
           experience.
                       DR. BONACA:  Any other questions on this
           issue?  Thank you.
                       MR. YOUNG:  The next slide, on page 9, is
           the time-limited aging analysis.  And, again, this was
           done as somewhat of a separate activity from the aging
           management reviews, but it was also done in
           conjunction with those reviews.  We had a list of the
           TLAAs, which were evaluated.  This list is very
           similar to Oconee.  It included such things as the
           reactor vessel neutron embrittlement, metal fatigue,
           EQ, reactor building tendon pre-stress, and boraflex
           in the spent fuel racks, in addition to some others. 
           So, again, this list was consistent with the previous
           applicants, and we performed our evaluation and
           documented the results in the application.
                       Okay, the next slide, I would just like to
           conclude on the application itself.  We, again,
           utilized a number of the lessons learned from Oconee,
           from Calvert Cliffs, and from the rest of the
           industry.  The number of NRC requests for additional
           information was reduced relative to the Oconee
           application.  We had approximately 265 RAIs for
           Arkansas versus about 350 or so for Oconee.  Again, I
           think this, at least in some sense, reflects the
           application of lessons learned.  We took the RAIs from
           Oconee and tried to address as many as we could in our
           application.  Obviously here there's still room for
           improvement.  We'd like to get that number even lower
           than 265, and I think subsequent applicants will be
           able to do that.
                       On the number of SER open items, we had
            six and Oconee had approximately 49.  Again, we
           applied lessons learned from Oconee to assist us in
           reducing this number and the lessons learned from the
           NRC review of the Oconee application.
                       In summary, the license renewal
            process is stable and predictable, and we
           appreciate the efforts of the NRC staff to help us
           reduce the schedule for the review of this application
           from the original 30-month schedule, which we started
           out with in February of 2000, and we're now down to a
           17-month schedule.  So we really appreciate the
           efforts that went into accomplishing that.
                       And in particular, we'd like to
           acknowledge the effective management of this review by
           Mr. Bob Prato on the safety reviews and Mr. Tom Kenyon
           on the environmental reviews.  Both of these
           individuals were a great contribution to this process,
           and we appreciate their efforts.  And that's all I had
           on the application.  Thank you.
                       DR. BONACA:  Thank you.  Mr. Prato?
                       MR. PRATO:  Okay.  On the safety
           evaluation, again, I'm Bob Prato.  At the end of each
           of the major topics -- scoping, aging management
           review, and time-limited aging analysis -- there is a
           slide on the open items that were identified at the
           end of the first safety evaluation.  The last four
           pages of this handout has the summary of the open
           items and a summary of the resolution of each of those
           open items.  So as I go through this, we'll stop and
           we'll talk about the open items that we found in each
           of these sections.  And both myself and Mr. Young will
           try and answer any questions you might have as to the
           resolution.
                       I'll begin with scoping.  If you remember,
           the Oconee application had a number of questions on
           the scoping.  Both plants, Arkansas Nuclear 1 and
            Oconee Nuclear Station, were originally designed to
            provide barriers to the release of fission products.  However, in
           1987, about that time frame, ANO 1 performed a design
           basis reconstitution.  As part of this design basis
            reconstitution, they revised the Q-list to criteria
            that are consistent with 54.4(a)(1) for safety-related
           components and 54.4(a)(2) for non-safety-related
            components, which can affect safety-related functions.
            They used the accident analysis in the
            UFSAR.  They used the environmental and external events
           in their design basis reconstitution.  They used site-
           specific and applicable industry operating experience,
           and they also used generic communications.  The
           applicant also incorporated lessons learned from the
           Oconee scoping review.  The chilled water system,
           skid-mounted equipment, structural sealants, ANO 1
           ventilation sealants, water stops, expansion joints,
            electrical cables, fire detection cables, and buried
           pipe were all not excluded from the aging management
           review in the original ANO 1 license renewal
           application.
                       ANO 1 aging effects discussed and accepted
           by the staff were consistently applied by the
           applicant based on Appendix C of the license renewal
           application, as discussed previously.  And corrective
           actions, ANO 1 committed to 10 CFR Part 50, Appendix
           B for all license renewal corrective actions, safety-
           related and non-safety-related both.  That includes
           corrective actions, the confirmatory process, and
           document control activities.
                       As far as the open items for scoping,
           initially the applicant did not identify a flow
           control orifice -- I'm sorry, the applicant did not
           identify flow control as an intended function of an
           in-line orifice that controlled the injection of
           sodium hydroxide for pH control.  In resolution to
           this item, the applicant did include the flow control
           function.  And because the orifice is made of
           stainless steel and is subject to cracking, the
           applicant added the orifice to the inspection program
           used to manage other stainless steel components within
           the sodium hydroxide system as their resolution.
                       The second item was fire protection. 
           There were five sets of components that the staff was
           concerned about.  They were the fire protection jockey
           pumps, the carbon dioxide system, fire hydrants, the
           water supply to the low level radwaste building fire
           protection system, and the piping to the manual hose
           station.  The staff considered these to be within the
           scope of license renewal and subject to an aging
           management review.  The applicant took the position
           that these components were never part of the current
           licensing basis.  And the staff felt that it was
           necessary to include them based on the rules under
           Part 50.
                       We had a number of meetings on these
           items.  The final resolution was that the applicant
           realized that even though it wasn't part of their
           initial current licensing basis, the fire protection
           jockey pump and the fire hydrant should be included
           within the scope of license renewal.  And they did
           include them, performed an aging management
           review, and identified aging management programs for
           those components.
                       MR. LEITCH:  When you refer to the fire
           protection jockey pump, are you speaking specifically
           of the casing?
                       MR. PRATO:  Just the casing; yes, sir.
                       MR. LEITCH:  Just the casing.  Okay, I
           understand.  Thank you.
                       MR. PRATO:  As for the other three items,
           based on the applicant's presentation to the staff,
           the staff found that these components were not
           required to be included within the scope of license
           renewal, and therefore this item was closed. 
           Initially, when we started this review and we
           identified these differences, we thought we had
           potentially a Part 50 item, because it wasn't part of
           the licensing basis.  But based on the resolution,
           because both the staff and the applicant agreed what
           should have been included and what shouldn't have
           been, it did not even end up as a Part 50 item.
                       As for the aging management review, aging
           effects, the applicant addressed void swelling in the
           reactor vessel, reduction in fracture toughness of the
           reactor vessel internal cast components by thermal
           embrittlement and irradiation embrittlement, cracking
           and loss of material of letdown cooler tubing, loss
           of material for external ferritic surfaces due to
           boric acid wastage, irradiation-assisted stress
           corrosion cracking for baffle bolts, and cracking of
           reactor vessel internal non-bolted items as applicable aging
           effects.
                       As for intended functions, the applicant
           did include heat transfer as an applicable intended
           function for heat exchangers.  These things were
           already included in the aging management program in
           the initial license renewal application as lessons
           learned from Oconee.
                       As for the aging management review, they
           performed an aging management review on all the
           service water piping, including the copper, brass, and
           ductile iron, et cetera, all the materials that are
           within the scope of the license renewal.  But they did
           not perform an aging management review of the tendon
           gallery in the license renewal application, consistent
           with the staff's conclusion on the Oconee application
           review.  They did not perform an aging management
           review of the pressurizer spray head, contrary to
           Oconee, which did end up performing an aging
           management review of the spray head.  ANO 1 does not
           use it for their accident analysis at all, and
           therefore it was not within the scope.
                       As for aging management, the applicant
           used performance monitoring consistent with Generic
           Letter 89-13 for managing fouling in the service water
           system.  Cracking of Alloy 600 and Alloy 82/182 will
           be monitored during the period of extended operation. 
           And aging of small-bore piping will be managed by
           risk-informed methods used to select reactor coolant
           system piping welds for inspections.  These are all
           differences between ANO 1 and Oconee.
                       DR. BONACA:  This is an existing program?
                       MR. PRATO:  Excuse me, sir?
                       DR. BONACA:  Is this small-bore piping
           management risk-informed --
                       MR. YOUNG:  Yes.  It's a fairly recent --
           it was a change.  We just, in the last couple years,
           switched to the Risk-Informed In-Service Inspection
           Program, and that was when we included the small-bore
           piping at that point.
                       MR. PRATO:  That's been reviewed and
           approved by the staff as well --
                       MR. YOUNG:  Right.
                       MR. PRATO:  -- independently of this
           effort.
                       DR. SHACK:  Okay.  So you had the small-
           bore piping when you did go to the risk-informed
           inspection.  You included it rather than as part of
           the license renewal, it was actually --
                       MR. YOUNG:  Right.  Right.  Right.  We had
           already gone to the small-bore piping inspection as a
           result of the risk-informed ISI, which was prior to
           doing our license renewal review.  So we're able to
           take credit for that.
                       DR. BONACA:  Why would you do that, I
           mean, technically?  Some other applicants claim that
           they don't need to inspect small-bore piping.
                       MR. YOUNG:  Well, if you haven't gone to
           the risk-informed ISI, then you would not include the
           small-bore piping under Section XI requirements.  They
           currently do not require you to do a volumetric-type
           inspection, just a visual inspection.  But during the
           risk-informed review, and that's very plant-specific,
           we did identify some locations of piping welds that
           met the criteria for both risk and susceptibility, and
           we did include them for volumetric inspections.  So I
           don't think very many plants have gone to risk-informed
           ISI yet, which is part of the reason for the issue.
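                        [Illustrative aside:  the selection logic
           Mr. Young describes -- a weld is picked for augmented
           volumetric inspection only if it is both
           risk-significant and susceptible to a degradation
           mechanism -- can be sketched roughly as follows.  This
           is a minimal sketch in Python; the weld names,
           thresholds, and probabilities are assumptions invented
           for the example, not values from the ANO 1 or any
           industry risk-informed ISI methodology.]

from dataclasses import dataclass

@dataclass
class Weld:
    weld_id: str
    failure_freq: float   # estimated pipe-failure frequency per year
    ccdp: float           # conditional core damage probability given failure
    susceptible: bool     # a damage mechanism (e.g., thermal fatigue) applies

def risk_of(weld: Weld) -> float:
    # point-estimate core damage frequency contribution of this weld
    return weld.failure_freq * weld.ccdp

def select_for_volumetric(welds, risk_threshold=1e-8):
    # keep only welds that meet both the risk and susceptibility criteria
    return [w for w in welds if risk_of(w) >= risk_threshold and w.susceptible]

if __name__ == "__main__":
    population = [
        Weld("SW-101", 1e-5, 1e-4, True),
        Weld("RC-204", 1e-6, 1e-2, True),
        Weld("MU-307", 1e-7, 1e-5, False),
    ]
    for w in select_for_volumetric(population):
        print(w.weld_id, f"{risk_of(w):.2e} per year")

           [Under these assumed numbers only RC-204 is selected;
           an actual application would rank welds using
           plant-specific PRA results and damage-mechanism
           evaluations.]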
                       DR. BONACA:  But given your findings,
           wouldn't that suggest that maybe one-time inspection
           for other applicants is not sufficient?
                       MR. KUO:  Well, there's -- as you know, in
           the GALL report right now, we do require a one-time
           inspection for the small-bore piping, but this issue
           is continually under review.  And I believe that the
           industry also has the MRP Program, which also uses
           risk, and they have concluded that something should
           be done.  And they are about to make recommendations
           to the code body.  So if this materializes later on, the
           staff will certainly incorporate lessons learned from
           these activities.
                       DR. BONACA:  Thank you.
                       MR. PRATO:  Okay.  As for the open items
           identified during the first safety evaluation, there
           were two for the aging management review.  The first
           one was a summary of 11 different aging management
           programs that needed additional information to be
           included in the FSAR supplement.  Each one of those 11
           items is identified in the attachment at the back,
           along with the additional information that they agreed
           to put into the FSAR supplement.
                       If you get an opportunity to look at the
           operating license, we do not have a license condition
           for the FSAR supplement.  The reason is that, based on
           the findings of this Committee at that point, the
           applicant has agreed to incorporate the supplement
           into the FSAR prior to the Commission decision.  So an
           open item license condition wasn't needed at that
           point.  So that supplement will be part of their FSAR
           prior to the Commission making their decision and
           issuing the new license.
                       The other open item, the applicant did not
           identify an aging management program for buried,
           inaccessible medium-voltage cables exposed to
           groundwater that are within the scope of license
           renewal and subject to an aging management review. 
           When we identified this, the applicant looked at their
           aging management review and incorporated it.  As a
           resolution, they offered something a little bit more
           unique than Oconee.  They offered to do either what
           Oconee did, which is to do some sort of a measurement
           on the cabling to try and identify if the insulation
           is breaking down and to monitor the water that these
           cables are exposed to, or they will do a periodic
           replacement of those cables.
                       The reason they chose to take that second
           option is because they've had three failures on-site,
           and each time they did do Megger testing not too long
           before the failure had occurred.  And if something is
           not developed that would accurately identify
           degradation of the insulation far enough in advance
           so that they could prevent the failure from happening,
           they agreed to just go through a periodic replacement
           based on plant-specific and industry experience.
                       Did I explain that accurately, Garry?
                       MR. YOUNG:  Yes.  Right now we're
           evaluating basically the qualified life of this buried
           cable.  It's non-EQ, obviously.  It's outside of the
           EQ Program because it's not in a harsh environment
           relative to EQ.  And we have had some failures.  So
           we're looking at now determining whether or not we can
           come up with a qualified life based on operating
           experience that would warrant just doing a periodic
           replacement or do the inspection.  As the inspection
           results get better, we may choose to use inspections. 
           Or if they don't get better, we may choose to do
           periodic replacement.
                       DR. BONACA:  This issue, too, will have
           some generic implications?
                       MR. KUO:  Yes, sir.
                       DR. BONACA:  As to the adequacy of just
           simply doing a measurement?
                       MR. KUO:  Yes.  We certainly would take
           note of that, and we will incorporate any lessons
           learned from this later on.
                       DR. BONACA:  The reason why I'm raising
           this issue is that we see a number of applications
           coming through with different Project Managers.  It's
           not clear how these lessons learned are shared among
           the different project reviews.
                       MR. KUO:  Well, in fact, there is -- we
           have an office letter 805 that describes in detail
           all the procedures that we follow.  So we hope
           that these kinds of lessons learned will be
           incorporated into the official reviews rather quickly.
                       DR. BONACA:  Clearly, for us, it would be
           more difficult in the next application to accept just
           the measurement of the buried cable as a means of
           identifying --
                       MR. KUO:  Well, these issues, like a one-
           time inspection for small-bore piping and the buried
           cables, are all really issues of contention.  They're
           constantly under review, and we certainly will take a
           continuous look at it.
                        DR. BONACA:  And certainly any GALL
           updates for this will be documented.
                       MR. KUO:  Yes, sir.
                       DR. BONACA:  And that's why we've asked
           for frequent updates.
                       MR. KUO:  Yes, sir; we agreed to that.
                       DR. UHRIG:  Would these cables be actually
           replaced or would there just be a new cable put in
           parallel and the old one left in place?
                       MR. YOUNG:  They'll probably be replaced. 
           They're in conduit underground, so they would just be
           pulled out.
                       DR. UHRIG:  They can be pulled?
                       MR. YOUNG:  Yes.
                       DR. UHRIG:  Okay.
                       MR. PRATO:  During the inspection process,
           shortly before we did the aging management review
           inspection, they had their third failure.  And they
           had the cables out on the grounds, and we took a look
           at it.  We also found out at that time that they tried
           to do analysis to find the root cause of the previous
           two failures without any success.  The root cause
           analysis, the laboratory analysis, was unable to
           identify the specific failure mechanism.
                       DR. UHRIG:  Was there moisture in the
           pipes when you pulled the cable out?  Was there
           evidence that there was moisture in there?
                       MR. YOUNG:  There was evidence of
           moisture, yes.  Yes.  Part of the problem we're having
           is that the inspection of the cables is not conclusive
           as to the reason for the failure.  It could have been
           a manufacturing defect that was originally in the
           jacket or it could have been some sort of aging
           mechanism.  But by the time they get them to the
           laboratory for inspection, they haven't been able to
           conclusively identify the root cause.
                       MR. LEITCH:  The testing program you're
           referring to is the Megger Program; is that right?
                       MR. YOUNG:  Yes.  The industry currently
           is evaluating options for testing, but right -- what
           we used was a Megger test.  But through EPRI and
           through some industry efforts, they're looking at some
           other options for maybe other ways to test.
                       MR. LEITCH:  Right.  And it was shortly
           after the Megger, if I understood you correctly, that
           these failures occurred?
                       MR. YOUNG:  Yes.  Probably within 12
           months or so of the previous inspection we had the
           failure, the most recent failure.
                       MR. LEITCH:  Thanks.
                       MR. KUO:  Dr. Bonaca, for the record, I
           just want to correct what I said earlier.  I was
           informed by Mr. Paul Shemanski that the issue actually
           has been captured in the final version of the GALL. 
           I'll let him explain it to you.
                       DR. BONACA:  Okay.
                       MR. SHEMANSKI:  Well, basically, we took
           the information -- actually, this issue started back
           in October of '99, I believe, with the Davis-Besse
           event where medium-voltage cables on the service water
           systems catastrophically failed due to moisture
           intrusion.  These were cables that were in four-inch
           PVC pipes underneath the turbine building floor and
           somehow -- we believe it to be groundwater -- got in. 
           And over time, that water actually migrated through
           these 4160 volt cables into the insulation, resulting
           in ultimate dielectric breakdown.
                       And as such, we took that information and
           the information from Arkansas.  We have incorporated
           that into GALL.  It's in there under aging management
           program for medium-voltage cables, subject to
           significant moisture and voltage.  And we do even
           recommend several tests that might be considered. 
           These are actually used by Davis-Besse -- the partial
           discharge test and power factor test.  We found those
           are more sensitive.  Megger is too gross a test to
           detect insulation degradation.  So I think we've
           captured the operating experience in GALL -- well, I
           don't just think it, we have -- so we're comfortable
           that licensees and future applicants will be aware of
           this issue.
                       DR. BONACA:  Thank you.
                       MR. KUO:  And I also would like to mention
           that the April 2001 version of the GALL has been
           released to the public.
                        MR. PRATO:  Okay.  Time-limited aging
           analyses -- fatigue.  The applicant considered the
           cumulative effects of fatigue for the containment
           liner plate and penetrations, and reactor coolant
           system environmentally assisted fatigue, consistent
           with GSI-190, in the license renewal application
           initially.
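                        [Illustrative aside:  the cumulative
           fatigue usage evaluation referred to here is
           conventionally based on Miner's rule,
           $U = \sum_i n_i/N_i \le 1.0$, where $n_i$ is the
           projected number of cycles of transient $i$ and $N_i$
           the allowable cycles from the design fatigue curve,
           and the GSI-190 environmental evaluation typically
           applies an environmental correction factor,
           $U_{en} = \sum_i (n_i/N_i)\,F_{en,i}$.  This is a
           generic formulation; the specific form used in the
           ANO 1 analyses is not stated in this presentation.]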
                        As for fracture toughness, the applicant
           considered fracture toughness related to the
           acceptability of reactor vessel internals under loss
           of coolant and seismic loads in its reactor vessel
           internals aging management program, consistent with
           the topical report, BAW-2248.
                       For flaw growth, the applicant considered
           flaw growth in accordance with the ASME Boiler and
           Pressure Vessel Code, Section XI ISI requirements, in
           the license renewal application, consistent with the
           topical report, BAW-2248.
                       For neutron embrittlement of the reactor
           vessel, the applicant performed analysis to evaluate
           the impact of neutron embrittlement on reactor vessel
           integrity.
                       DR. BONACA:  I have a question regarding
           the specimen for the vessel.  It wasn't clear to me
           reading the application, you have specific specimens
           for your vessel, Arkansas One.
                       MR. YOUNG:  Yes.  I may need to get with
           Mark here.  I think the specimens for the Arkansas
           vessel I don't believe are in the Arkansas vessel
           anymore.  I think they're in another --
                       MR. RINCKEL:  That's right.  Mark Rinckel,
           Framatome.  Yes, they are being irradiated in Crystal
           River 3 and Davis-Besse Unit 1.  And they're part of
           the integrated program, which is the MIRVP.
                       DR. BONACA:  Thank you.
                       MR. PRATO:  Pressurized thermal shock. 
           The applicant performed an analysis to the criteria in
           10 CFR 50.61 and a Charpy upper shelf energy analysis
           to Appendix K of the ASME Code for the end of the
           period
           of extended operation.
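                        [Illustrative aside:  the 10 CFR 50.61
           pressurized thermal shock evaluation conventionally
           compares a projected reference temperature with the
           screening criteria,
           $RT_{PTS} = RT_{NDT(U)} + M + \Delta RT_{NDT}$, with
           $\Delta RT_{NDT} = CF \cdot f^{\,0.28 - 0.10\log_{10} f}$,
           where $CF$ is a chemistry factor for the limiting
           material, $f$ is the fast neutron fluence in units of
           $10^{19}$ n/cm$^2$ (E > 1 MeV) projected to the end of
           the period of extended operation, and
           $M = 2\sqrt{\sigma_U^2 + \sigma_\Delta^2}$ is a margin
           term; the screening criteria are 270 deg F for plates,
           forgings, and axial welds and 300 deg F for
           circumferential welds.  This is the generic form; the
           ANO 1 values are not given in this presentation.]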
                        Containment pre-stressed tendons.  The
           concrete reactor building tendon pre-stress will be
           managed during the period of extended operation using
           the ASME Section XI, Subsection IWL, In-Service
           Inspection Program.
                       DR. BONACA:  Was this an open item?
                       MR. PRATO:  Yes, sir.
                       DR. BONACA:  Yes, it was.
                       MR. PRATO:  Yes.
                       MR. YOUNG:  Yes.  The issue here was in
           the original application we provided just the
           description of the ASME Program, but there was some
           additional monitoring that the staff wanted to see the
           results or the information regarding.  And it was
           really, I think, more of a miscommunication.  We were
           misunderstanding what the question was, and by the
           time we got to the open item, we finally got down to
           the details and were able to provide the needed
           information.
                       DR. BONACA:  Yes.  You needed to develop
           curves, if I remember.
                       MR. YOUNG:  Yes.  Right.
                       DR. BONACA:  Okay.
                       MR. PRATO:  For reactor building liner
           plate fatigue analysis, the applicant demonstrated
           that the original fatigue analysis is valid for the
           extended period of operation.  For the reactor vessel
           underclad cracking, fracture mechanics analysis
           indicated that the reactor vessel will have adequate
           fracture resistance through the period of extended
           operation.  And for the reactor vessel in-core
           instrumentation nozzles, flow-induced vibration on
           reactor vessel in-core instrumentation nozzles has
           been projected to the end of the period of extended
           operation.
                       DR. UHRIG:  Is that a movable system or is
           that a fixed system?
                       MR. YOUNG:  Mark?
                       MR. RINCKEL:  Mark Rinckel of Framatome
           again.  The nozzles that they're referring to are
           fixed and attached to the bottom of the head.  We
           don't have a system like Westinghouse does with the
           thimble tube.  Our in-cores are actually exposed to
           the reactor coolant, and they move within the guide
           tube and through the nozzles and up into the fuel
           assembly.
                        DR. UHRIG:  It's not like the Crystal
           River system.
                       MR. RINCKEL:  Crystal River is a B&W
           plant.  It is, yes, yes.
                       DR. UHRIG:  Is the in-core instrumentation
           essentially the same?
                       MR. RINCKEL:  Yes.  The in-core
           instrumentation is, but there's not a separate thimble
           tube or pressure boundary.  I mean the in-core itself
           is exposed to the reactor coolant, and it's made of
           different material.  The stainless steel guide tube
           goes from the seal table to the bottom nozzle of the
           -- and the nozzle is attached to the vessel.  And then
           it runs from there up through the internals and up
           into the fuel assembly.
                       DR. BONACA:  Do you inspect these nozzles
           on a periodic basis?
                       MR. RINCKEL:  The nozzles will be
           inspected from the outside in accordance with Section
           XI.  It would be a VT-3 -- I believe VT-3 or VT-2
           inspection.  And then from the internal, it would be
           during when they pull the reactor vessel internals
           out.  So you would look at both from the outside and
           the inside.
                       DR. BONACA:  That would be once every --
                       MR. RINCKEL:  That is correct, yes.
                        DR. BONACA:  And I guess they're less
           susceptible?
                       MR. RINCKEL:  Yes, they are.  In fact,
           those things, if you remember from your history, they
           were repaired.  They initially broke off at Oconee
           Unit 1, and then they were beefed up and repaired at
           all of our plants.
                       DR. BONACA:  Thank you.
                        DR. SHACK:  The wall thinning isn't
           explicitly included as a time-limited aging analysis
           here; is that correct?  It's not treated as a time-
           limited aging analysis?
                       MR. YOUNG:  Right.  We went back and
           evaluated whether or not we had any corrosion
           allowances or wall thinning that was based on time-
           limited aging analysis, and we did not find any in our
           documentation that took credit for that.  So those
           were not identified as TLAAs for Arkansas.
                       DR. SHACK:  So in your Flow-Assisted
           Corrosion Program, you have no measurable thinning in
           your feedwater piping?
                       MR. YOUNG:  No.  No, no.  Okay.  That
           falls in the category of being an aging effect, so
           that is included -- that was identified as an aging
           effect when we did the system reviews.  And we did
           identify the FAC Program as being the program that
           manages that.  The TLAAs were strictly the analytical
           evaluations that were done in the original safety
           analysis to determine the safety of the plant.  So if
           we had had an analysis that showed that we had a
           corrosion or an erosion/corrosion allowance that was
           valid for 40 years, then we would have evaluated here
           to extend it to 60 years.
                       DR. SHACK:  But doesn't the flaw growth
           TLAA include flaws that you would find after -- that
           weren't considered in your original design and then
           you project that life?
                       MR. YOUNG:  Yes.  You're right, yes.  For
           flaws, any time we identify a flaw then we do an
           evaluation for the remaining life of the plant.  And
           those, too, were identified as TLAA.  So, you're
           right, those get identified after the original design.
                       DR. SHACK:  Why wouldn't wall thinning be
           in the same category as the flaw that you find?
                       MR. YOUNG:  We didn't do any analysis to
           project that the walls would remain intact for the
           life of the plant.  When we did the evaluation for the
           FAC Program, we determined that we in fact needed an
           aging management program, not an analytical evaluation,
           to show that it would go the life of the plant,
           because in fact it won't.
                       DR. SHACK:  Okay.  But you mean you do an
           analysis to show that it will go till the next
           inspection.
                       MR. YOUNG:  Yes.  Right.  But those are
           not classified as TLAAs because -- right.
                       MR. PRATO:  One of the criteria for TLAAs
           is that it's projected to the current operating term.
                       MR. YOUNG:  Right.
                       MR. PRATO:  And that brings us to our open
           items.  We talked briefly about pre-stress tendons. 
           There were a number of different graphs that needed to
           be developed, and the applicant provided that prior to
           the final SE.  And the staff found that acceptable. 
           And the second item was the Boraflex Monitoring
           Program.  This is kind of interesting in that the
           applicant initially provided a program similar to
           Oconee.
                       From the time they submitted their
           application to the time that the staff developed a
           request for additional information, they took some
           additional data on that monitoring program.  And they
           found that the -- when they plotted that data, they
           found that the boraflex would not last through the
           current operating term.  As a result, it ended up
           being a TLAA.
                       Under Part 50, they're required to
           maintain a sub-critical margin, and if they can't
           maintain that sub-critical margin, they have to submit
           a plan to the staff for their review and approval.  So
           they felt that it belonged under Part 50, not under
           Part 54 as a TLAA.  And the staff reviewed the
           definition under Part 54 for TLAA and concurred,
           because it is supposed to be for analyses that are
           projected to year 40.
                        Design Engineering Management did not feel
           comfortable with that resolution, removing boraflex as
           a TLAA.  As a result, we spent some time with OGC, and
           OGC concurred with DE Management and said it does not
           necessarily have to be eliminated just because recent
           analysis shows it's not going to make it to 40 years. 
           So what the staff requested is that the applicant keep
           the monitoring program in place until the resolution
           has been identified and the boraflex life and the
           ability to maintain sub-critical margin can be
           established out through the period of extended
           operation.
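                        [Illustrative aside:  spent fuel pool
           criticality analyses of the kind at issue here
           typically demonstrate subcriticality with a criterion
           of the form $k_{eff} + \Delta k_{bias} +
           \Delta k_{uncertainty} \le 0.95$, evaluated at a 95/95
           confidence level, with boraflex degradation entering
           through reduced credit for the neutron absorber
           panels.  This is a generic statement of the acceptance
           criterion; the specific margins in the ANO 1 analysis
           are not stated in this presentation.]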
                       DR. BONACA:  You do have boraflex only in
           one region of your pool.
                       MR. YOUNG:  Yes, that's right.
                       DR. BONACA:  And in the other regions, you
           have Boral or some other material?
                       MR. YOUNG:  I'm not totally up to speed on
           the details of our spent fuel pool, but we --
                       DR. BONACA:  But there's no boraflex.
                       MR. YOUNG:  Right.  We do have some
           regions that have the boraflex and some that do not.
                       DR. BONACA:  Do you already have a plan on
           how you're going to get rid of the boraflex?
                       MR. YOUNG:  We're developing that plan
           right now.  As Bob mentioned, the finding was fairly
           recent, and there are several options to correct the
           situation, and those are being evaluated.  And
           probably within the next two years, we're going to
           wind up with a recommendation to take some action with
           either a different material or --
                       DR. BONACA:  So you still have flexibility
           in your pool to move those assemblies in some
           different location as you --
                       MR. YOUNG:  Yes.  We still have some room
           in the pool for moving the fuel around, yes.
                       DR. BONACA:  Okay.  All right.
                       MR. PRATO:  The next slide, slide 20, is
           just a list of the aging management programs.  If
           anybody has any particular question on any of the
           aging management programs, I'll be glad to answer them
           at this point.
                        DR. BONACA:  I would like to go back a
           moment to the CRDM nozzles.  And the question I have
           is -- I know that Oconee has already committed to
           repairs.  Essentially, the repairs include re-welding. 
           The question I have is about the material being used
           for welding over the cracks.  And I'm sure that
           Arkansas has some
           plan of that nature too.  Is it going to be less
           susceptible to the same kind of failures?  I guess
           what I'm driving at is are these steps that are being
           taken now to repair those cracks going to be -- are
           they being viewed as a permanent repair that should
           not be affected anymore by this phenomenon or is it
           going to be simply another time-limited repair?
                       MR. YOUNG:  Well, I think the answer to
           that is it's still being evaluated.  And I know the
           repairs that were done at Arkansas were different than
           the repairs that were done at Oconee, but I think it's
           part of this evolving process and analysis of what is
           the correct solution, where do we need to go from
           here.  I think in the case of Arkansas, the repairs
           were done with the information that was available at
           the time, which was just within the last couple of
           months.  And they're continuing to do the analysis on
           the findings to determine if -- well, first of all, it
           will change our inspection program.
                       DR. BONACA:  Sure.
                       MR. YOUNG:  So that's definitely a change. 
           And then it may require some subsequent repair actions
           or preventive actions based on the results of those
           analysis.  But that's still being evaluated.
                       DR. BONACA:  I guess what I'm driving at
           is that ultimately the measure of success of the
           program is going to be the ability to prevent an
           occurrence from happening again.  And so right now really
           we don't know if these kind of repairs are going to be
           effective to do that.  I mean we don't, I guess.
                       MR. YOUNG:  That's my understanding.  Now,
           there may be some people here from the staff or from
           Framatome that know more about the details of the work
           that's been done so far.  But I know at Arkansas we're
           fairly early into the analysis.  And like I said, we
           just finished the outage in which we found the
           problem, so I know there's still a lot of work going
           on in that area.
                       DR. BONACA:  I understand that some of the
           materials are being changed, so there is some
           expectation that those changes in materials should
           lead to a different kind of performance, although we
           cannot right now estimate whether or not they will
           prevent these kind of failures from occurring again. 
           And so you have to rely on future inspections.
                       MR. YOUNG:  Right.  And I think, as
           mentioned earlier, the Materials Reliability Program,
           the industry program, is looking at this as well to
           see what changes are needed throughout the industry
           relative to this type of problem.
                       MR. PRATO:  Before I go into the
           conclusion, are there any other questions?
                       MR. LEITCH:  Earlier there was an
           indication that there were 22 existing programs, and
           here there are 28 listed.  Is that just a different
           bean count or is there some significance to the
           difference in those numbers?
                       MR. YOUNG:  Again, the way we count the
           programs is somewhat difficult, because it's a bean
           count issue.  We all have the same list of programs,
           but in the application itself we would have a section,
           and then it would have an A, B, C part.  So it depends
           on whether you count the A, B, C part or just the
           headings.
                       MR. LEITCH:  Okay.
                       MR. YOUNG:  That's really where we're at
           on that.
                       MR. LEITCH:  Thanks.  Just one other
           minor, very minor, comment.  In the SER Chapter 5,
           there's a section about presentation to the ACRS, and
           the number of the ACRS meeting at which those
           presentations occurred is incorrect.
                       MR. PRATO:  I'll verify that.
                       MR. LEITCH:  It's just a nit.
                       DR. BONACA:  That's a good point.  I mean
           this is the first application for which we have not
           had an interim full Committee meeting.  And, of
           course, as I mentioned before, there are good reasons
           for that.  One was the low number of open issues
           identified, and we agree with the staff that there
           were no additional ones.
                       Second, the fact that there was a lot of
           lessons learned, and we actually asked the staff to
           articulate the presentation on the basis of comparison
           to the previous ones so that we could understand that
           wherever we accepted the program for Oconee, then the
           program should be acceptable for Arkansas, unless
           Arkansas presents a better program, which in some
           cases it did.
                       And the reliance on the standard
           application format, actually striving for it.  The
           work that Arkansas did with the NRC I think was very
           helpful, and the reliance on the guidance of NEI 95-10
           made the application, I think, much easier.
                       And I point it out because we have been
           trying to have some demonstrations from repeated
           applications that in fact ultimately the guidance
           documents and adherence to the guidance documents
           and previous experience will facilitate the review and
           improve the applications.  And we, I think, have proof
           here in front of us.
                       MR. PRATO:  Any other questions?
                       Okay.  In conclusion, on the basis of the
           staff's review of the license renewal application and
           the applicant's response to the request for additional
           information and resolution to the open items, as
           documented in the safety evaluation report, the staff
           found that, one, the applicant has appropriately
           identified the aging mechanisms associated with
           passive, long-lived structures and components, as
           required under 10 CFR Part 54 and 10 CFR 54.21(a).
                       Two, the applicant has instituted the
           programs needed to manage age-related degradation of
           these structures and components such that there is
           reasonable assurance that ANO 1 can be operated in
           accordance with its current licensing basis for the
           period of the extended license without undue risk to
           the health and safety of the public.
                       And three, the applicant has analyzed the
           time-limited aging analyses associated with ANO 1,
           consistent with the requirements of 10 CFR 54.21(c).
                       On the basis of these findings, Region 4's
           verification of these activities, and the Regional
           Administrator's recommendation, the staff requests
           that the ACRS provide the Commission with a favorable
           recommendation on the renewing of the ANO 1 operating
           license for an additional 20 years of operation.  And
           that concludes our presentation for today.
                       DR. BONACA:  Okay.  Any questions from the
           members?  Any perspectives you want to share regarding
           the application and the SER?  If none, I would like to
           thank the staff, Mr. Prato and Mr. Young, for well-
           informed presentations.  And I would like to also,
           again, recognize Arkansas for an application that
           facilitated that review.  And I think it's been quite
           effective.  And with that, I thank you very much, and
           I --
                        MR. KUO:  And this concludes the staff's
           presentation.  And what I will take back, I think,
           are three points.  One is that we're going to check
           SER Section 5 and correct, if possible, the
           discrepancy in the number of the ACRS meeting.  And
           the second one is we will monitor the progress of
           aging management for both the CRDM cracking issue and
           the small-bore piping issue.
                       DR. BONACA:  Small-bore piping, yes.
                       MR. KUO:  And with that, of course, we
           will recommend that ACRS write a letter to the
           Commission for approval of the --
                       DR. BONACA:  We will write a letter.
                       MR. KUO:  Thank you.
                       DR. BONACA:  Okay.  Thank you very much. 
           And with that, Mr. Chairman --
                       CHAIRMAN APOSTOLAKIS:  Thank you very
           much.  We were told that the review of the application
           was completed eight months ahead of schedule?
                       MR. PRATO:  Yes, sir.
                       CHAIRMAN APOSTOLAKIS:  And Dr. Bonaca
           completed his presentation half an hour, actually --
           half an hour before schedule.  There must be something
           going on with license renewal issues.
                       (Laughter.)
                       We probably overestimated what it takes to
           review those.  Thank you very much, gentlemen;
           appreciate it.
                       MR. YOUNG:  Thank you.
                       CHAIRMAN APOSTOLAKIS:  As the members
           know, we will meet again at 10:30 in the
           Commissioners' Room to attend to the Commission's
           meeting on nuclear research with Dr. Powers and Dr.
           Wallis leading the charge on behalf of the Committee.
                       Thank you very much, and we'll see you
           back here at 1:30.
                       (Whereupon, the foregoing matter went off
                       the record at 9:55 a.m. and went back on
                       the record at 1:30 p.m.)
                     A-F-T-E-R-N-O-O-N  S-E-S-S-I-O-N
                                                    (1:30 p.m.)
                       CHAIRMAN APOSTOLAKIS:  We're back in
           session.  The next item on the agenda is a
           presentation on risk-based performance indicators. 
           Mr. Mays, the floor is yours.
                       MR. MAYS:  Thank you, George.  Good
           afternoon.  It's a pleasure to be back here before the
           ACRS to discuss our work on risk-based performance
           indicators.  This presentation will be an abbreviated
           version of what we presented to the Subcommittee last
           month.  The Subcommittee asked us to concentrate the
           proposed shutdown, performance indicators, the
           validation and verification, including comparison with
           the current reactor oversight process PIs, and the new
           alternative approaches for risk-based performance
           indicators that we've developed in response to
           internal and external stakeholder comments.
                       So as we did at the last meeting, our
           counterparts from NRR are here to briefly explain the
           relationship between the RBPIs and the reactor
           oversight process.  And the rest of the presentation
           will be our summary of the work that we did to
           establish the technical feasibility of risk-based
           performance indicators as a potential enhancement to
           the ROP.
                       We're seeking a letter from the ACRS
           addressing whether you see this work as a potential
           benefit to the reactor oversight process, whether you
           think our technical approach is feasible, and whether
           you think we should continue to expand and/or add the
           proposed alternative approaches to the Phase 1 report. 
           We issued the Phase 1 report in January.  You have had
           it for a few months now, and we look forward to seeing
           what comments you have on it.
                       Now, Tom Boyce, from NRR, who works in the
           Inspection Program Branch, is here to go over the NRR
           view of the interrelationship between the RBPIs and
           the reactor oversight process.
                       MR. BOYCE:  Thank you, Steve.  As stated,
           I'm Tom Boyce.  I'm in the Inspection Program Branch of
           NRR.  You heard about the Reactor Oversight Program
           yesterday.  I'm a member of the Branch that is
           responsible for that oversight process, and we would
           be the people who would be the users of the risk-based
           PIs.
                       I wanted to start just by talking about
           some of the environment surrounding the risk-based PIs
           and the direction we're going.  In the Commission PRA
           policy statement and in their strategic plan, the
           Commission articulated its intent to move in a more
           risk-informed direction, and we think these risk-based
           PIs are clearly a step in that direction.  We also
           wanted to point out that the current reactor oversight
           process is a significant step in that direction.  We
           think it's much more risk-informed, objective,
           understandable, and predictable than the previous
           oversight process that was in place.
                       We also wanted to point out that industry
           and the NRC have been responsive to larger movements,
           advances in information technology, and the collection
           of data is improving, the transmission of data is
           improving through the use of the Internet and personal
           computers.  And the PRA models, specifically the SPAR
           models under development by the NRC and the PRA models
           that licensees are using, have continued to improve. 
           And so against that backdrop, the climate is more ripe
           for risk-based PIs than it has been at any time in the
           past. 
           Next slide.
                       DR. POWERS:  May I ask a question?  The
           industry, when it does risk assessments, it gets a
           certification from an industry group for the PRA that
           it uses.
                       MR. BOYCE:  The question is do they get a
           certification?
                       DR. POWERS:  Well, I believe they do.
                       MR. BOYCE:  Okay.
                       DR. POWERS:  And what I'm asking is what
           is the equivalent for the SPAR models?
                       MR. MAYS:  Let me answer that, Dana.  The
           SPAR models, the Rev 3 models that we're using for
           this program, we have instituted a process by which
           they get reviewed by the contractor and by us as
           they're being done.  They're reviewed internally by
           NRC personnel when we get them.  And we also have a
           process for doing on-site reviews where we go to the
           plants and look at the as-built, as-designed plant and
           what they've done in their PRAs to identify if there
           are any shortcomings that we've had in that.
                       In addition, we've been using the SPAR
           models for several years now in the Accident Sequence
           Precursor Program.  So whenever we evaluate the risk
           significance of an event or condition at a plant using
           those models, we would send those out formally to the
           licensees to review, and we were getting feedback and
           comment on those during that process as well.
                       DR. POWERS:  So you really don't have what
           I would call an independent review.  And I'd invite
           Dr. Wallis to comment on his experience with people
           saying, "Gee, we've used a code for several years, so
           it must be right."
                       MR. MAYS:  Well, that wasn't the statement
           I made, but I was saying we have had the opportunity
           to get feedback from licensees about the validity of
           models we've used through the ASP Program.  That's not
           a complete review, but it is more than nothing.  And
           we are going to every site for the models that we're
           developing to have those reviewed by the folks on-
           site.  So I think we have a pretty substantial process
           for being able to do that.
                       CHAIRMAN APOSTOLAKIS:  Wouldn't it be a
           good idea, after the Agency agrees to some form of an
           ASME standard, to apply the standard to SPAR?
                       MR. MAYS:  If we were to come up with a
           standard, I think that would be appropriate.
                        DR. WALLIS:  I hate to use the word
           "independent" that was used this morning, but wouldn't
           it be useful to have also some independent check on
           these things from -- I don't know where it would come
           from, but just between you and the licensees, I'm
           suggesting someone else, who was not so tied up in the
           process, might be able to contribute to improving or
           detecting --
                       CHAIRMAN APOSTOLAKIS:  At this point,
           having the licensees review them may be good enough.
                       DR. POWERS:  How can you possibly say
           that?  I don't think that would inspire public
           confidence if I were a member of the public.
                       CHAIRMAN APOSTOLAKIS:  Still, the program
           is under development.  The numbers are not being used
           in any real way by the Agency, right?  And they have
           30 SPAR models.  They plan to have, what, 70 more or
           40 more?  I mean it's really important at this point
           to make sure that their SPAR model for a particular
           plant is not off the mark.  So by having the licensee
           review it, you get that assurance.  But they are not
           really taking any regulatory actions yet, because
           eventually when they have the totality of the 70 SPAR
           models, then they will have to think about how to have
           maybe an independent review panel or somebody.
                       DR. POWERS:  Well, I can see this now. 
           You're going to bring a review panel and say, "Here,
           review 70 models."
                       CHAIRMAN APOSTOLAKIS:  So you plan to have
           established a review panel that will be reviewing them
           as they are produced?
                       DR. POWERS:  I don't know.  That sounds
           like a good idea to me.
                       CHAIRMAN APOSTOLAKIS:  And the other point
           is that let's not overexaggerate the value of these
           review panels.  I mean I can't imagine that they will
           do a review like Sandia did for Indian Point and Zion
           PRAs.  I mean these panels will probably look at the
           overall approach, how did you do common cause
           failures, how did you do data analysis.  But otherwise
           it's a huge job.  I mean it's huge now.  It's huge in
           the future.
                       DR. POWERS:  And it seems to me that the
           pattern for the review has been set by the Agency in
           the kind of reviews that are applied to the codes
           developed for severe accident analysis.
                       CHAIRMAN APOSTOLAKIS:  Can you elaborate
           on that?
                       DR. POWERS:  They do a fairly detailed
           review.  The panel actually exercised the models. 
           They have a process set out that begins with the
           intentions of the model.  They do a top-down, then a
           bottom-up examination.  They publish a report that has
           their complaints about the -- their comments on the
           models and includes the response from the model
           developers.
                       CHAIRMAN APOSTOLAKIS:  But here you're not
           talking about a model that will be used to produce
           results under different conditions.  Here you are
           talking about PRA for a particular unit.  So, you
           know, that approach will have to be adapted to this
           particular problem.
                        DR. POWERS:  Well, I think it has to be
           adapted in every case, but it seems like a pretty good
           approach
           to me.
                       DR. SHACK:  And it will be independently
           checked by the PRA that the licensee has.  I mean,
           obviously, I think if the licensee is getting
           different results, then you're certainly going to hear
           about it.
                       DR. POWERS:  What I would worry about is
           if you've overlooked some vulnerability in the SPAR
           model that the licensee has overlooked, and the item
           that comes promptly to mind is induced station
           blackout.
                       MR. MAYS:  I guess the question I would
           have is who would have that level of knowledge outside
           of us or the plants to be able to conduct that kind of
           an in-depth review?
                       DR. POWERS:  Another plant.
                       DR. WALLIS:  Maybe that's right.  Someone
           who's done it himself knows the ins and outs, knows
           the traps --
                       CHAIRMAN APOSTOLAKIS:  Do you think that
           will contribute to public confidence to tell them that
           San Onofre was reviewed by Diablo Canyon?
                       DR. POWERS:  Well, I think if I was to
           formulate a panel, I would probably draw from a cross
           section of the community; that is, I would look to
           somebody with experience from the nuclear industry
           with a similar type plant, someone from academia,
           maybe even a sophomore at Dartmouth.  Well, they seem to
           be very knowledgeable individuals.  And maybe somebody
           from the PRA specialist community.  Budd Boyack's not
           a bad choice.
                       CHAIRMAN APOSTOLAKIS:  I don't think that
           when you say that you appreciate the magnitude of this
           effort.  I'm not against a review, but just to say,
           "Have these models reviewed," I mean we can start by
           reviewing SAPHIRE, for heaven's sake, and apply what
           you said earlier about the severe accident codes to
           SAPHIRE, which is the basis for the models.  Let's do
           that first and then we can have a panel and so on. 
           And then go to the individual SPAR models and make
           sure that we have a practical approach.  That's all
           I'm saying.  I mean just to ask, "Have you reviewed
           your 30 SPAR models," it seems to me is a little bit
           too much.
                       MR. BARANOWSKY:  I'm Pat Baranowsky, Chief
           of the Operating Experience Risk Analysis Branch, and
           I'd be glad to meet with the Subcommittee or the full
           Committee regarding SPAR models and whether there's
           adequate review or not.  But I would like to point out
           that, as George Apostolakis said, these models have
           been evolving over a number of years, and there's a
           difference between SAPHIRE, which is the tool, if you
           will, and the model, which is the logic that reflects
           the way the plant's built.  And as Steve said, the
           logic has been modified and is currently being looked
           at closely on each one of these models.
                       The assumptions that go into the models,
           for instance, how do you model this sequence or that
           sequence and are they complete, are primarily based on
           the insights that we've derived from the IPEs and PRAs
           that are in existence and the accident sequence
           analysis work that we've done over the last 20 years. 
           They're not meant to be models that uncover new
           accident sequences that nobody ever heard of before
           due to unique design or operational characteristics at
           a plant that aren't manifested in operating
           experience.  That's supposed to be the purview of the
           licensee and other types of design and operational
           reviews.
                       So they have a different purpose, and that
           is to say if there's a new sequence and a contributor
           that is unknown, I don't know that we would use the
           SPAR approach to try and find that kind of thing.  It
           reflects what we understand today, our best
           understanding about what should be in risk models and
           a simplified version of them.
                       CHAIRMAN APOSTOLAKIS:  And even more
           significant question, I think, is the level of detail
           that goes into the SPAR models.  And I think the staff
           is still working on that in some sense anyway.  You
           started with very simple models.  Then you went to the
           next level.  And I think as you use them for your
           purposes here, you will probably realize that we may
           do a little more here, a little more there.  That's
           why I'm kind of reluctant to jump into expert panels
           and all that at this point.  Although for SAPHIRE, I
           really think we should have a review, because it's a
           model, it's a tool, it's been out there for years now. 
           It's the official PRA tool of the Agency.  I mean we
           should have a serious peer review, and I think it can
           be done for a tool but for 30 SPAR models --
                       DR. SHACK:  Well, I mean the question is
           if you're going to spend that kind of money, is this
           the way that you would spend it?  I mean there are
           lots of things to spend money on.
                       CHAIRMAN APOSTOLAKIS:  Exactly.  Exactly.
                       MR. MAYS:  Well, I know I kept control of
           the meeting at that point, so --
                       (Laughter.)
                       DR. WALLIS:  Well, let me suggest, George,
           though, that I mean I think Dana's raised an important
           issue.  It may not be the envisioned expert panel is
           the solution, but something to sort of ensure that the
           integrity and completeness of these things would be
           good.  And I don't know what should be done, but --
                       CHAIRMAN APOSTOLAKIS:  I did not object to
           the essence of their argument.  I just thought that it
           was a little bit too soon to do that for the
           individual SPAR models.  Let's do it for the tool
           first and then after you guys say, "Now we have the 70
           models and this is what we're using them for," then it
           seems to me some sort of a review, not necessarily --
                       DR. POWERS:  It seems to me you're begging
           to get into the situation of where we come back and
           say, "Well, these models really aren't what you really
           want, but since you've already built 70 of them, we
           might as well let you go ahead and do this."
                       MR. MAYS:  I think it's a little more --
           we may not have communicated as well, either through
           this document or through other briefings to you, the
           depth of what's going on with the SPAR model
           development and what's been happening over time.  We
           started out very early on in the Accident Sequence
            Precursor Program with just simple event trees, no
            fault trees, and failure probability numbers as an estimate
           of risk significance of events.  We moved to models
           that had more detail in them in terms of the event
           trees that were more up to date with our current
           understanding of success criteria, as PRA evolved
           through 1150 and other things.  And we have
            subsequently expanded the SPAR models in Rev 3
           down through fault trees to include support states, to
           include uncertainties explicitly in the analysis.
                       So we've made -- we had an outside panel
           in 1992, I believe, come in from all over the place,
           and George was a member of that working group in
            Annapolis where we asked people from industry and
           from academia and from the Agency, "What kind of
           models do we need?  What characteristics do they have
           to have?"  And so this SPAR model development has gone
           along that kind of a development path from the
           beginning.
                       And we also have internally to the Agency
           a SPAR models users group, which are the people who
           have to use risk understanding in doing their
           regulatory business, who are our users and our
           customers who say, "These are the features we need. 
           These are the characteristics it has to have.  We've
           set out a standard in that group for how should we go
           about reviewing these models."  So I think we may have
           a more substantial review process than is patently
           clear from this information.
                       And we agree that the models have to have
           a reasonable reflection of the risk characteristics of
           the plants for the purpose of what we're using here. 
            Our external reviewers, including the industry, have
           told us they want to get the SPAR models and have a
           review of them, and we agree with that.  And so I
           think we're on the same wavelength with respect to
           what needs to be done, and that is we should have SPAR
           models that are a reasonable representation of the
           plant.  How specifically we go about doing them, I
           would propose we save for another day.
                       CHAIRMAN APOSTOLAKIS:  At least until the
           ASME standard is approved.  In fact, I hope that your
           guys on that joint committee that's developing the
           standard know that the Agency's models group is
            subject to that standard.  That's always a good
           check.  So can we continue?
                       MR. BOYCE:  All right.  I'm on page 4. 
           The first bullet there talks about two Commission
           papers that NRR wrote that laid out the basis for our
           Revised Reactor Oversight Program in early 1999.  And
           as you heard yesterday, we used both performance
           indicators and inspection findings to take regulatory
           -- to have regulatory engagement with our licensees. 
           We ran a pilot program for six months in 1999, and we
           reported to the Commission the results of that pilot
           program in SECY-00-049.
                       And in the SECY paper, we said that while
           the future success of the Oversight Program was not
            predicated on the risk-based PI Program, we
           thought that risk-based PIs would potentially support
           a couple of areas.  And we said there are certain
           enhancements to our current oversight process where we
           thought risk-based PIs would help.  Those are actually
           articulated in the last bullet.  They're the
           reliability indicators, unavailability, shutdown and
           fire and containment indicators.  And we also thought
           that plant-specific performance indicators would be
           useful in the future.
                       In order to make this happen, NRR wrote a
           user need letter to --
                       CHAIRMAN APOSTOLAKIS:  Let me stop you
           right there, because that's something that has been
           bothering me for a long time.  If you read -- I don't
            know how anyone who reads Appendix F of the report
            the staff issued in January can say that we don't need
           plant-specific performance indicators.  And in fact
           the evidence there is so compelling that it seems to
           me that the current reactor oversight process, the
           revised process, risk-informed, has to immediately
           start looking again at the thresholds.
                       All I have to do is look at the tables
           that these ladies and gentlemen prepared, and I see
           things that if I use the industry variability curve as
           it is being used now, according to Appendix H of the
           revised oversight process, to get into the red for
           transient initiators, if I observe for three years,
           collect data for three years, I will need 646
           transients.  For loss of feedwater, to get into the
           red, I will need 355.  To get into yellow, I will need
           36.  I don't know which utility or whether this Agency
           would tolerate 36 losses of feedwater in three years
           before it said, "Oh, now you're in yellow.  We have to
           do something about it."
                       It's clear to me, and the mathematics
           shows it, that the thresholds we have now are no good. 
           They're too generic.  If I were running the reactor
           oversight process as it is now and I looked at this,
            I would make it my number one priority to revisit the
           thresholds.  Now, tell me why I'm wrong.
                       MR. BOYCE:  Well, it's important to
           remember that the purpose of the performance
           indicators is to help us establish the right threshold
           for regulatory engagement.  I mean they're not
           definitive unto themselves.  Just because you have a
           performance indicator does not mean that you should
           immediately shut down the plant.  It means you should
           do further investigation to look at the causes.
                       DR. POWERS:  It seems to me that he's
           asking the opposite question.  Would you really
           tolerate 36 losses of feedwater in three years and not
           engage the licensee?
                       MR. BOYCE:  Well, I mean, actually, I was
           trying to be supportive of the risk-based PI effort. 
           It sounds like you're suggesting that risk-based PIs
           did not give you the correct indication.
                       CHAIRMAN APOSTOLAKIS:  No.  I was going
           the other way.  The risk-based PIs are giving you an
           indication -- I don't know now; we have to review it
           more and so on -- but they are raising a flag that the
           thresholds you are using now are way off the mark,
           because they're generic.  And for losses of heat sink,
           just as another example, to go to white, which is the
           very first level of alert, right, I will have to have
           19.5 losses in three years.
                       DR. WALLIS:  Which heat sink is that?
                       CHAIRMAN APOSTOLAKIS:  What?
                       DR. WALLIS:  Which heat sink is that?
                       CHAIRMAN APOSTOLAKIS:  The ultimate heat
           sink.
                       PARTICIPANT:  Condenser heat sink.
                       MR. MAYS:  Let me help a little bit with
           that, George.  One of the realities of looking at this
           from a risk perspective is that there are certain
           elements, whether they be initiating events or whether
           they be reliability, availability of particular
           equipment, that have relatively lower risk importance,
           and therefore in order to get to the pre-determined
           thresholds that we have, you have to have a lot of
           events.
                       And the other thing that's important to
           recognize is that all of those thresholds in the
           current ROP, as well as the thresholds that were in
           the initial draft Phase I report that you had from us,
           were based on having one variable out of all the
           variables in the risk analysis change enough to get to
           that threshold, while everything else at the plant
           remained at its baseline performance.  And what you
           see is that for some elements the relative importance
           of that particular element is such that if everything
           else stays at baseline, you really have to change that
           a lot to equate to that level of performance.  That
           tells you something about relative risk.
                       It also tells you that since risk is
           really a multivariate function that you have a
           possibility of sometimes having thresholds that seem
            counterintuitive, because when you see the threshold,
           the thought that, "Oh, and everything else had to stay
           at the same value in order to reach the threshold,
           which was the basis for that calculation," isn't
           really obvious to people.
                       And I think it's pretty clear that I would
           expect that before you got to 16 losses of heat sink
           in three years or 15 or ten, that the kinds of
           conditions that would be necessary to make that happen
           would also manifest themselves in other areas of the
           plant performance.  And if we have a process that
           samples those other areas, you're going to see
           multiple areas starting to degrade, and that's what
           will get our attention rather than relying on the fact
           that loss of heat sink is the only thing that's going
           to change at this plant.
                       DR. POWERS:  So what you're saying is
           we've defined the parameter incorrectly.
                       MR. MAYS:  I'm saying the current reactor
           oversight process and the initial pieces that were in
            the risk-based performance indicator report were
           based on a concept which was we'll have a broad sample
           of performance, we'll see how bad each one of those
           individual pieces would have to get if everything else
           was nominal.  That was the basic philosophy.  And an
           implication of that philosophy is that the threshold
           set by that might seem counterintuitive because in
           real life there's more likelihood that you will see
           multiple things go wrong in that case than just one go
           really severely wrong.
                       CHAIRMAN APOSTOLAKIS:  No, but -- no, no,
           no.  I think Dana touched on the real issue here.  I
           think either what Dana said is right, we defined them
           wrong, or the criteria that were used to derive these
           numbers and in the reactor oversight process were not
           the same, and in fact they were not, because you are
           using CDF changes, whereas they are using the generic
           plant-to-plant variability curve for each event.
                       MR. MAYS:  Only for the green-to-white
           interface.
                       MR. BOYCE:  For the white-to-yellow we
           used limited SPAR models.
                       CHAIRMAN APOSTOLAKIS:  I know, I know, but
           for the green-to-white there was a difference.
                       MR. MAYS:  Correct.
                       CHAIRMAN APOSTOLAKIS:  And this other
           thing that you mentioned, I don't know.  I mean we are
           looking at individual indicators.  I don't remember
           anybody making a presentation here that we are looking
           at the combination.
                       DR. KRESS:  That's what the integrated
           performance indicator is supposed to do, isn't it?
                       MR. BOYCE:  Right.  That was in --
                       CHAIRMAN APOSTOLAKIS:  Yes, but the
           indicators themselves were developed on an individual
           basis.
                       DR. KRESS:  Yes.  But they're going to
           integrate.
                       CHAIRMAN APOSTOLAKIS:  Either these
           numbers make sense or they don't.  We can't produce
           different results under different studies and then
           say, "Well, but the other results were okay too."  It
           seems to me that you make a very good case in Appendix
           F that these things have to be plant-specific.  You
           say that clearly when it comes to unavailability.  The
            observed unavailability of diesel generators
            varied greatly across the industry, from 2.5 times ten to
            the minus four for BWR Plant 3 to 2.9 times ten to the
            minus two.
                       Similarly, for RCIC unavailability, and
            there you say, "A quick examination of data for other
           systems revealed similar variation among units. 
           Therefore, we decided that only site-specific data
           were appropriate for estimating the variability of
           outage data at the plant."  Now, if I read this and I
           was running the reactor oversight process, wouldn't I
           worry?  Wouldn't I say, "Am I doing the right thing?"
                       MR. BOYCE:  Yes.
                       MR. MAYS:  George, let me help a little
           bit on that too.
                       MR. BOYCE:  Yes, I agree with you.  In
           fact, that's what we said.  We thought that plant-
           specific PIs were the way to go.  I mean we said that.
                       CHAIRMAN APOSTOLAKIS:  And what I'm saying
           is there is a higher degree of urgency to this than
           just saying, "We'll wait until Mays is done and then
           take the results."
                       MR. BOYCE:  There are other problems.
                       CHAIRMAN APOSTOLAKIS:  Because this tells
           me that -- well, this gentleman has been trying to
           talk for a while now.
                       MR. HOUGHTON:  I'm sorry.  Tom Houghton,
           NEI.  Good afternoon.  I thought I heard you say --
           and perhaps I was wrong -- I thought I heard you say
           that the current program has a very high number of
           loss of heat removal scrams for the green/white
            indicator.  The indicator is two.  Two in three years
            is all you can have for the green/white.
                       CHAIRMAN APOSTOLAKIS:  I understand.
                       MR. HOUGHTON:  It's not a higher number
           than that.
                       CHAIRMAN APOSTOLAKIS:  No, no, no.  The
           numbers are quoted from here.  I didn't mean that.
                       MR. HOUGHTON:  Okay.
                       CHAIRMAN APOSTOLAKIS:  But what I'm saying
           -- let me emphasize what I'm saying.  I'm not prepared
           to claim that the numbers we're using now are no good. 
           No, actually I am.
                       (Laughter.)
                       But the numbers we're using now -- no, no. 
           I think I should rephrase this.  I'm not prepared to
           say that.  What I'm saying is that there is sufficient
           evidence from the analysis that is presented in
           Appendix F of this report to convince me that we
           really need plant-specific indicators, plant-specific
           thresholds, and that we should make a much more
           careful study, do much more careful study of the
           observation time and the actual thresholds, of course,
           using methods similar to Appendix F to make sure that
           we have covered these uncertainties, which are
           aleatory and epistemic because now you are really
           dealing with the real world, and increased public
           confidence or at least my confidence that what we're
           doing is really rational.
                       So I guess the reason I -- I guess -- I
           don't guess.  The reason why I'm raising this is
           because I think it's of a certain urgency for the
           existing revised reactor oversight process.  It's not
           something that can wait until you guys are done.  You
           guys means research.
                       MR. MAYS:  There are two issues that kind
           of got woven up here together.  One of them had to do
           with the fact that you can have some fairly high
           numbers for certain -- to get to certain thresholds,
           notably yellow and red, that seem to be
           counterintuitive because the idea is if you had
           anywhere near that number of events, something else
           would have -- we would have been doing them.  And I
           agree that that's a separate thing, and it has to do
           with the nature of having single variate analysis in
           a multivariate picture and the relative risk
           importance.
                       The second point you made was about the
           plant-specific nature.  Now, one of the things we had
           in the discussion here on verification and validation
           is we went back and looked at how the plant-specific
           data and information we had would compare with the
           similar kinds of indicators and information in the
           current reactor oversight process.  So to give you a
           little more -- maybe a little feeling of a little more
           ease, we didn't find substantial differences in the
            overall assessment of things between the risk-based
           performance indicators and the Reactor Oversight
           Program.
                       CHAIRMAN APOSTOLAKIS:  And why do you say
           that?
                       MR. MAYS:  There will be -- in the
           verification and validation section we talk about
           that.
                       CHAIRMAN APOSTOLAKIS:  Chapter 5 of the
           main report.
                       MR. MAYS:  What we did find was that there
            were differences.  Sometimes they were -- the risk-
           based performance indicators indicated that
           performance was worse than indicated in a similar
           version of the reactor oversight process, and
           sometimes they indicated they were better.  And when
           we get to the section where we discuss the alternate
           ways of looking at RBPIs, in light of the comment of
           how many we had, you'll see that the more integrated
           approach that we took in this alternative section
           helps to address both of those issues.
                       CHAIRMAN APOSTOLAKIS:  You know what the
           integrated approach is?  Take the PRA of the plant,
           look at the initiating event number they have, look at
           the unavailability of the system, because they have
           already done it.  And say, "For this number I don't
           want this deviation, I don't want that deviation," and
           then you have the integrated view.  You don't have to
           do anything; the PRA had done it for you.
                       DR. KRESS:  Your threshold would be delta
           CDF --
                       CHAIRMAN APOSTOLAKIS:  Exactly.
                       MR. MAYS:  That's exactly what we --
                       CHAIRMAN APOSTOLAKIS:  And also it will be
           --
                       MR. MAYS:  That's exactly what we did in
           the alternate approach here, George, is we used the
           entire model, and depending on whether we were looking
           at the cornerstone level or whether we were looking at
           a functional level on systems or a response to --
                       CHAIRMAN APOSTOLAKIS:  That's not what you
           do in Appendix F.
                       MR. MAYS:  That's not in Appendix F. 
           That's the alternative stuff that we presented at the
           Subcommittee last month.  The stuff we're going to
           present --
                       CHAIRMAN APOSTOLAKIS:  The ultimate result
           of all this is take the PRA, which comes back to my
           favorite subject of objectives.  See, as you read the
           -- I'll ask you questions.  I'm just talking to NRR
           because they are here, but your turn will come.
                       But the objectives, the objectives are
           extremely important, because you're playing there with
           prior distributions.  Appendix F was written by a
           statistician, I think.  He says, "Well, this number
           doesn't make sense, so I'll use another prior." 
           You'll use another prior because the numbers don't
           make sense?  Perhaps you should be shot first.
                       MR. MAYS:  Well, actually, that's not what
           we did, but that's --
                       CHAIRMAN APOSTOLAKIS:  That's what it
           says.  I can only go by what it says.
                       MR. MAYS:  Well, actually, that's a
           different characterization than I would put on it.
                       CHAIRMAN APOSTOLAKIS:  My point is plant-
           specific PRA, plant-specific thresholds make much more
           sense than anything else, and it's your work to date
           -- I appreciate your valiant efforts to defend your
           colleagues -- but your work to date makes that urgent,
           in my view.
                       DR. KRESS:  And why have thresholds on
           individual performance indicators?
                       CHAIRMAN APOSTOLAKIS:  Well, they went to
           trains, which is very good.  We'll come to that if we
           ever come to that.  I mean they did some good stuff
           there.
                       DR. KRESS:  That would help, but why not
           integrate it all at once?
                       CHAIRMAN APOSTOLAKIS:  At some point.
                       DR. KRESS:  It says suppose you're using
           the PRA and plant-specific.
                       CHAIRMAN APOSTOLAKIS:  Well, there are two
           competing --
                       DR. KRESS:  Call Bob Christie and say,
           "Let's say the performance indicator on delta CDF."
                       PARTICIPANT:  Is Christie here?
                       CHAIRMAN APOSTOLAKIS:  No, no, no, no, no. 
           There are two competing --
                       MR. MAYS:  He was, but he got scared and
           left.
                       (Laughter.)
                       CHAIRMAN APOSTOLAKIS:  -- elements here: 
           One is to be as high as you can, as you say, to go the
           Christie way, and the other counter argument is that
           you want something you can observe.  So you have to go
           -- that pulls you down, the other thing pulls you up,
           and you have to --
                       DR. KRESS:  No, no.  But you're observing
           the things that go into the PRA to make the delta CDF
           calculation.
                       CHAIRMAN APOSTOLAKIS:  Yes.  And that's
           what these guys are doing.  And then they come back
           and they tell you --
                       DR. KRESS:  Yes, but don't put the
           threshold on those, because they're determined, just
           like he said, as if all of them say the same except
           that one.  Just look at all of them and integrate the
           total change and see the effect on delta CDF and put
           a threshold there, rather than have individual colors
           for each PI.
                       CHAIRMAN APOSTOLAKIS:  In an ideal world,
           that's the way it should be done.  You are asking the
           Agency to take a gigantic step away from micromanaging
           all the way out, and they will never do that.  So
           let's hope that they will go to the trains that these
           guys are offering now, and then maybe later --
           because, remember, we're going to discuss option two
           a little later.
                       MR. BOYCE:  I'm on page 5 now.
                       (Laughter.)
                       Actually, I mean, we're challenged as to
           why we just don't do it immediately.  And that giant
           step forward is, I mean, really what we're facing
           here.  And we think that there's certain key
           implementation issues that need to be looked at before
           we go and take that giant leap forward or if we take
           that giant leap forward.
            And the ones that we've already discussed are
           data quality and availability, SPAR model development,
           and V&V.  The V&V that I'm referring to was -- it's
           not enough that we developed the SPAR models, we need
            some way to gain -- what we were looking at was
            acceptance by the licensees and the public -- that the
           SPAR models were going to give you a reasonable
           answer.  And we weren't saying a perfect answer, that
           we modeled all possible events and all possible
           scenarios; we were just saying a reasonable answer
           with which we could regulate.  So I think we had
           identified these issues.  They're in Section 5 of the
           Phase I report.  And I won't go into more of that.
                       I did want to make one more comment on
           data quality and availability.  The reliability data
           is coming from a database that is called EPIX.  It's
           run -- that database, I think, is collected by INPO,
           and it's the successor to NPRDS.  And it was in
            response to an AEOD initiative for a reliability data
           rulemaking, and industry said they would stand up EPIX
           and populate it in lieu of that data rule.  And that
            was in about the 1997 time frame.
                       And industry has in fact followed through
           on that effort, but it's still a voluntary initiative. 
           We don't have a requirement.  There's no rulemaking
           that says anybody needs to submit data.  Even the
           current reactor oversight process is still voluntary
           submission of data.  And we haven't taken a close look
           at the EPIX database to say that there is 100 percent
           participation in submission of data.  We haven't said
           that there is consistency in terms of submission of
           that data.  And we haven't done verification of that
           data.
                       CHAIRMAN APOSTOLAKIS:  Where would you get
           your data?  The current process, where does it get its
           data?
                       MR. BOYCE:  The reactor oversight process
           is submitted directly from licensees to the NRC on a
           voluntary basis.
                       CHAIRMAN APOSTOLAKIS:  And why can't I do
           that with risked-based performance indicators? 
           Remember, I am not advocating generic numbers, so I
           don't need to have assurance of the whole of industry
           in submitting data.  I will do it on a plant-specific
           basis.
                       MR. BOYCE:  It does go back to acceptance. 
           I mean industry -- we worked very closely with
           industry in order to get where we are today on the
           current reactor oversight process.  Industry has
           already publicly stated that if we add -- I think
           we're looking at an additional 30 performance
           indicators, that they may not accept that on a
           voluntary basis, because it's a huge additional
           burden, and it opens up the potential that if you have
           more performance indicators, you'll have more
            opportunities to cross thresholds, you'll get more
           regulatory attention.  And they want to understand is
           it really warranted?  And we've heard that -- I think
           you heard that at the Subcommittee meeting, and we've
           heard that at public meetings.
                       And so we are working through these sorts
           of issues, and that's implementation.  And it's got to
           be acceptable to all parties in order for this to work
           correctly.  They own the data, they need to help with
           the models and make sure they're right, and it's got
           to be a cooperative effort.
                       CHAIRMAN APOSTOLAKIS:  Okay.  Now, again,
           for me that's a non-issue, and let me tell you why. 
           This is a plant-specific issue, and this Agency has
           already done similar things on a plant-specific basis. 
           But the Maintenance Rule, I didn't hear anybody
           complain about data at that time.  You asked the
           licensee, "Tell us what the threshold should be," and
           there is a rule out there, and we're using it.  Why
           can't we do the same for the oversight process?  "Mr.
           Licensee, tell us in the integrated model, for
           initiating for this and that, what would be the
           thresholds?"  And, of course, we look at them, we
           study them, we create an Appendix F, blah, blah, and
           then eventually we agree.  We've done it for the
           Maintenance Rule.  What's so difficult with this?
                       MR. BOYCE:  I guess you need to weigh the
            costs and benefits.  When we go to OMB, we need to
            justify that the benefits would exceed the costs.
                       CHAIRMAN APOSTOLAKIS:  Okay.
                       MR. BOYCE:  I mean that's one bureaucratic
           hurdle.
                       CHAIRMAN APOSTOLAKIS:  I understand that,
           but at the same time this is hailed as a -- the
           revised process is hailed as the major regulatory
           change of the last 20 years.  But I don't want to
            belabor the point too much.
                       There is one other major issue that I
           think has not been addressed, neither by this project
           nor by the revised oversight process.  And because it
           has not been addressed, we see a lot of problems here
           and reaction from NEI.  It seems to me that somebody
           should study the tradeoffs between using a performance
           indicator and baseline inspection.  The way we appear
           to be handling this is we are looking at the
           performance indicators.  Now these guys come up with
           a total of 30 or so.  The industry says immediately,
           "Wait a minute now.  How many are we going to have?" 
           Because the industry doesn't see on the same piece of
           paper we're going to have these indicators, and we
           will relax the Baseline Inspection Program in these
           areas, because these areas are covered by the
           indicators.  As long as you don't see that tradeoff,
           you will have these objections all the time.
                       So it seems to me that's a high-level
           issue of equal importance as the previous one, but I
           think both, maybe this project and most importantly
           the people who run the oversight process, they should
           address, because otherwise we'll have this perennial
            problem.  We have one transient indicator.  Now you want
            to make it four, I think, or some three or four. 
           Why?  What kind of tradeoff is that?  You're just
           increasing the burden.
                       MR. BOYCE:  I think philosophically we
           agree with you.  We would like to say that our revised
           reactor oversight process was in fact a significant
           step in that direction.  When we took a look at going
           from our Core Inspection Program to our Baseline
           Inspection Program, we did exactly that sort of
           approach, conceptually.  We took the best data that we
           had available at the time, and we said this is the
           sort of PIs that we can get insights on a specific
           area of plant performance, and we don't need to do
           additional inspection in that area.  I think you know
           that -- I mean that effort was limited, but we're
           pragmatists here.  We're getting to that point, and we
           can't expect perfection on the first try.
                       The risk-based PI report, as you also
           know, laid out a systematic approach to here are the
           accident sequences, here's the data you can collect
           for performance indicators, here's the data you can
           collect on an industry-wide level, and here's the gaps
           that could be covered by inspection.  And you brought
           that up at the Subcommittee.  We think that sort of
           approach has got merit.  We would like to see the
           effort move to be more mature and gain greater
           acceptance before we say, "Okay, let's charge
           forward."
                       But in the meantime we have done a
           separate effort where we're taking the significant
           risk insights from various studies such as the
           Initiating Events study, and research has provided
           that to our inspectors and is providing those sorts of
           insights to the Inspection Program Branch, and we're
           attempting to incorporate those significant insights
           into our current inspection procedures.  It's not
           perfect, but at least it's a step in the sort of
           direction that you're alluding to.
                       CHAIRMAN APOSTOLAKIS:  Now, Steve, I
           understand it's not part of your charge to look at
           these tradeoffs.  You're just looking at the
           feasibility of having certain indicators, right?
                       MR. MAYS:  That's correct.  We were
           looking at what could be technically feasible using,
           basically, off-the-shelf and readily available models,
           tools, and data.  And I think we should point out that
           the Reactor Oversight Program has, as an integral part
           of it, a change process where proposals to change the
           indicators and the reactor oversight process can go
           through.  And that process involves meetings with
           internal and external stakeholders, understanding of
           what the implications of the information is, and an
           opportunity to look at what the potential costs or
           benefits are as part of the reactor oversight process
           change process.  We've only gone through a couple of
           different things in the oversight process from that
           standpoint, but I do think we have a mechanism for
           doing that.
                       So I believe what we raised in the report
           was based on our understanding of the models, methods,
           and data and where this would potentially fit in the
           oversight process.  We said these are what we think
           are the key implementation issues.  And from our
           discussions with internal and external stakeholders,
           we've got pretty good agreement that those really are
           the issues and that the process for dealing with those
           issues is through the ROP change process.
                       CHAIRMAN APOSTOLAKIS:  Maybe there is a
           process, but I think the process does not emphasize
           enough that within the process we are doing these --
           we are making these tradeoffs between baseline
           inspection and performance indicator in a systematic
           way.  Because otherwise, if everything is so good, why
           is industry complaining that you are trying here to
           increase the burden?  Surely, they must know what the
           process is all about.
                       But I think we're running out of time
           here, so can you tell us what the real message you
           want to send us is by summarizing your --
                       MR. BOYCE:  I think that NRR is cautiously
           supportive of the Risk-Based PI Program.  We would
           like to try and engage industry further to resolve
           their comments on burden using the technical merits of
           this product and perhaps taking a look at our
           inspection practices to see if there's some solution
           to those.  And we'd like to try and keep moving
           forward with this effort.  We've endorsed it in a user
           need letter, and we'd like to see the results.
                       I think that right now the comment period
           on the Phase I report expires on the 14th of May, and
           we're going to take a look at the comments that we get
           and try and deal with them.  And I think the schedule
           for issuing this Phase I report is November time
           frame.  So we hope to address some of those issues
           between now and then.
                       MR. LEITCH:  I have a question about the
           unplanned power change indicator that's in the ROP
           now.  And my question is not so much about the
           definition, and I understand that may be up for
           reconsideration, the precise definition of that.  But
           that kind of information, unplanned power change,
           seems to me to be a valuable indicator, and I
           understand that it doesn't really have any linkage to
           risk.  In other words, the risk-based -- that kind of
           an indicator would not be in a Risk-Based PI Program.
                       And my question, basically, is if we go to
           risk-based, is the thought that we have to be all
           risk-based?  In other words, would an indicator such
           as that necessarily fall by the wayside?
                       MR. BOYCE:  That goes back to the earlier
           question I think we had on the thresholds for certain
           of the indicators and why we have particular
           indicators.  When you get to the pragmatics of
           regulating, you end up doing some things that are not,
           say, fully consistent with risk techniques, like the
           scram indicator.  Scrams you can tolerate, I don't
           know, 25 on a plant before you get past ten to the
           minus six CDF.  And yet we have found by comparing the
           scram indicator to what used to be our definition of
           problems plants -- the watch list and near-watch list
           type of plants -- there was a fairly good correlation
           between plants that had a high number of scrams and
           plants we thought were problem plants.
                       And so in terms of regulatory engagement,
           we found the scram indicator to be a very useful
           indicator.  So I can't prejudge a decision as to where
           we would be, but we think we would probably continue
           that scram indicator for that reason.  And we think
           that risk-based PIs could be an enhancement to our
           current set of indicators, perhaps replacements for
           many, but we would retain certain ones because they
            offer other insights beyond pure risk.
                       MR. LEITCH:  And the power changes made
           could very well be one of those?
                       MR. BOYCE:  It could be.  I don't want to
           get ahead of the problem, but it could be.  All right. 
           I'll turn it over to Steve on page 6.
                       MR. MAYS:  In light of the fact that we
           now have about 40 minutes left for the section we
           expected to take between five and ten minutes in the
           initial phase, I think we may need to address an
           abbreviated version even of what we have here.  If
           it's suitable to you, George, I would like to skip
           down to the sections that you asked at the
           Subcommittee that we specifically go to, which means
           I will skip over the information about the potential
           benefits and our development process.  And I want to
           go first to the table, which is on your page 8 and
           give you a flavor of what we had from the draft Phase
           I report and then move into the specifics of what we
           had in those areas you asked us to spend more time on.
                       This table shows what's in the existing
           Reactor Oversight Program as PIs and what areas
           through our development and work we've determined as
            proposed risk-based performance indicators.  We went
           over in greater detail the derivation of these in the
           Phase I report with the Subcommittee, so we've only
           put a summary of what that information is here.  This
           shows that the RBPIs cover more and often different
           aspects of the impacts of performance on plant-
           specific risk.  And we'll show you some more specific
           results and calculations in the V&V discussion. 
           You'll note that there are a couple of asterisks on
           this chart that indicate potential performance
            indicators for which we didn't have all the models,
            data, or capability to put together PIs right now,
            although we think they might be something we could do
            in the future.
                       CHAIRMAN APOSTOLAKIS:  Now what you're
           saying with this table, Steve, the way I understand
            from the discussion so far, is that, yes, the Risk-
            Based Performance Indicators Program identifies more
           potential indicators for mitigating systems, for
           example.  But you are not necessarily advocating that
           these be adopted.  You are saying these are feasible. 
           And it's another decision whether, you know, we want
           to use all of them, what to do with the baseline
           inspection, and so on.
                       MR. MAYS:  That's correct.
                       CHAIRMAN APOSTOLAKIS:  That's the way I
           see it.
                       MR. MAYS:  That's correct.
                       CHAIRMAN APOSTOLAKIS:  Okay.
                       MR. MAYS:  A notable thing also that we
           want to bring your attention to is we had been asked
           by NRR in their user need letter, as I mentioned
           before, to see what we could do to come up with
           indicators for shutdown, fire, and containment areas. 
           And we're going to talk about what we came up with for
           shutdown.  We were unable to produce performance
           indicators for fire and containment because of either
           lack of models or lack of available data.
                       We have three things we need to develop a
            risk-based performance indicator for potential use. 
            The first one is a model that reasonably reflects the
           risk, and the key word there is reasonable; not
           perfect but reasonable.  The second one is we have to
           have baseline performance data to put into the model
           so that we can vary that through sensitivity analysis
           to see where the threshold should be set.  And the
           third thing we need is an ongoing source of data to
           compare that performance to the thresholds.
                       In the case of fire and containments, we
           were lacking in both models and data.  In the case of
           shutdown, we were able to find models and a baseline
           performance and information to potentially use the
           PIs.  But also in the shutdown, we're not currently
           gathering the data right now, but it's something we
           believe is potentially able to be done relatively
           easily.  So we've gone ahead with the shutdown
           performance indicators to discuss those.
                       CHAIRMAN APOSTOLAKIS:  So you're not
           necessarily saying that a shutdown PRA is better than
           the fire PRA.
                       MR. MAYS:  Correct.  I'm not saying that.
                       CHAIRMAN APOSTOLAKIS:  I think you have a
           question coming from somewhere there, no?
                       DR. POWERS:  Can I ask a question about
            your Mark I containment spray?
                       MR. MAYS:  Sure.
                       DR. POWERS:  Correct me if I'm wrong, but
            I believe that containment spray of Mark I is
           connected also to the low-pressure injection system.
                       MR. MAYS:  That's correct.
                       DR. POWERS:  And most of the Mark I
            containments have blanked out the containment spray;
           it's non-operational.
                       MR. MAYS:  I'm not --
                       DR. POWERS:  It requires a manual change
           to make it active.
                       MR. MAYS:  Not that I'm aware of.
                       MR. HAMZEHEE:  I don't think we noticed
           that in our work.
                       DR. POWERS:  I could be wrong about that.
                       MR. MAYS:  Not that I'm aware of.
                       DR. POWERS:  I don't think I'm wrong but
           I could be.
                       MR. MAYS:  I believe they're manually
           initiated, but I don't believe they're -- I don't
           think they have an automatic set point where they come
           on, but I believe that they are still capable and
           functional in the systems.
                       MR. LEITCH:  They're operated from the
           control room.  It requires manual actuation from the
           control room.
                       MR. MAYS:  In the area of shutdown for
           performance indicators, the Subcommittee asked us to
           spend a little time on that.  The process we used here
            is a slightly different approach from what we did with
           the other types of indicators that you've seen, either
           in the ROP or in the other parts of the RBPI report in
           that this indicator is more a measure of the impact of
           configurations during a small period of time, the
           outage, as opposed to an accumulation of performance
           data over time, such as the reliability of a pump or
           the frequency of an event that you would track over
           time and history and be able to trend.
            This has been linked more towards an SDP
           type analysis of conditions than the standard
           classical indicator definition, and we recognize that
           that's the case.
                       Let's go to the next page here.  The key
           in this process was the acknowledgment that there are
           certain necessary combinations of decay heat, reactor
           coolant system inventory, and equipment availability
           the utility must go through in order to conduct a
           refueling outage.  So we wanted to be able to take
           into account that that was something that was a
           necessary part of operations.  It had some risk
           associated with it.  And if we were going to make
           performance indicators associated with shutdown
           operations, we had to allow that particular portion of
           the risk to be there without penalty.
                       So the baseline risk was taken into
           consideration.  We looked at shutdown PRAs.  We looked
           at information about plants and how long they were
           spending in various conditions in shutdown.  And we
           took that indication in the baseline information
           that's on these tables for BWR and PWR.
                       Then we looked and said how much time
           would somebody spend in categories of high, low,
           medium or early reduced-inventory vented conditions
           that would result in accumulation of risk in addition
           to that baseline.  And we set the thresholds according
           to that to be consistent with the ROP thresholds of
           ten to the minus four, ten to the minus five, and ten
           to the minus six delta CDF associated with being in
           performance areas outside the norm.
                       DR. KRESS:  If the containment is
           compromised during that same period, why should you
           use those same deltas as your criteria?  Shouldn't you
           have a more stringent delta?
                       MR. MAYS:  The issue of containment was
           one where our problem is model availability to be able
            to assess what the risk implications are and to set
            thresholds with respect to that.  We're basically
           going off of core damage frequency here, because
           that's what we have the readily available models to
           do.
                       DR. KRESS:  But I would have thought you
           might have gone a little more severe in the thresholds
           for those.
                       MR. MAYS:  The problem we faced there was
           --
                       DR. KRESS:  Maybe five or ten.
                       MR. MAYS:  What?
                       DR. KRESS:  Maybe five or ten.
                       MR. MAYS:  Maybe.  The problem there is it
           was, again, what factor do you use and what's your
           basis for saying that that particular factor has an
           implication to public risk.  And we just were not
           capable of doing that in this particular analysis.  I
           don't disagree, because we said in the report that
           having containment models for both at-power and
           shutdown conditions would give us the ability to
           determine what the impacts were on those, which we're
           not able to do now.
                       So what we have here is baseline
           information.  And then on the next two slides what we
           have is examples of configurations associated with
           specific times, decay heats, and RCS conditions that
           a plant might be in during a shutdown outage.
                       CHAIRMAN APOSTOLAKIS:  So your indicator
           here is the time the plant spends in that state?
                       MR. MAYS:  That's correct.  So what we
           would do, for example, is you'll have examples on this
           table where if you have a diesel generator out under
           a certain set of conditions, the table will tell you
           whether that's a low, medium, high or a nothing in
           terms of how much you need to accumulate.  So you
           would accumulate all the time you spent in those
           conditions under the low, add them all up and see if
           that exceeded the threshold.  You do a similar thing
           for medium, a similar thing for high.
                       Now, there is one special case we have
           here, which is called the early reduced-inventory
           vented condition, which in order to do shutdowns
           plants are often having to go into mid-loop, install
           nozzle dams, do other kinds of things to conduct their
           outages.  Early on in the regulatory business, there
           was a shutdown rulemaking effort that was underway. 
           There was an agreement made that there would be a
           process by which the industry would put together a set
            of standards dealing with how they would conduct
           outages under those conditions.
                       So this indicator that we've proposed here
           recognizes that condition and says, "If you are
           conducting early reduced-inventory vented conditions
            in accordance with the, I believe it was, NUMARC 91-06
           guidance for shutdown configuration control, that we
           would set our thresholds assuming that you had those
           configurations met.  If you're not in those
           configurations in accordance with that document, you
           would automatically transfer into the high category
           under this scheme, which is a more severe and more
           limiting setup.
                       So we're trying to give appropriate credit
           for the baseline of what you have to do to get into a
           shutdown and refueling.  And then indicate if you've
           done performance issues that exceed that, what their
           potential risk significance is.  And so we also have
           another slide here which gives the BWR corresponding
           conditions for that.
                       CHAIRMAN APOSTOLAKIS:  Now these times are
           the cumulative times over a period.
                       MR. MAYS:  The cumulative times over the
            refueling outage.  So, for example, if you're in plant
            operating state 4, hot shutdown with the RCS boundary
            intact, and you had a diesel generator out of service
           for a certain time, that would be a low in this chart. 
           So you'd add up that time.  And any other low times
           that you were in during that outage would all be
           counted together, and you compare that to the
           thresholds on the previous page to see whether you had
           exceeded the threshold or not.  And if you're in an
           area of operation where it's a blank cell, you can be
           in that as much time as you want.
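           A minimal sketch, in Python, of the time-accumulation scheme described above.  The category table and the threshold hours are hypothetical placeholders for illustration only, not values from the draft RBPI report.

           # Minimal sketch of accumulating outage time by risk category and
           # comparing each category's total against its threshold.
           # All entries and thresholds below are invented for illustration.

           # (plant operating state, equipment out of service) -> risk category
           CATEGORY_TABLE = {
               ("POS-4", "EDG"): "low",      # hypothetical entry
               ("mid-loop", "EDG"): "high",  # hypothetical entry
               # a missing key is a blank cell: no time accumulates in any category
           }

           THRESHOLD_HOURS = {"low": 200.0, "medium": 40.0, "high": 1.0}  # illustrative only

           def assess_outage(intervals):
               """intervals: (plant_state, equipment_out, hours) tuples over one refueling outage."""
               accumulated = {"low": 0.0, "medium": 0.0, "high": 0.0}
               for state, equipment, hours in intervals:
                   category = CATEGORY_TABLE.get((state, equipment))
                   if category is not None:            # blank cell -> unlimited time allowed
                       accumulated[category] += hours   # sum every interval in that category
               return {cat: (total, total > THRESHOLD_HOURS[cat])
                       for cat, total in accumulated.items()}

           print(assess_outage([("POS-4", "EDG", 30.0),
                                ("mid-loop", "EDG", 0.5),
                                ("POS-4", "EDG", 15.0)]))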
                       And, again, the industry commented during
           our public meeting that we had the week after the ACRS
           Subcommittee that they believed this tool was probably
           more appropriate to use as a significance
           determination process type of tool rather than a
           performance indicator type tool.  In other words, you
           would use this tool to determine after the fact, if a
           plant was in a certain outage condition, whether that
           outage condition was really important or not.
                       CHAIRMAN APOSTOLAKIS:  But it seems to me
           that in order to go to the SDP, some sort of deviation
           from something has to be observed.  What is that
           something in this case?  If you don't have an
           indicator and a threshold, why would you even enter
           the SDP?
                        MR. MAYS:  Well, the issue there would be if somebody had discovered as part of their outage, for example, that they had had equipment out of service, like two diesel generators, when they weren't planning on it originally.  And you would go
           back into something like this process and say, "Well,
           what was the risk associated with being in that
           condition for however long you were in it?  What were
           the RCS conditions and the decay heat conditions when
           you were in that?"  And you would make an assessment
           based on this kind of an approach.
                       CHAIRMAN APOSTOLAKIS:  I guess I don't
           understand how you would decide to make the
           assessment.  Don't you have to deviate from something?
                       MR. MAYS:  I agree you do --
                       CHAIRMAN APOSTOLAKIS:  The SDP says -- I
           mean the examples we heard yesterday were they forgot
           to do a test.  They're supposed to do a test; they
           didn't do it.
                       MR. MAYS:  Yes.  Right.
                       CHAIRMAN APOSTOLAKIS:  So that's sort of
           a violation of some sort.
                       MR. MAYS:  Right.
                       CHAIRMAN APOSTOLAKIS:  So now you enter
           the SDP or in another instance what did they do? 
           There was something else.  But if it's something
           they're supposed to do and they didn't do it, then I
           go to SDP.  If I don't have an indicator here, what is
           that something that will make me go to the SDP?
                       MR. MAYS:  I'm not aware of what that
           would be.
                       MR. BARANOWSKY:  This is Pat Baranowsky
           again.  Are you done, Dr. Kress?  Am I interrupting?
                       DR. KRESS:  Go ahead.
                        MR. BARANOWSKY:  What I was going to say is that, remember, the industry is committed to following certain guidelines during shutdown.  And one of the things we do in inspections is verify that they've followed those things.  So as part of the
           inspection they might verify that they were operating
           in accordance with those guidelines, which could then
           be fed into this model, if you will, to assess the
           findings associated with that.
                       CHAIRMAN APOSTOLAKIS:  So the point is
           that one way or another you have to have some sort of
           --
                       MR. BARANOWSKY:  Yes.  There has to be a
           way to get in there, but I believe there is a way. 
           Maybe Tom Houghton could help me.
                       MR. HOUGHTON:  Yes.  You have to have a
           performance issue, meaning either you have some event
           or occurrence or you have some violation or behavior
           which is viewed as suspicious in some way.  If you're not following a procedure, you're committing a violation, and the significance of that violation could be assessed using this process.  Is
           that helpful to you?
                       CHAIRMAN APOSTOLAKIS:  That could be,
           could be.
                       MR. HOUGHTON:  Yes.  I mean if you viewed
           -- if you looked and there was a tech spec violation
           in terms of having RHR capability, you could use this
           process to determine what the risk impact of that was
           and put it in perspective.
                       CHAIRMAN APOSTOLAKIS:  It seems to me,
           though, the industry should be arguing the other way,
           because this already allows you some, quote,
           "violation" without anything happening.  Now you're
           saying, no, I will have the procedures.  If I deviate
           a little bit, I will have to go through the whole
           process, which doesn't make sense to me.  Because this
           already has built into it what's allowed.  So I don't
           have to do anything else.
                       MR. HOUGHTON:  Well, I think it's a little
           different than what's allowed, because, for instance,
           there's not a limit on mid-loop operation.  However,
           if you look at the thresholds built into this, one
           might find oneself crossing a threshold when you're
           doing the perfectly right thing, which is if you're
           having a problem, not to rush through to keep the
           hours under two hours.  And in fact the most difficult
           time -- the most risky time is going in or coming out
           of the mid-loop.  So here I am.  I'm approaching, the
           clock's ticking off.  I've about reached the
           threshold.  Two more hours I go into a yellow
           threshold when I really should be stopping all work
           and saying, "Let's find out what's wrong.  Let's plan
           and do it correctly."  So that's part of the concern
           about the --
                       CHAIRMAN APOSTOLAKIS:  You are really
           discouraging people from doing the cautious thing.
                       MR. HOUGHTON:  It may or may not, and we
           need to look at that more carefully.
                       CHAIRMAN APOSTOLAKIS:  No, that's a very
           valid point, I think.  Which brings me to my other
           favorite topic.  This implies that what really
           controls the risk here is the time of -- the duration. 
           I don't like that.  Because that means that no matter
           how high the risk is during that time, as long as the
           exposure is short, we're okay.
                       DR. KRESS:  Well, wait a minute, George.
                       CHAIRMAN APOSTOLAKIS:  I know.
                       (Laughter.)
                       DR. KRESS:  And you're a PRA guy.
                       CHAIRMAN APOSTOLAKIS:  I know.  You and I
           have disagreed about this in the past.  I don't see
           why we can't disagree today.
                       DR. KRESS:  Yes, okay.
                       DR. POWERS:  Well, I guess the point I
           would appreciate a little advice on is the summation
           of hours.  I mean if I enter a medium configuration
           for two hours and then I come out of it, go along and
           I find I have to go back in to it, why should I sum
           that previous two hours?  I escaped scot-free there. 
           Why shouldn't it be the continuous period that I'm in
           there that gets evaluated?
                       MR. MAYS:  Well, I think the answer is
           that this -- if you're in for two hours and you come
           out and you go back in for two hours, we're measuring
           the accumulation of risk that you've incurred over
           this outage.  So what we're doing is saying over this
           outage the accumulation of risk you have incurred by
           being in these states which have relatively high risk
           significance is what we want to know.  We don't want
           to -- you know, the idea then, if you --
                       DR. POWERS:  I think I understand what
           you're doing.
                        MR. MAYS:  If you didn't have that philosophy, then you could be in the high risk state for up to one hour before you get to the threshold, back out, come back in for a few minutes, go back up to it again, and you would just never trip it.  And, in effect, you would have been there the whole time.
                       DR. KRESS:  Well, what bothers me about
           that is -- I think it's a reasonable thing, but what
           bothers me is how do you add high and low and medium
           together?
                        MR. MAYS:  Well, that's the thing we haven't done here.
                       DR. KRESS:  Yes, I know.  By the same
           concept, it has to be done some way.  So that's the
           one that bothers me about it.
                       CHAIRMAN APOSTOLAKIS:  Why do you have to
           add high and low?
                       DR. KRESS:  Because they represent the
           cumulative risk.
                       MR. MAYS:  That would be the cumulative
           impact of the entire thing.
                       DR. KRESS:  You can't just add the times.
                       CHAIRMAN APOSTOLAKIS:  That's what I'm
           saying, but why would you have to add them?
                       DR. SHACK:  Well, he says he's interested
           --
                       DR. KRESS:  He's accumulating risk. 
           You've got to accumulate off of this.
                       CHAIRMAN APOSTOLAKIS:  But I thought it
           was cumulative for each category.
                       MR. MAYS:  It is cumulative for each
           category.
                       CHAIRMAN APOSTOLAKIS:  Not in total.
                       DR. KRESS:  Yes, but that doesn't make
           sense.
                       MR. MAYS:  Dr. Kress is raising the
           question as if I am in the white for my low and the
           white for my medium and the white for my high, what's
           the net total effect?
                       DR. KRESS:  Are you not in red overall?
                       MR. MAYS:  And I haven't gone to the
           further step of accumulating that all together,
           although that could be done.
                       CHAIRMAN APOSTOLAKIS:  Isn't that an issue
           for the Action Matrix?
                       MR. MAYS:  That's the way we set it up to
           do it here, but that's another thing where we could,
           as we're doing the alternate approach, we could
           potentially accumulate them all together as well.
                        MR. HAMZEHEE:  We have the same thing for at-power situations.  We don't have a cumulative impact measurement right now except the Action Matrix. 
           So you have the same situation.
                       DR. KRESS:  Yes, you would, absolutely.
                        CHAIRMAN APOSTOLAKIS:  When in doubt, give it to the Panel.
                        MR. MAYS:  Okay.  The next thing we wanted to talk about was the work associated with how much risk coverage we have with these RBPIs and what verification and validation we've done.  What I want to do here is indicate that we have gone back and looked at this from two different standpoints:  one from kind of a Fussell-Vesely importance approach, one from a risk achievement worth kind of approach.
                        And I'd like to put up the next slide, which shows one of the comments that was made earlier about how you use risk-based performance indicators versus risk-informed baseline inspection.  So what we did, and what's in the report for all 23 plants that were in the Phase I report, is we went back and went through the IPE database, which was compiled after all the IPEs were put together, as to what the dominant sequences were at the various plants.
                        And what we have in this graphic display is a box around all of the areas that are part of the dominant sequences in the IPE database where we have either a risk-based performance indicator, industry trending information, or an initiating event indicator.
                       And what you can see fairly quickly just
           from looking at this is there aren't very many
           dominant sequences for which we don't have some
           multiple way of looking at what the performance of the
           plant has been with respect to dominant sequences.
                        The other thing that it also tells you is that the areas in the dominant sequences for which we don't have either a mitigating system indicator, an initiating event indicator, or an industry trend are areas that we should be covering in a risk-informed baseline program.
                        So to answer your earlier question, although it's not on this particular chart, you could potentially go into this and say, "Okay.  If I've got these things covered by indicators, what are the things I should have in my Risk-Informed Baseline Inspection Program?"  I think one of the valuable things that this particular program has done is to make clearer from a risk perspective what those particular areas should be.
                        CHAIRMAN APOSTOLAKIS:  And I still have the issue, though, that you raised, which is, is it really fair -- maybe you didn't put it in the same words -- but is it really fair or reasonable to take, say, the first box there, TRX, okay, number 8, sequence number -- no, eight is no good.  Tell me what the sequence means.  I can start with a TRX and then I have the HPCI?  Is that what that means?
                       MR. MAYS:  This was a sequence where you
           had a transient and you had failure of the automatic
           depressurization --
                       CHAIRMAN APOSTOLAKIS:  Oh, okay.
                       MR. MAYS:  -- and failure of DC power.
                       CHAIRMAN APOSTOLAKIS:  And for these I can
           or cannot have --
                        MR. MAYS:  I don't have risk-based performance indicators for those.  So those will be areas that should be covered for that particular function through the Risk-Informed Inspection Program.
                       CHAIRMAN APOSTOLAKIS:  The baseline.
                       MR. MAYS:  Right.
                        CHAIRMAN APOSTOLAKIS:  So you are addressing that issue now here, the tradeoffs.  Very good.  But let's look at number 23 where I have the same transient, but now you're telling us with the boxes that I can have indicators for the two mitigating systems, right, RCIC and HPCI.
                       MR. MAYS:  That's correct.  And the reason
           this one was --
                       CHAIRMAN APOSTOLAKIS:  Now, wait a minute. 
           Let me finish my thought.
                       MR. MAYS:  Okay.
                       CHAIRMAN APOSTOLAKIS:  So now when I set
           my indicator here, my thresholds, I should take into
           account, I think, in some way the fact -- I mean is it
           reasonable to set the threshold in such a way that TRX
           alone, its frequency, should trigger a ten to the
           minus five or six change in CDF?  I thought you were
           arguing earlier that doesn't make sense.  You
           shouldn't do it one at a time.
                       MR. MAYS:  What we did was we had -- when
           we, in the Phase I report, looked at, for example, the
           HPCI train reliability, we said if the HPCI train
           reliability changes and everything else stays the same
           for all the sequences, what would be the change in CDF
           associated with that?  So it wasn't just associated
           with TRX; it was associated with all the sequences for
           which HPCI would be affected.  However, it assumes
           that RCIC, the transient frequency, the LOCA
           frequency, the diesel generator reliability are all at
           their nominal values.
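           A rough sketch of the single-parameter delta-CDF calculation just described:  change the HPCI unreliability, hold every other input at its nominal value, and re-quantify all affected sequences.  The two-sequence cutset model and every number in it are invented for illustration; they do not represent any plant's PRA.

           from math import prod

           # Toy model: each sequence frequency is an initiator frequency times
           # basic-event probabilities.  All values are invented.
           NOMINAL = {"TRANS": 1.0, "LOOP": 0.05,                 # initiator frequencies (/yr)
                      "HPCI": 3e-3, "RCIC": 3e-3, "EDG": 5e-3}    # failure probabilities
           SEQUENCES = [("TRANS", "HPCI", "RCIC"),   # transient with loss of high-pressure injection
                        ("LOOP", "EDG", "HPCI")]     # loss of off-site power sequence

           def cdf(params):
               return sum(prod(params[term] for term in seq) for seq in SEQUENCES)

           degraded = dict(NOMINAL, HPCI=6e-3)   # HPCI unreliability doubles; all else nominal
           print("delta CDF from the HPCI change:", cdf(degraded) - cdf(NOMINAL))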
                       CHAIRMAN APOSTOLAKIS:  So I thought you
           meant something else then.  But I think using the
           plant-specific PRA, I can work with these things and
           define the indicators at an appropriate level so that
           I take advantage of these indications that I have now. 
           I'm not prepared myself now to tell you how to do
           that, but I think that's a good thought.
                       In other words, on the one hand, as we
           said earlier, the PIs should be as high as possible on
           the PRA where the CDF is at the top, okay?  And I will
           try to do that as much as I can with the sequence.  On
           the other hand, I have this issue of having to observe
           some data, which pulls me down, okay?
                       DR. KRESS:  You're going to have a really
           tough time there, George, because what these PIs are
           is a sample --
                       CHAIRMAN APOSTOLAKIS:  That's correct.
                       DR. KRESS:  -- of things that are part of
           the PRA.  And you're sampling a limited -- it's a
           limited sample, and you're going to look at the
           degradation of all of them, and some of them may have
           improved, actually.  But what you're going to try to
           do is now infer from that what the total plant change
           has been on all the things that affect the PRA
           results.  That algorithm doesn't exist, and that's the
           problem right here.  And I don't think you can set
           individual thresholds on these things without that
           algorithm, and that's my problem.
                       CHAIRMAN APOSTOLAKIS:  Okay.  But maybe
           there is another way out of this.
                        DR. KRESS:  The other way out of it is to use the Bob Christie approach -- he is here now.
                       CHAIRMAN APOSTOLAKIS:  No.  Christie is
           only one element of this.
                       DR. KRESS:  And set the threshold on delta
           CDF itself.
                       CHAIRMAN APOSTOLAKIS:  No, no, no.  But
           there is another way of doing this.  You remember that
           this Committee has asked the staff to explain how the
           Action Matrix was developed and what does it mean --
           why two reds make this and one yellow and one white.
                       DR. KRESS:  Yes.  That impacts --
                       CHAIRMAN APOSTOLAKIS:  We can use this
           table now --
                       DR. KRESS:  That impacts on that.
                       CHAIRMAN APOSTOLAKIS:  -- to scrutinize
           the Action Matrix --
                       DR. KRESS:  You're right.  That would --
                       CHAIRMAN APOSTOLAKIS:  -- rather than
           worrying about the thresholds for individual events,
           which have the problems we mentioned.
                        DR. KRESS:  But once again, in order to do that, you have to have this missing algorithm that I talked about, one that gives the total effect on the whole PRA due to the changes, which are variable and going in different directions; you have to have some sort of algorithm to convert that.
                       CHAIRMAN APOSTOLAKIS:  I think Steve and
           his colleagues can do some sensitivity studies for us
           --
                       DR. KRESS:  They might, they might.
                       CHAIRMAN APOSTOLAKIS:  -- by taking tables
           like this --
                       DR. KRESS:  They haven't done this yet.
                       CHAIRMAN APOSTOLAKIS:  Well, because they
           are overwhelmed, but they can do it.  They can do it. 
           They can do these calculations, and you never know. 
           Maybe you'll find that two whites usually lead to the
           same changes --
                        DR. KRESS:  It's not just a matter of doing some sensitivity calculations.  It's a missing algorithm that's a correlation.  It's a correlational algorithm between these things that's missing.  It's not just a matter of doing some calculations.
                       DR. POWERS:  But, Tom, it's not missing. 
           It's maybe implausible to create?
                       DR. KRESS:  Pardon?
                       DR. POWERS:  It may be impossible to
           create.
                       DR. KRESS:  Maybe.  That's my point.
                       CHAIRMAN APOSTOLAKIS:  It could be.  It
           could be.
                       DR. KRESS:  And so you have to do
           something in its stead.  And I don't know what that
           something is, but you have to make some reasonable
           assumptions or reasonable approximations that are
           maybe bounding or maybe a little more conservative
           than you might want.
                        CHAIRMAN APOSTOLAKIS:  But I can take it the other way.  What if I were doing something negative with it?  If I take Table 4-2(a) and pick two whites, or a white and a yellow, from sequence 5 and sequence 20, and I calculate the delta CDF, assuming everything else is the same, and I find it's X.  Then I take another white and another yellow and I find that the new delta CDF is 20 times X.  Then I know I have a problem with the Action Matrix.
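           A sketch of the consistency check being proposed here, using the same kind of toy cutset model as the earlier sketch.  The parameter changes standing in for "white" or "yellow" threshold crossings, and all the numbers, are assumptions made only for illustration.

           from math import prod

           # Same toy model as before; all values invented for illustration.
           NOMINAL = {"TRANS": 1.0, "LOOP": 0.05, "HPCI": 3e-3, "RCIC": 3e-3, "EDG": 5e-3}
           SEQUENCES = [("TRANS", "HPCI", "RCIC"), ("LOOP", "EDG", "HPCI")]

           def cdf(params):
               return sum(prod(params[term] for term in seq) for seq in SEQUENCES)

           # Two hypothetical combinations, each standing in for one white plus one
           # yellow crossing drawn from different sequences.
           pair_1 = dict(NOMINAL, HPCI=6e-3, RCIC=4.5e-3)
           pair_2 = dict(NOMINAL, EDG=1e-2, HPCI=6e-3)

           x1 = cdf(pair_1) - cdf(NOMINAL)
           x2 = cdf(pair_2) - cdf(NOMINAL)
           print("delta CDF, pair 1:", x1, " pair 2:", x2, " ratio:", x2 / x1)
           # If one combination's delta CDF came out, say, 20 times the other's, the
           # Action Matrix would be treating very different risk increments alike.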
                       DR. KRESS:  Well, that's something --
                       CHAIRMAN APOSTOLAKIS:  That's a negative.
                       DR. KRESS:  That doesn't tell you how to
           deal with it.
                       CHAIRMAN APOSTOLAKIS:  No.  But it tells
           me I --
                       DR. KRESS:  It tells you you have a
           problem.
                       CHAIRMAN APOSTOLAKIS:  Which I don't know
           right now.
                       DR. KRESS:  But I already know you have a
           problem.
                       CHAIRMAN APOSTOLAKIS:  I don't know that
           I have a problem, because these guys come in here and
           say it's a professional judgment; this makes sense. 
           But this will be definite proof that you have a
           problem.
                       DR. KRESS:  Well, that would be
           worthwhile.
                       CHAIRMAN APOSTOLAKIS:  And then Steve will
           come back and justify it.
                       DR. KRESS:  Then I would say, "I told you
           so."
                       MR. MAYS:  As soon as you sign the check,
           George.
                       (Laughter.)
                       CHAIRMAN APOSTOLAKIS:  Now, Steve, we're
           really running out of time, and I trust that you can
           summarize your presentation.  Still got the letter?
                       MR. MAYS:  Yes, we do.  Let me go to move
           down to -- I will go with two things.
                       CHAIRMAN APOSTOLAKIS:  This is a wonderful
           table, by the way.
                       MR. MAYS:  Thank you.
                       DR. KRESS:  Yes, that's a good table.
                        CHAIRMAN APOSTOLAKIS:  It really is.  See, again, I can't resist this.  Why didn't we do all this work before we revised the oversight process?
                        MR. MAYS:  Actually, we were putting together a program, as you're aware.  From the beginning, we came down in 1995 and spoke to the ACRS about our plan for risk-based analysis of reactor operating experience, and we laid out a matrix at that time that said here's the stuff we're trying to get data on, on a plant-specific basis and across systems and components, to say this is the information we would use to be able to understand the risk implications of operating experience.  So we've been working on this since the 1994-1995 time frame to get the basic methods, models, data, and information together to be able to do this kind of thing.
                        Now, the Reactor Oversight Program development and the crisis that came about in the summer of 1998, I guess, helped to provide an impetus for doing an oversight program that was more along the lines of what we were working on here.  And we're just continuing to try to push that envelope a little bit more as we get more data, more capability, and more information.
                        Because, remember, the thing we're trying to do here is go for progress, not perfection.  We don't want to end up in the old source term problem, where you have a source term that was generated from a need, and then subsequently you might have 20 years of research to get a more technically capable and competent understanding of the source term, but you couldn't change it until you got one that everybody thought was more nearly perfect.  So we're trying to avoid that problem here.  We recognize that there are places where this doesn't do everything you might ever want to do.  But we believe it's --
                       CHAIRMAN APOSTOLAKIS:  You're not implying
           that the Committee does not appreciate the distinction
           between progress and perfection.
                       MR. MAYS:  No, I'm just saying that we
           have to make sure we keep that in mind as we go
           forward.
                       CHAIRMAN APOSTOLAKIS:  The Committee does
           keep that in mind, just as the Committee understands
           what engineering approximation is.
                       MR. MAYS:  Let's go to -- I want to go to
           the alternate approach.
                       CHAIRMAN APOSTOLAKIS:  I think you should
           highlight some of the good stuff you have here and
           tell us what you are trying to do.
                       MR. MAYS:  Okay.  I want to go to the
           alternate approach thing, because we've bumped into it
           a few times, and I want to talk about that a little
           bit.
                        One of the things we got a lot of comments on was the excessive number of and increase in the PIs implied by potentially adopting these.  And the major limitation that drove us to the number of PIs that we've done was a philosophy that says that you are going to set thresholds on the basis of where you were collecting data.  That's the way it had been done in the past.  That's the way it was in the reactor oversight process.  And we were making our first attempt at risk-based performance indicators using that.
                        What we subsequently decided to do was to go back and re-look at that and see if we could come up with a different concept that would reduce the overall number of indicators but still keep the fidelity toward risk that we had in the RBPI process on a plant-specific basis.
                       So what we did was we said -- let's go to
           this Figure 1 now.  If we break core damage frequency
           down into two major groups, the initiators and the
           mitigation, you can subsequently break those down into
           some general categories, such as transients, LOCAs,
           and special initiators for the initiating events.  And
           you can break mitigation systems down, generally, into
           functions like reactivity control, heat removal, feed
           and bleed, recirculation, which are the kind of
           general terms people talk about when they make
           functional event trees or talk about risk assessments.
                        So let's go to Figure 3 now.  What we did was we said let's reevaluate the concept a little bit.  So what we would do is we'd take the same inputs that we were having for individual risk-based performance indicators in the Phase I report, and we said let's put them into a more complicated, higher-level functional model, and then compare the sequence changes in core damage frequency that we would get by exercising that model.
                       So we did that.  This is work we've done
           since the Phase I draft report was published.  And
           what we came up with was three potential hierarchies
           that we could do these indicators for.  One of them
           was at the cornerstone level.  So we would say -- we
           would have one indicator for initiating events and
           mitigating systems that would represent the overall
           impact of all of the changes for the data that we were
           gathering for the individual indicators.  So we would
           have an indicator that said for mitigating systems,
           whatever the changes were in reliability, whatever the
           changes were in availability for all the systems, we'd
           integrate them together through the risk model and say
           what was the net change in core damage frequency
           associated with that performance.
                       Now, the advantage there is we now have
           the integration you were talking about earlier.  Maybe
           one system's unavailability went up, and maybe another
           one went down.  Maybe certain performances went
           differently.  But we would now have an integrated
           approach to doing that.  And we would have an
           indicator at the cornerstone level of the reactor
           oversight process.
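           A minimal sketch of the cornerstone-level integration just described:  run all the observed reliability and availability changes through one model and color the net change in core damage frequency.  The toy model, the observed values, and the color bands are assumptions for illustration only, not thresholds from the report.

           from math import prod

           # Toy stand-in for a plant-specific risk model; all values invented.
           NOMINAL = {"TRANS": 1.0, "LOOP": 0.05, "HPCI": 3e-3, "RCIC": 3e-3, "EDG": 5e-3}
           SEQUENCES = [("TRANS", "HPCI", "RCIC"), ("LOOP", "EDG", "HPCI")]

           def cdf(params):
               return sum(prod(params[term] for term in seq) for seq in SEQUENCES)

           def color(delta_cdf):
               # Illustrative color bands on the net change in CDF (per year).
               if delta_cdf < 1e-6:
                   return "green"
               if delta_cdf < 1e-5:
                   return "white"
               if delta_cdf < 1e-4:
                   return "yellow"
               return "red"

           # Observed performance for the period: HPCI and the diesels a bit worse,
           # RCIC better.  The integrated indicator sees only the net effect.
           observed = dict(NOMINAL, HPCI=4.5e-3, RCIC=1.5e-3, EDG=6e-3)

           net = cdf(observed) - cdf(NOMINAL)
           print("net delta CDF:", net, "-> cornerstone indicator:", color(net))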
                       CHAIRMAN APOSTOLAKIS:  And, again, an
           alternative to that is not to worry so much about the
           indicator, where to put the indicator, to keep the
           indicators at the lower level, but have the Action
           Matrix take care of these things.  In other words, as
           you enter the Action Matrix, if you have a change in
           a mitigating function that I can measure its impact on
           the CDF, then I react differently than if I had just
           something else.  So there is a combination there that
           it's not just where I put the indicators.
                        MR. MAYS:  The other thing we looked at, using the same approach, was putting together functional-level indicators, and we chose two groups to try this out on.  One group was by initiators.  So we would, for example, say for transient initiators, what is the impact of all the different variations in the mitigating systems on those sequences associated with transients.  And then we'd have another indicator for those sequences associated with loss of off-site power, and another indicator for those sequences associated with loss of coolant accidents.  So we found that we could go back through the models and put indicators in place where we would have three to five indicators per plant that were more rolled up and more integrated at a functional level, although less integrated than at the cornerstone level.
                        And then the last level was down at the component or train level, where we had already done work on the risk-based performance indicators.  And at the Subcommittee, one of the things that was brought up was, "Well, why don't you just do them at all these levels?  Why don't you have maybe an official indicator at the cornerstone level and functional or component level indicators, so that you can understand, when you have non-green performance, what specifically it was about it that was non-green and what was actually causing that condition?"  That's a potential possibility that we could do.
                        We were looking for some advice on that concept, based on the stuff we showed to the Subcommittee -- it's in the package here as well -- as to whether you thought that was something we ought to pursue and put in the Phase I report, or something we ought to take more time on and maybe put in Phase II of the RBPI development.  So we're looking for some input on that.
                       But we think we have models that if we
           take this data, we can evaluate risk performance on a
           plant-specific basis at whichever level we choose to. 
           I think that's the thing you should be taking away
           from this.  And the question of what's the right level
           to do is something that would have to be negotiated
           with the industry, the other external stakeholders,
           and the public to say what makes the most sense as an
           improvement on the existing reactor oversight process. 
           And there's pluses and minuses for each of them, which
           we discuss in some of the other slides here.
                       So having done that, we also had a meeting
           -- I want to go back to this industry one -- we also
           had a meeting with the public the week after we had
           the Subcommittee meeting, and this is a summary of
           some of the issues that they thought were important. 
           I think we also presented the alternate approach at
           that meeting, and these are the issues that the
           industry folks raised during that meeting.  I think
           these are primarily issues associated with how do we
           implement this stuff and what are the implications to
           plants in terms of the regulatory responses if we were
           to implement a process like this.
                        I think the reactor oversight process change process is supposed to be the place where we evaluate and make assessments on that.  The thing I'm looking for from the ACRS, in terms of a letter, is this:  we want to know what your feelings are about whether this looks like a potential benefit to the reactor oversight process that should be pursued.
                       DR. POWERS:  Steve, let me ask you this
           question:  Suppose we did something like this and
           suppose I'm a member of the public and I say, "Gee,
           how do these guys have three fire barrier penetration
           seals out of commission?  It sounds pretty hazardous
           to me, but they tell me it's a green finding."  That's
           what you will tell me that it's a green finding.  And
           I say, "I wonder how in the world did they arrive at
           that conclusion that it was a green finding?"  Am I
           going to be able to figure out how you got to that
           being a green or am I going to have to take that on
           faith?
                       MR. MAYS:  I think with respect to what
           we've done with risked-based performance indicators,
           we will have the capability out there in the public
           domain for somebody to duplicate our analysis and our
           work.  I mean not every single member of the public
           will be able to do that, but I mean we'll have the
           information out there so that people will be able to
           do that.
                       The case you're representing would be from
           the significance determination process.  I'm not as
           familiar with the specifics of the fire SDP, but my
           understanding is the logic and the framework for if
           you have this condition, we characterize it this way
           and that causes a result to come out would be
           available and open to the public.
                       But I think what you're raising is a
           larger question.  And the larger question is, how, as
           an agency, do we communicate risk importance to the
           public and in what context do we do that?  That is a
           significant challenge that we faced for a long time. 
           I agree it's something that we can improve on.
                       I'm not sure what exactly that form should
           be, but I agree we're going to have problems in that
           area with any new oversight process that we've come up
           with.  And people need to be able to have some sense
           of feeling of what does the green mean, what does the
           white mean, and how do I know what the implication of
           that is to me?  Now I believe the Oversight Program
           tried to do that in SECY-99-007 and in the NUREG that
           they issued, which was the summary of that, but I'm
           just not in a position to really say much more than
           that.
                       DR. POWERS:  It's a very thorny problem,
           and I choose fire protection, because fire is one of
           those hazards that nuclear power plants face that's
           very palpable to any individual.  I mean you just know
           fire is a bad idea, and you kind of know what it's
           going to do.  And so when you see failures in the fire
           protection system, some of those are very familiar. 
           They're unlike pressurizers or high-pressure injection
           systems.  You have many of them in your own house or
           your own business that you work at.
                       And so you see failures of these things. 
           You say, "Gee, that ought to be significant.  I would
           do something about that in my own business if I saw
           those fire penetration seals failing."  And it's not
           that the licensee is not doing something about it.  It's that the regulator doesn't feel like he needs to do anything about it, because he finds it a green finding.  But that's not very easily communicated to an individual who has been cautioned to worry about nuclear power plants.
                       I mean it came up today in the meeting
           with the Commission.  I think it's an area that we
           can't continue to say, "Gee, that's a problem we're
           going to have to address one of these days."  We've
           got to address it.  And it seems like you have the
           vehicle for doing it.
                       CHAIRMAN APOSTOLAKIS:  I just don't think
           that's a problem.
                       MR. MAYS:  Well, I think we have a --
                        CHAIRMAN APOSTOLAKIS:  Why don't you tell the public what green means:  look, this is a major industrial facility.  It has 40,000 components; it has 800 people working on it.  Little things happen here and there.  By design and regulations and so on we have allowed for these, and in this particular case our analysis shows that it has an insignificant impact.  What's wrong with that?
                        MR. MAYS:  Well, I think you're touching on one way one might go about doing that, and I think my interpretation of Dana's question is, do we have an agency process for making sure that that kind of communication takes place in a consistent way, so that people have an understanding of that?  That's the age-old question that Chauncey Starr raised years ago in his paper on perceptions of risk.
                        And I think Dana's correct; every person in their house can say, "Oh, I know fire's a bad thing."  I had a fire in my kitchen once.  But nobody understands what the issue is with the availability of the high-pressure core spray pump, because they don't have any high-pressure core spray pumps.  Maybe they can make an analogy to the sump pump in their house or something, I don't know.
                       But I think risk communication is an
           important feature that we have to be able to do as an
           agency in order to meet our strategic goals for public
           confidence.  I just don't think --
                       CHAIRMAN APOSTOLAKIS:  The reactor
           oversight process, I thought it was very good.  Have
           you guys seen this?
                       MR. MAYS:  Yes.
                       CHAIRMAN APOSTOLAKIS:  No, I know you
           have.  Have you gentlemen seen it?
                       MR. MAYS:  Thanks a lot, George.
                       (Laughter.)
                       MR. BOYCE:  If I could use that as a segue
           a minute.  We have to face this issue in the reactor
           oversight process today, how do you communicate SDP
           results in a coherent manner that's understandable? 
           And what you have to look at -- it primarily comes
           down to the web page really.
                       DR. POWERS:  Even to very technically
           sophisticated people, how do you communicate the SDP
           results?
                       MR. BOYCE:  That's exactly right.  And we
           have that problem.  And the primary vehicle, actually,
           turns out to be the web page for everybody.  And
           everybody includes intervenor groups, casual members
           of the public who are browsing from America Online,
           licensees, staff members --
                       CHAIRMAN APOSTOLAKIS:  Do you have any
           data that showed you -- give you some idea of how many
           members of the public actually do this?
                       MR. BOYCE:  Actually, yes.  If you go onto
           -- in fact, you can access it from the internal web
           yourself.  If you go onto NRC's home page, there's a
           spot there that says, "Web Statistics."  And it will
           tell you -- it's actually pretty good.  It's a
           contractor program --
                       CHAIRMAN APOSTOLAKIS:  What does it tell
           you?
                       MR. BOYCE:  -- that collects data on I
           guess it's the domain names that have accessed the
           pages, the entrance page, the exit page, the number of
           hits on a page, and that sort of thing.
                       CHAIRMAN APOSTOLAKIS:  But that doesn't
           tell you that these people were public.
                       MR. BOYCE:  Well, what you end up doing is
           you find out that they come from aol.com, and you find
           out they come from nrc.gov, and you find out that they
           come from dot-org.  And, so you can get a rough idea
           of the usage.
                       CHAIRMAN APOSTOLAKIS:  Oh, you know that. 
           Okay.  So there are some data.
                       MR. BOYCE:  Yes, from the domain names.
                       CHAIRMAN APOSTOLAKIS:  So there is a
           significant number of hits from --
                       MR. BOYCE:  From the members of the
           general public.
                       CHAIRMAN APOSTOLAKIS:  -- a basis where we
           might suspect there is public involved?
                       MR. BOYCE:  Well, yes.  As a matter of
           fact, one of the -- it's interesting that whenever we
           issue a press release, the number of hits spikes on
           our web pages.
                       CHAIRMAN APOSTOLAKIS:  Whenever you do
           what?
                       MR. BOYCE:  Whenever we issue a press
           release.
                       CHAIRMAN APOSTOLAKIS:  Is that right?
                       MR. BOYCE:  The number of hits spikes. 
           And it comes from places like America Online.  The
           geographical --
                       CHAIRMAN APOSTOLAKIS:  But it could be
           inside NRC?
                       MR. BOYCE:  It may very well could be.
                       CHAIRMAN APOSTOLAKIS:  I mean those guys
           are professionals.  I don't count them as public.
                        DR. WALLIS:  Well, the press release is attractive, because it might be understandable.  I think a hit doesn't mean that the person who hit understood what he read.
                       MR. BOYCE:  Correct.
                       DR. WALLIS:  That's the problem I think
           you have.
                        MR. BOYCE:  Correct.  And trying to bring it back to where we are, the web page is our primary vehicle for communication right now.  And what we have
           tried to put on it is this colorized scheme to make it
           easier to understand.  And we put all our inspection
           reports by cornerstone on the web page so that you
           start off with a color, and if you have a white color
           or yellow color, you can click on the color and you
           get down to the next level of detail.  The next level
           of detail would be perhaps an NRC assessment letter
           saying, "We've reviewed your performance over the
           previous year, and this is our assessment."  If you
           want to know about a specific topic, like an
           inspection finding, you click on that color.  It will
           take you down to the inspection report, which talks
           about the NRC's view of that.
                       We're getting to the point where we're
           putting our, what we call, SDP letters on the web so
           that all the information and how we characterize it
           will be there.  I'm not going to tell you it's
           perfect, but it's what we're doing today.  We've
           gotten additional -- we had a public communication
           session as part of our lessons learned workshop at the
           end of March, and we got a lot of feedback that we
           needed to do better in this regard.
                       So we're at the forefront telling you what
           we're doing.  We can't solve the world's problems, but
           here we are.
                       CHAIRMAN APOSTOLAKIS:  Steve, is there
           anything else that you think you should point out to
           the Committee?
                       MR. MAYS:  No, I think the key thing that
           I want you to come away with is that we have the
           ability, using readily available data and models, to
           be able to estimate plant-specific performance impacts
           on risk in several areas that are broader, more
           comprehensive, and can be integrated, using the
           alternate approaches we're proposing here, to give us
           indication of performance at various levels.  And if
           this is something the Committee thinks we should go
           forward with, we would appreciate hearing about it. 
           If there are aspects of how we're doing it you'd like
           us to do different, we'd like to hear about that too.
                        I think realistically it's going to take a considerable amount of time to meet with the external folks and go through the process, because this is going to be a primarily voluntary process.  And we're going to have to show people what we have, and examine the stuff in a bigger picture than just what the technical stuff is, including what the implications of it will mean to them.  But that's specifically what the Reactor Oversight Process Change Program and procedure is designed to do.
                       CHAIRMAN APOSTOLAKIS:  You said that your
           so-called alternative approaches are described where?
                        MR. MAYS:  They're only in the presentation we made to you at the Subcommittee and in the stuff that's in this particular package.  They are not in the report, because we got these comments after the draft report was put out, and we wanted to be proactive rather than just sitting on our hands until the comments came in saying, "That's too many PIs."  We said, "Well, what other things, since we know that issue, can we go work on now?"  And what we've done is we said, "There are some things that we could do that can solve some of the problems we've had in other areas."
                        Because one of the things we found, for example, in the ROP comparisons, when we did the integrated look, was that sometimes we would have, on an individual PI basis, a green and a white.  And when you get to the integrated indicator, it comes out green, because the green had improved so much, and was on the same sequence as the white, that it basically counteracted it.  And on the other hand, we found cases where we had green and green indicators, and you put them in the integrated indicator and they come out white, because they were both green but both getting worse at the same time.  So even though neither one crossed an individual threshold, together they would have crossed the threshold.  I think that's an important -- from my risk perspective, that's an important piece of information to have.
                       CHAIRMAN APOSTOLAKIS:  It's very
           important.
                        MR. MAYS:  And we also had -- in the ROP comparison stuff, we had examples where the ROP would indicate one color and we would see worse, and other cases where the ROP would say worse and we would see green.  And we were able to go back and look at each one of those specific cases, from the standpoint of what's making this true, and with that face validity test, which we used in the slides, we were able to come to a reasonable conclusion from a risk perspective of why that really was true.
                       For example, we were using a plant-
           specific threshold instead of a generic threshold. 
           For example, we weren't averaging diverse trains; we
           were using individual trains.  So those were all the
           kinds of things we found that I think tell me, anyway,
           we can do a better job of understanding risk
           performance with this process than the current ROP. 
           And, again, progress not perfection.  That's not
           saying it's broken and dead and is wrong.  We're
           saying what we have here is potentially better.
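           A small sketch of the per-train point made above:  pooling demand data across diverse trains can hide a degraded train that individual-train estimates would flag.  The failure counts, demand counts, and threshold are all invented for illustration.

           # (failures, demands) per train; numbers invented for illustration.
           trains = {"train A": (1, 50), "train B": (7, 50)}
           THRESHOLD = 0.10   # hypothetical unreliability threshold

           pooled_failures = sum(f for f, d in trains.values())
           pooled_demands = sum(d for f, d in trains.values())
           pooled = pooled_failures / pooled_demands
           print("pooled estimate:", pooled, "exceeds threshold:", pooled > THRESHOLD)

           for name, (failures, demands) in trains.items():
               estimate = failures / demands
               print(name, "estimate:", estimate, "exceeds threshold:", estimate > THRESHOLD)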
                        CHAIRMAN APOSTOLAKIS:  What you're saying -- I'll give you an example for me to understand it better.  A particular indicator of the plant may formally be yellow, but because the utility is aware of it and they're doing something else better, the overall impact may be zero, right?
                       MR. MAYS:  Well, the impact may be white,
           it may be green, it may be still yellow, I don't know. 
           What I'm saying is without an integrated model you
           can't tell.
                       CHAIRMAN APOSTOLAKIS:  And you have the
           tools to investigate.
                       MR. MAYS:  I think we have the tools to
           investigate that.
                        CHAIRMAN APOSTOLAKIS:  Speaking of tools, Steve, do you also have tools to test the hypothesis that if human performance and the safety culture of the plant deteriorate, then we will see the impact in declining equipment performance?
                       MR. MAYS:  We have the tools to determine
           when we see degradations in the performance of the
           equipment, whether or not the factors causing that
           were related to Corrective Action Program or other
           things.  We don't have tools to directly measure
           Corrective Action Program and then posit what the risk
           impact would be.  So if you were to look at public
           risk and make yourself a hierarchy, here's public
           risk, and then somewhere below public risk is core
           damage risk, and somewhere below that is system or
           train level performance, and somewhere below that is
           component performance.  I think what you see is that
           the safety culture is somewhere below that in terms of
           being how leading you want to get from public risk
           down to the least level of detail that you might be
           able to do.
                       I don't have metrics to link safety
           culture measures that --
                       CHAIRMAN APOSTOLAKIS:  But do you have
           tools?
                       MR. MAYS:  I have tools to be able to see
           when I see a performance degradation at the lower
           levels of risk to be able to go back and examine
           whether the fundamental causes of that were safety
           culture, corrective action or other problems.
                       CHAIRMAN APOSTOLAKIS:  So maybe that's
           something different.  Maybe it has to do with root
           cause analysis.
                       MR. MAYS:  Correct.
                       MR. HAMZEHEE:  If the impact is on the
           equipment performance.
                       MR. MAYS:  Right, if the impact is on --
                       CHAIRMAN APOSTOLAKIS:  Don't give me
           cryptic statements, Hossein.
                       MR. MAYS:  If the impact were to be --
                       CHAIRMAN APOSTOLAKIS:  What else could it
           be?
                        MR. MAYS:  Well, for example, on the ability of the operators to respond to an accident.  We don't have data on being able to make sure that you initiate SLC within five or ten minutes after an accident.  So we don't have that kind of data either.
                       CHAIRMAN APOSTOLAKIS:  Okay.
                       MR. BOYCE:  The Allegation Program does
           compile statistics at an industry level.
                       CHAIRMAN APOSTOLAKIS:  But we are not
           using those to confirm this hypothesis.
                        MR. BOYCE:  Correct.  In terms of tools, it's not a tool, but those are at least the best indicators we have for a safety conscious work environment.
                        DR. WALLIS:  George, you never confirm a
            hypothesis; you just fail to disprove it.
                       CHAIRMAN APOSTOLAKIS:  Yes, yes.  I stand
           corrected.  Thank you, gentlemen.  This was a very
           lively session; appreciate it.  Are you happier today?
                       MR. BOYCE:  I was able to respond better
           to your questions today, which does make me happier.
                       CHAIRMAN APOSTOLAKIS:  Okay.  Thank you
           very much.  Now we will hear from Mr. Houghton of NEI.
                       MR. HOUGHTON:  Good afternoon.  My name is
           Tom Houghton.  I am the Project Manager for the
           reactor oversight process at NEI.
                       DR. KRESS:  This is your first test to see
           if you can turn that on.
                       MR. HOUGHTON:  First test is -- okay. 
           Well, I think I have four slides here, and I've tried
           to summarize a lot of points on these.  We do support
            movement towards risk-based performance indicators
            with some caveats.  And the caveats -- a very important
            one depends upon the ability to integrate what's
            going on across the different aspects of regulatory
            space.  And by that I'm particularly talking, in the
            mitigating systems area, about the dichotomy between
            design basis technical specifications and risk-based
            performance indicators.
                       And it plays a big role, because the
           inspectors inspect to the design basis, and if we're
            trying to move towards risk-based performance
            indicators, we're shifting the focus of this
            performance indicator.  The performance indicator's
            purpose is not to measure risk.  The performance
            indicator's purpose is to help the NRC manage its
           resources and determine where to put its inspection
           resources.  So the inspectors are aiming at design
           basis, i.e. the automatic function would not have
           worked.  And the risk-based indicator allows operator
           recovery, because the mission time is seven days, and
           there is seven days to restore the function.
                       We have a big dichotomy here.  And we're
           seeing that already between the Maintenance Rule and
           the tech specs.  And we'll see it even more in the
            risk-based performance indicators unless we address
           this problem up front with a plan that solves the
           problem so we're not having people going in different
           directions.  And that really is a key issue in going
            forward with risk-based performance indicators.
                        The second point I put on here is that the
            PIs and the inspection findings are aimed at telling
            the inspectors how much additional inspection to do
            beyond the baseline.  And, therefore, the indicators
            need to provide that value while at the same time not
            adding additional burden, and to help us all focus on
            what's risk-important.  Now, I said complement
            inspection activity, I didn't say reduce.
                       DR. WALLIS:  This refrain about avoiding
           unnecessary burden, there's always a complementary
           side.  When additional burden is appropriate it should
           be there.
                       MR. HOUGHTON:  Absolutely, absolutely. 
           Now, I agree with you, but to do that, one needs to
           look and say, "Okay, the current number of hours in
           the baseline inspection is actually slightly higher
           than it was."
                       DR. WALLIS:  You really should say the
           regulatory burden should be appropriate.
                       MR. HOUGHTON:  Yes, yes.  It should be
            appropriate.  To do that, one needs to ensure that
            additional reporting, which falls under 50.9 and has
            to be accurate to very fine levels, is appropriate for
            the amount of effort people are going to have to put
            into it.  And we do have inspectors that have gone down
           and looked at 15 minutes of availability time, of the
           time that was written in the log, as opposed to
           something else.  And it can cause a lot of inspection
           effort by the NRC and by the licensees unless we're
           careful.  And by adding additional indicators, we add
           to that area.  So that we would say, let's add more
           indicators, but let's have a tradeoff here.  And if
           there is no tradeoff, then there's no advantage to
           doing it other than to gather more information to what
           purpose.
                       This 0609 Manual Chapter is the chapter
           that tells NRC how to proceed with interpretations of
           performance indicators, and we've had about 256
           questions over the year on interpretation of
           indicators, mostly in the mitigation unavailability
           area.  But it tells them what process to go through. 
            And I think although Research is -- as I understand
            it, Research's duty here is to look at the technical
            feasibility -- we're looking ahead to see if these
            indicators are practicable to use, okay?
                       So we're looking at those aspects, okay,
            easy to understand.  We would wonder about an
            indicator which rolled up, either to a cornerstone or
            to a higher level, and how difficult that would be for
            someone to readily understand.  I mean you
           may not know what a high-pressure injection system is,
           but you know it's a system.  If you're talking about
           the cornerstone of initiating events, that's an
           abstraction.
                       I think I covered the other points there,
            but the 0609 is important.
                       DR. POWERS:  Could you explain the title
           of the slide?  The title has me confused.
                       MR. HOUGHTON:  Oh.  I'm glad you asked
           that, because I should have discussed that.  The
           purpose of the performance indicators and the
           inspection findings is to help determine where
           management should put resources.  And we basically
           have three stakeholders.  We've got the Regulatory
           Commission, which needs to assign resources, we've got
           the industry, and we've got the public.  And our
            feeling is that you have to -- these indicators
           have to meet the needs of all three stakeholders in
           this process.  So you can't have extremely
           sophisticated indicators, you can't have indicators
           that are hard to collect accurately, and you've got to
           have indicators that are actionable by the Commission
           and by the people that are living with them.  That's
           my point.
                       DR. POWERS:  I understand better now. 
           First I thought we were talking about producing
           electricity.
                       MR. HOUGHTON:  I'll hold it at the ROP
           level.
                       Some comments on the draft PIs themselves,
           and I think this is partly understanding and working
           through.  But the thresholds need to be set at
           practical levels for action.  That may vary from what
           a very strict risk study tells you.  So if I were to
           look, for example, at the loss of heat removal
           threshold for one of the plants in the study, you'd be
           allowed 0.7 reactor scrams with the loss of heat
           removal in a three-year period.  That means your
           threshold is less than one.  That means there is no
           threshold.  There are some difficulties in going
           strictly by a risk-based threshold system.  It needs
           to be modified to be practicable.
                       Another example for you is the general
           transient green/white threshold, as I looked through
           the plants that were reviewed, varied.  One plant
           would have a threshold of 1.2 general transient scrams
           per year; another one would have 8.2.  Now if I'm a
           plant manager and I have two scrams in a year and I
           get an extra inspection and I get a mark of a white,
           and my neighbor has eight scrams in a year, and he is
           considered in the green band, that doesn't make sense. 
           It just doesn't make sense.  It has to be an indicator
           which is --
                       CHAIRMAN APOSTOLAKIS:  No, but -- well, in
           all fairness, if you're running a plant where the
           threshold is two, there must be some serious reasons
           why.
                       MR. HOUGHTON:  No, I think it -- and I
           defer to the risk experts, but the threshold is based
           on a ten to the minus six delta CDF.
                       CHAIRMAN APOSTOLAKIS:  Yes, but that is
           converted for your plant to a threshold of two, which
           means you don't have enough mitigating capability,
           right?
                       MR. HOUGHTON:  But this PI won't get at
           that problem.
                       CHAIRMAN APOSTOLAKIS:  So you should pay
           the price.  That's the way I see it.
                       MR. HOUGHTON:  But the PI won't get at
           that problem.
                       CHAIRMAN APOSTOLAKIS:  No, the threshold
           gets at the problem.
                       DR. POWERS:  For your first example, you
           just want that to be one per four years, is that all?
                       MR. HOUGHTON:  Well, right now we've
           combined the loss of heat sink and the loss of -- the
           current indicator combines the loss of heat sink and
           loss of feedwater, and the green/white threshold is
           actually two.  And there are several plants that have
           tripped that threshold, and they've done extensive
           cause analysis for the situation.  But that isn't a --
           I'm pointing out that there are practicalities that
           need to be -- you can't blindly use a risk-based
           approach.
                       CHAIRMAN APOSTOLAKIS:  And what I'm saying
           is that, you know, of course you should be practical,
           but at the same time, there is a reason behind this. 
           And maybe a plant that is very well defended can
           afford to have maybe a couple more transients a year. 
            Whereas another one that is not maybe should not.
                       DR. POWERS:  Wait until you see the kind
           of performance indicators we were talking about
           earlier that are very much more complex.  I mean then
           you'll have some real problems with that practicality.
                       MR. HOUGHTON:  But I would think --
                       CHAIRMAN APOSTOLAKIS:  No, I appreciate
           the point, but I want you to appreciate mine.
                       MR. HOUGHTON:  Yes, sir; I do.  But I
            would say that the venue for the discussion of
           whether you have a robust enough mitigating system is
           not the ROP, because the ROP is looking at your
           performance under the current rules and regulations
           and activities you're supposed to do.
                       CHAIRMAN APOSTOLAKIS:  Anyway, you're
           talking about the draft PIs as given by these guys
           today?
                       MR. HOUGHTON:  Yes, yes.
                       CHAIRMAN APOSTOLAKIS:  Okay.
                       MR. HOUGHTON:  And we went through these
           discussions, actually, when we set the thresholds in
           the ROP, because we did have differences such as
            these, and what happened was there was an
           accommodation of, "Well, let's use three scrams per
           year.  Even though one plant is 2.1 and another is 7,
           we'll use three, because what we're really trying to
           do is look at are you maintaining and operating in an
           effective manner."
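                        A minimal sketch of the kind of arithmetic
            that could sit behind such plant-specific thresholds,
            assuming the green/white boundary is set where the
            added initiating-event frequency contributes the ten to
            the minus six per year delta CDF mentioned above; the
            baseline and CCDP numbers are invented for illustration
            and are not values from the draft RBPI study:

            # Illustrative sketch only.  Assumption: the green/white threshold
            # is the scram frequency at which the added frequency, multiplied
            # by the plant's conditional core damage probability (CCDP) per
            # event, adds 1E-6/yr to CDF.  All numeric inputs are invented.
            DELTA_CDF_LIMIT = 1.0e-6      # per year, green/white boundary

            def scram_threshold(baseline_per_yr, ccdp):
                # Allowed scrams per year before the delta-CDF limit is exceeded.
                return baseline_per_yr + DELTA_CDF_LIMIT / ccdp

            # Two hypothetical plants, same baseline, different mitigation:
            print(scram_threshold(1.0, 5.0e-6))   # weaker mitigation  -> 1.2/yr
            print(scram_threshold(1.0, 1.4e-7))   # stronger mitigation -> ~8.1/yr
            # If an initiator's allowance works out below one event in three
            # years (the 0.7-scram case), any single event crosses the threshold.

            The same delta-CDF boundary thus maps to very different
            allowed event counts depending on each plant's
            mitigating capability, which is the dichotomy under
            discussion.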
                       The mitigating systems, the most important
           issue, as I said to you several weeks ago, is the
           unavailability definition.  And as I just said, it
           gets into issues of design basis versus risk basis. 
           It gets into credit for operator action.  It gets into
           cascading of support systems and whether we do that or
           not.  And it gets into the reliability indicator in
           place of demand fault exposure.  And we're very -- we
           support very much working towards, moving towards a
           more risk-based approach in this area, because we
           think that's appropriate, and it's more in line with
            the Maintenance Rule, and it can help to avoid this
            problem I talked about, of having two or three
            different targets that you're aiming at.
                       MR. MAYS:  Tom?
                       MR. HOUGHTON:  Yes.
                       MR. MAYS:  If I might, the issues on
           design and licensing basis for unavailability and
            whether or not operator action is credited and the
            role of support systems and the fault exposure times
            were all issues that in the draft RBPI were handled in
            the direction in which you're concerned that we should
            be moving.
                       MR. HOUGHTON:  Yes, I agree, and that's
           what I was trying to say.  We see that as moving
           positively.  However, the tech spec issue is looming
           out there.
                        The component class PIs, we feel, are
            better covered by the SDP and by the extent of
            condition in root cause analysis rather than by having
            separate PIs.  We don't feel there's a -- that there
            would be less inspection coming about through having
            PIs in those areas.
                        Now on the shutdown PI, I think there was
            some discussion of the level of our concern about
            basing it on the amount of time, when we think that
            you could have negative consequences of people trying
            to rush out of conditions.  And it's not really
            appropriate to have indicators like this.  We
           want to hear more about it, but when you look at some
           of the thresholds in that table, you'll find that
           they're very unforgiving, and you can move from being
           green to yellow in just two or three hours.  And when
           you're trying to be careful you don't want to put
           yourself in that situation.  It's not clear that
            that's a good idea.  We do think that it could be very
            helpful in the Phase II, once you know you have a
            problem, to see what the risk level was.
                       Adding PIs requires examination of the
           Action Matrix.  A comment here:  I heard the talk
           about rolling up PIs to a higher level and then
           putting that in the Action Matrix.  But the Action
           Matrix includes both the inspection findings and the
           PIs.  And the Action Matrix really is more of a logic
           table to tell you if you have two or more -- if you
            have a single white, be it a PI or an inspection
           finding, the NRC is going to look at your root cause
           and look at your corrective action.  And it might
           require up to 40 hours of additional inspection. 
           That's what that means.
                       The second column tells you that you have
           two or three white indicators in a particular
           cornerstone, whether that's physical security or
           emergency planning or the barriers or mitigation.  And
           what it's saying is, "We're not so sure you're
           handling things right, and we're going to come and
           look at your ability to do root cause and look at your
           ability to integrate this problem across different
           systems."
                       The next column, a yellow or degraded
           cornerstone says, "You have a more systemic problem
           and we're going to increase the inspection level
           higher."  The next column probably has you in a
            diagnostic, like Indian Point.  There's a very
            interesting inspection report that showed how Indian
            Point would have looked if it had been under the new
            system for the year prior to the steam generator tube
            rupture.  And it shows you that the Action Matrix in
            the system would have shown a steady degradation and
            the need for more inspection earlier on at Indian
            Point.  I commend that to you to see how that worked,
            because that's an actual case study.  The new system
            wasn't applying to them at the time.
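                        Read as a logic table, the columns just
            described might be sketched roughly as follows; the
            function and argument names are illustrative
            assumptions, and the responses are condensed from the
            description above rather than taken from the NRC's
            actual procedure:

            # Illustrative only: a simplified paraphrase of the Action Matrix
            # logic as described above; not the NRC's actual assessment
            # procedure.
            def action_matrix_response(whites_in_cornerstone,
                                       yellow_cornerstone,
                                       repetitive_degraded):
                if repetitive_degraded:
                    return ("multiple/repetitive degraded cornerstones: "
                            "diagnostic-level response")
                if yellow_cornerstone:
                    return ("degraded cornerstone: more systemic problem, "
                            "increased inspection")
                if whites_in_cornerstone >= 2:
                    return ("two or three whites in one cornerstone: review of "
                            "root-cause ability and cross-system integration")
                if whites_in_cornerstone == 1:
                    return ("single white (PI or finding): review of root cause "
                            "and corrective action, up to ~40 hours added "
                            "inspection")
                return "licensee response band: baseline inspection only"

            # Example: one white PI in a cornerstone.
            print(action_matrix_response(1, False, False))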
                       So we see the Action Matrix as not being
           a risk meter at a certain level, but we see it as
           indications of problems across distinct areas.  And as
           they increase, the Agency needs to take a closer and
            closer look at the problem.  So we would really feel
            that even aggregating these PIs, you still have to
            compare them with inspection findings, so you're not
            really integrating risk with the inspection findings.
            And we really think that the PIs, at the level they
            are, are actionable indicators.
                       DR. WALLIS:  I understand that, but
            remember, to the public looking in, this is really a
            risk meter, and the public is not interested in a
           management tool.  It's interested in how a state
           complies.
                       MR. HOUGHTON:  Well, but they're
           interested, I think, in how does the NRC judge the
           plant.  And you now can click on the Action Matrix,
           and you can see the 79 plants that are in the licensee
           control band, the 16 or 18 that are in the next one,
           the three or four that are in the next one, and
           finally Indian Point on the site.  And it also tells
           you why they've changed from column to column.  So it
           --
                       DR. WALLIS:  I don't really care about
            that.  If I were a member of the public, I'd probably
            look at it and say, "Well, I want a good feeling that
           these things are safe enough.  Here's a measure I've
           got."  So it's going to be used in some way as a risk
           meter whether you like it or not.
                       MR. HOUGHTON:  Agreed, but I'm not sure
           that -- it's not clear to me that a risk number would
           tell someone more than being told that there are
           systemic problems across different areas.  That's my
           opinion.
                       MR. SIEBER:  It seems to me that that's a
            two-edged sword.  For example, if you could predict the
           declining performance at Indian Point and then begin
           doing diagnostics and additional inspections, that
           probably would not have prevented the tube rupture.
                       MR. HOUGHTON:  Right.
                       MR. SIEBER:  Okay.  So now the Agency is
           called into question.  You knew this Plant was going
           downhill, yet you weren't able to prevent this event,
           even though the two are not associated.  And I think
           you have to be careful about that, because a lot of
           these events are random events.
                        MR. HOUGHTON:  Well, and that's very true
            -- things are going to happen that are not going
           to be caught by inspection, and they're not going to
           be caught by performance indicators.
                        MR. SIEBER:  Yes, there's another effect
            that I've seen happen in plants:  you go in with a
            diagnostic team that lasts three, four, five
           weeks and has five to ten people on it.  That really
           disrupts the operation of that plant.  And I think
           that plant is more vulnerable during that time when an
           inspection is going on from a risk standpoint.
                       MR. HOUGHTON:  They certainly find more
           things.
                       MR. SIEBER:  They certainly do, and it
           ties up management, and it ties up your engineering
           staff, it ties up your licensing people, and it has to
           be done, but it's a cross-cutting issue.
                       MR. HOUGHTON:  Although they might not
           have -- and I think the system, the way it works now,
           does attack cross-cutting issues, because it says, "Do
           I have a problem across different areas?"
                       MR. SIEBER:  Right.
                       MR. HOUGHTON:  Which says, "Does my
           maintenance force have a problem with the Corrective
           Action Program?  Does my training organization have a
           problem with operations experience from other plants?" 
           So that it does give you a feeling of whether there
           are problems across different aspects of the
           organization, which rolling up, to me, doesn't quite
           give --
                       MR. SIEBER:  Thank you.
                       MR. HOUGHTON:  Other questions for me? 
           Appreciate the opportunity to talk to you.
                       CHAIRMAN APOSTOLAKIS:  Thank you very
           much; appreciate it.
                       Now, we will not need transcription after
           this point.  And tomorrow afternoon, actually, we'll
           see you again at 1:30 when we discuss the general
           design criteria.  Because in the morning there is no
           need for transcription.
                       (Whereupon, at 3:32 p.m., the NRC Advisory
           Committee Meeting was concluded.)
           
	 
 
