
                       UNITED STATES OF AMERICA
                     NUCLEAR REGULATORY COMMISSION
               ADVISORY COMMITTEE ON REACTOR SAFEGUARDS
                                  ***
              MEETING:  PLANT OPERATION & FIRE PROTECTION
                                  ***
     
     
     
                        U.S. Nuclear Regulatory Commission
                        Region I Office
                        475 Allendale Road
                        King of Prussia, Pennsylvania
     
                        Wednesday, June 23, 1999
     
         The Subcommittees met, pursuant to notice, at 8:32 a.m.
     
     MEMBERS PRESENT:
         DANA A. POWERS, Chairman,
           ACRS Subcommittee for Fire Protection
         THOMAS S. KRESS, Member, ACRS 
          ROBERT E. UHRIG, Member, ACRS

                              P R O C E E D I N G S
                                                      [8:32 a.m.]
         DR. POWERS:  The meeting will now come to order.  This is a
     meeting of the ACRS Joint Subcommittees on Plant Operation and Fire
     Protection.  I'm Dana Powers, Chairman of the Subcommittee on Fire
      Protection, and I'm also acting Chairman of the Subcommittee on Plant Operations,
     because John Barton is unable to attend this meeting because of family
     difficulties.
         The ACRS members in attendance are Tom Kress and Robert
      Uhrig.  I will apologize; we expected a bigger turnout, but a variety of
      things, illness among them, has devastated my subcommittees here.
         Nevertheless, I can assure you that the ACRS maintains an
     excellent capability of internal communication and that the information
     that we're gathering here does get shared with the committee as a whole
     in our July meeting.
         So the fact that we don't have a full turnout for this
      meeting is only a minor difficulty for us in collecting the information
     that we're after.
         The purpose of this meeting is to discuss Region I
     activities and other items of mutual interest, including significant
     operating events and fire protection issues.  The subcommittee will be
     gathering information, analyzing relevant issues and facts, and
     formulating proposed positions and actions, as appropriate, for
     deliberation by the full committee.
         Amarjit Singh is our cognizant ACRS staff engineer for the
     meeting.
         The rules for participation in today's meeting have been
     announced as part of the notice of the meeting previously published in
     the Federal Register on May 19, 1999.
         A transcript of the meeting is being kept.  It will be made
     available as stated in the Federal Register notice.  It is requested
     that speakers first identify themselves and speak with sufficient
     clarity and volume so they can be readily heard.  We have received no
     written comments or requests for time to make oral statements from
     members of the public.
         I will comment to the members that we are arranging to have
     lunch delivered and they have a form here to mark up and they should do
     so immediately, I guess.
         MR. SINGH:  Yes.  Thank you.
         DR. POWERS:  I will also comment that we have in attendance
     Jocelyn Mitchell from the Office of the Executive Director of
     Operations, and John Larkins, the Executive Director of the ACRS, is
     here, as well.
         And I will also comment that the subcommittee members had a
     chance to visit the Susquehanna Power Station yesterday and quite an
     interesting and complete visit to get an understanding of their programs
     and their plant operations.  We found it extraordinarily useful to us to
     visit that site.
         If there are no comments from the members themselves, I
     think we should proceed with the meeting and I will call upon Mr. Hubert
     Miller, Region I Administrator, to begin.  Hub?
         MR. MILLER:  Thank you.  I am Hubert Miller.  I'm the
     Regional Administrator, Region I, NRC.  We are very pleased to have you
     join us in Region I today to review, in somewhat general terms, the
     nature of the work that we do and talk about the issues that we face in
     managing the region, but also to address a number of specific questions
     and interests that the ACRS has regarding the reactor program that we
     have responsibilities for in the region.
         DR. POWERS:  One of the issues that is very much on the mind
      here of the committee is how things devolve to the regions and how
      we maintain some sense of uniformity among the regions while, at the
      same time, taking advantage of the diversity of applications we have in
      the regions.
         So regions have suddenly become very important in our
      thinking, especially as we move to new processes for assessing and
     evaluating plants.
         MR. MILLER:  And one of the most important things, in fact,
     in managing regions is ensuring that the regions are, in fact, carrying
     out a program that is consistent with the agency policy and direction.
         The issue of consistency among regions has always been an
     issue.  I've served as the Regional Administrator in Region III and I'm
     now the Regional Administrator here, and there are always inexorable
     tendencies, when you're dealing with individuals, to have a certain -- I
     wouldn't say drift, but it takes an active and constant management of
     things to assure that there is consistency in the approaches that are
     taken.
         So we are happy, as we discuss our business today, to share
     with you the specific things that we do here in the region, but, as
     well, things that are done in the headquarters office and the program
     office of NRR and, in fact, the EDO's office, and Jocelyn is from that
     office, what we do at NRC as a whole to assure consistency among the
     regions.
         We're busy in the region and we have a lot of conviction
     about coming to work in the morning, we really do.  We think we make a
     difference and I thought, with your permission, that what we do at the
     outset this morning is to have me introduce, just very generally, the
     organization.
         We're like the other regions.  I'm sure you've seen this
      before, and I won't -- there are a few differences that are differences
      made with thought and for cause, and I will point those things out to you. 
     Then I thought I'd also just take a moment and hit a few of the
     highlights, some of our recent accomplishments and what some of our
     challenges are; not in a lot of detail, because much of the presentation
     today will elaborate on these points.
         But maybe where we could start is perhaps even in this --
     what I'm going to be talking from is in the book, but we'll also be
     operating from some slides.  Perhaps the place to start is with a map of
     the northeast, which is the chart that's on the overhead now.
         Region I is comprised of eight states in the northeast. 
     There are 26 operating reactors at 17 sites.  Somewhat significantly,
     there are six sites in Region I that are in a decommissioning phase. 
     The most recent sites, and, of course, you're aware of TMI-2, but also
     Haddam Neck, Millstone 1, Yankee Rowe, Maine Yankee, and Indian Point 1
     are also sites in decommissioning.
         Just as an aside, although the ACRS is not concerned with
     this, in addition to the operating reactors that we regulate, the region
     is responsible for regulating and overseeing some 1,800 materials
     licensees, and that's another big part of our operation.
         You were at the Susquehanna site yesterday and I won't go
     through the long list, but we have a heavy load of operating reactors in
     this region.
         DR. UHRIG:  Do you have any university reactors that come
     under your --
         MR. MILLER:  Right.  The university reactors are regulated
     now by the program office.  That was a shift done several years ago,
     where NRR has responsibility for university reactors.  I believe Penn
     State has a reactor.
         MR. BLOUGH:  Penn State, MIT, several others.
         MR. MILLER:  MIT is another one.  So there are several.
         If I could go to the next chart.  It's a little hard to
     read, but this is the broad outline of the region.  Reporting to me, I
     have a number of divisions.  The major divisions are the Divisions of
     Reactor Projects, I'm focusing now on the reactor program, and Randy
     Blough is the Director of the Reactor Projects Division.
         The Division of Reactor Safety, and this is the classical
     organization seen in all the regions.  Wayne Lanning is the Director of
     the Division of Reactor Safety.  As I mentioned, there is the division
     that's responsible for materials safety, and George Pangburn is the
      director of that division.  And we have the Resource Management Division
     and Jim Joyner is the head of that division.
         We have a special situation in Region I, where the Millstone
     plant, due to performance problems at that site, that's a plant that has
     been on the watch list and it has been, in fact, a plant that --
         DR. POWERS:  In the news and everyplace else.
         MR. MILLER:  It has drawn the attention directly of the
     Commission.  Over the past year, there have been a number of changes at
     Millstone.  Millstone, both Units 3 and 2 started up after several years
     of being shut down.
         The Commission authorized the startup of those units, but
     we're still kind of in transition on Millstone and Millstone has not
     been returned to the normal line organization.  In fact, throughout the
     past several years, the region did not have responsibility directly for
      Millstone, but rather Millstone was managed by a special projects
      office -- in fact, Dr.
     Travers, who is now the EDO, was the head of that office -- to provide
     needed attention to Millstone.
         It was a special case and it remains somewhat of a special
     case and instead of being in the Reactor Projects Division, it is, in
     fact, a separate inspection directorate.  The head of that directorate
     is Jim Linville, sitting in the audience.
         In other words, it remains a special case and reports
     directly to me.
         Just briefly, also, in regional offices, and this is the
     same normally in the regions, we have an Office of Investigations, or
     the OI field office.  It doesn't report directly to me.  It's a bit of a
     -- technically -- well, no -- in all respects, really, Barry Letz, who
     is the director of this office, reports to Guy Caputo, in headquarters,
     and the Office of Public Affairs, Diane Screnci, she'll be in to talk to
     you later, she reports to Bill Beecher.
         DR. POWERS:  Oh, really.
         MR. MILLER:  But I will say, and you will talk to at least
     Diane today, I think she'll tell you, as well, that there is very strong
     teamwork between -- and it has to be that way, as you'll learn as she
      talks today, strong teamwork between the Region I management and the
      support function that Diane and her counterparts in the other regions
      play, because public affairs is such a huge, huge issue for us.
         In the Office of the Regional Administrator, we also have a
     technical program staff, Dan Holody is the head of that.  Dan does
     report to me, and he is responsible for enforcement, allegations,
     several very important facets of the business that we do.
         So in a nutshell, that's the organization.  As far as
     reactor safety is concerned, there are the two major divisions, Reactor
     Projects and Reactor Safety, and then the various support functions that
     report into the Regional Administrator.
         Unless there are any further questions on that, I thought
     I'd just take a moment and breeze through a couple of charts.  When I
     say breeze, breeze through, that's probably the right way to frame this.
         We have, over the past several years, I think, been, of
     course, concerned very much with Millstone, with Salem, which is another
     plant that's on the watch list; Indian Point 3 was on the watch list.  I
     came to the region in 1996 and all three of those facilities were
     occupying an enormous amount of our time.
         Maine Yankee, also, while it wasn't on the watch list when I
     first arrived in the region, arrived on the watch list.
         So much of our effort, and this is the second bullet,
     actually, and I'll just talk a moment about this before going to the
     first, but much of our activities and effort over the past several years
     have been aimed at overseeing the recovery of these plants.
          But when I arrived at the region, we were working hard, and I
      made a point with my folks of avoiding additional problems at plants,
      heading off performance declines that would lead to additional plants
      being placed on the watch list.
         So one of the -- what I'll call one accomplishment of the
     region has been to head off other problem situations.
         Indian Point 2 is perhaps the most visible example of this. 
     We sent strong messages to the licensee in that instance, through our
     SALP reports and through the use of -- through escalated enforcement
     actions, and through some special inspections that were performed, we
     brought to light a number of problems that if they had not been
     addressed, we would have had another plant in that category.
         The plant was shut down for a period of time, for almost a
     year really, addressing the issues that were identified.  Initially, I
     would have to say largely by us, and then ultimately by the licensee and
     their own independent reviews.  In fact, in that case, they had an
     independent industry-wide type of review performed, which I think
     validated a lot of the concerns that we had.
         But the point here is that we worked hard, heading off other
     problems.     
         Beaver Valley is another plant that was shut down, both
     units, for almost a year.  I think the significant thing there and
     something that we're very proud of is that the approach that we took in
     that case was to not have the heavy hand of the region and of the agency
     controlling matters at Beaver Valley, but rather to create an
     environment that was conducive to the licensee itself, identifying their
     own issues and problems, and with us certainly overseeing what they were
     doing to assure that they were bringing problems to the surface and
     effectively dealing with those issues.
         We gave them the discretion or the breathing room, if you
     will, to manage their own affairs.  Of course, the caveat in this was
     that they do that, that they manage their affairs well, and they did and
     they have.  Those plants have restarted.
         But the point I'm making here is that in addition to the
     very large issues surrounding Millstone, Salem and the more visible
     plants, we have worked very hard in the region at heading off problems
     and catching performance dips early at other plants and getting those
     dealt with, with success, I would say.
          Much of our effort -- and we're very proud of the efforts that
      we've made; we'll talk at length about this throughout the day -- has
      gone to support the development of the new reactor oversight program. 
     It's been a massive effort for the agency and the regions appropriately
     have been in the thick of that, providing a lot of the staff effort to
     develop the new program.
         We've made great strides with respect to risk-informing our
     activities and there is one whole section on that.  A great deal of
     activity in the region.  We have two senior reactor analysts.  I think
      we perhaps have the most senior among them -- I don't know all of the
      SRAs across the country.  Each region has two SRAs, so-called senior reactor
     analysts.  We have two very experienced people.  Tom Shedlosky, who is
     one of the two, will be speaking to you today, to talk about some of the
     specific accomplishments.  But just overall, I think this is an area of
     great effort.
         Millstone, in many respects, was all about allegations.  The
     agency made some mistakes and the region had made some mistakes in that
     case.  We made a huge effort starting about two years ago to improve
     our, first of all, sensitivity to the importance of allegations and
     handling of allegations, but also doing that in a very timely manner.
         And I think all of the statistics that we use to track
     allegations indicate that we have made significant progress and, in
     fact, at this point, our allegation program is very strong in this
     region.
         Inspector staffing and development, when I came to the
     region, we were under-staffed.  We have hired on the order of two dozen
      to some 30 inspectors over the past several years, and this has been an
     enormous effort.  As you know, there is nothing that is more
     resource-intensive, but more important than staffing and developing and
     training of inspectors, and this has been a huge, huge area of effort
     and I would say accomplishment in the region.
         And then the last bullet under this broad topic of
     accomplishments is that I think we have put a very strong focus on this
     senior management team that you see here at the table.  Being at the
     sites, being involved in the activities in the region, and hopefully not
      micro-managing, but being present where the action is, to, among
     other things, of course, give guidance, but also to get feedback from
     inspectors and from licensees on our programs.
         I think we've done well at that.
         I want to, just before I go further, I think there have been
     informal introductions here at the table, but Chris O'Rourke is to my
     right.  She is managing the viewgraphs today.  She is from our Office of
      Reactor -- or from the Resource Management Division.  Bill Ruland is
     sitting to her left, and Bill is the acting Deputy Director of Reactor
     Safety.  I'm taking a moment to make introductions because we're on this
     topic of senior management involvement and I'm very proud of the
     management team here.
         You've met Wayne Lanning, who is the Director of Reactor
     Safety.  Randy Blough is to my left.  He is the Division Director of
     Reactor Projects.  Jack Crlenjak, Jack is our Deputy Reactor Projects
     Division Director.  Jack came to us from Region II.  Actually, Bill
     Ruland came to us from Region II; Wayne from headquarters, but he had
     been in the region for how long -- eight years.  So he's been here for a
     while.
         Randy was born and bred, I guess, in the region.  Dan Holody
     was in Region III years ago and now in Region I.  Jim Joyner started 45
     years ago -- many years ago in Region II, was it?  Savannah River?
         MR. JOYNER:  I came here from Region II.
         MR. MILLER:  But he's been in Region I for some time.  And
     Tom Shedlosky is the other person at the table.  We'll make other
     introductions as people come to the table throughout the day, if that's
     okay with you.
         DR. POWERS:  Your last bullet says senior management
     involvement and then it has self-assessments on there.  Can you tell me
     more about what kind of programs you have on self-assessments?
         MR. MILLER:  We'll talk more about those today, but the most
     immediate thing is just the assessment we do, as a matter of routine, in
     the observation of field work and -- now, you may be referring to sort
     of formalized self-assessments.  We can give examples of that.
         A lot of that has been in the lessons learned, the lessons
     learned variety, as problems happen, and we can give examples.  We had
     an issue involving, just to pick one that comes to mind, Millstone,
     where we were handling an allegation that had to do with workers who had
     been -- there were a lot of layoffs at Millstone at one point and there
     were numerous people that had filed an allegation of harassment and
     intimidation and discrimination as a result of having raised safety
     issues.
         We had done a -- the agency had done a review of many, but
      not all of those; resources just precluded investigating every case.  I
     think the investigation work was quite thorough.
         At the end of it, when we were done, we wrote the
     individuals and we were a bit inartful, I would say, in the way we
     wrote.  The communication was that we had investigated every case and we
     hadn't.  So it was a question of, well, how did we -- what did we learn
     from this episode, and how you write, importance of communicating in a
     real clear way.  We were not intending to mislead, but it left certain
     impressions.
         So that's an example of a problem that arises and we do
      assess ourselves and do a lessons learned.
          But we've done other things, looking at operator licensing, and Rich
     Conte will be speaking today about the pilot program in the operator
     licensing area.
         Self-assessments in the allegations and the enforcement
     area, and Dan Holody can talk about self-assessments that we have done
     there.
         So it's really a mix.  But I'll be honest and tell you that
     I find -- and we see this in watching licensees -- that the individual
     stand-alone formalized self-assessment has a place and it's important,
      but it's the self-critical bent and the constant assessment of how
     things are going that, in my mind, has, in the long run, more impact on
     good performance than anything.
         So it's much in that vein.  We track -- we have a set of
     indicators and a management report.  It's a lot like what you see with
     licensees.  One of the things that we track in here is inspector
     accompaniments, oversight visits by branch chiefs, by division
     directors, by division, and this doesn't tell all, but it's somewhat of
      a measure of how much we are out and involved in the field, observing what is
     going on, getting feedback from the licensees.
         On every visit, a standard question to the licensee and
     other stakeholders, if we deal with other stakeholders, is what feedback
     do you have for us, good or bad, and we fill out a form on all of those
     visits and we collect those and provide those to the program office,
     which, in fact, reports to the Commission periodically on the feedback
     that we get.
         We will, through the day, speak more to that.
         The next chart, and I won't spend any time on this, but it's
     -- you've seen it, I'm sure.  These are the four outcome measures that
     the agency has been measuring itself against and we, not unlike any
     other group in the agency, are focused on these four basic outcome
     measures and measuring our effectiveness against these, maintaining
     safety, enhancing public confidence, becoming more efficient and
     effective, and then, very importantly, in this current period, removing
     undue -- undue, a key word -- and unnecessary regulatory burden.
         The Chairman keeps saying, appropriately, we are a burden.
         DR. POWERS:  Yes.
         MR. MILLER:  And that's not the issue.  The issue is what
     can we do to eliminate the unnecessary burden.
         Just from a regional perspective -- next slide, please,
     Chris -- you will hear us talk a whole lot today about changes.  I don't
     want to be or sound melodramatic, but I don't -- let's see, I've been in
     the agency for about 25 years now.  I've never seen a period of as much
     change as now.
         TMI was a period of great change, obviously.  But we are in
     the midst of reexamining virtually every aspect of what we do.  So it
     presents an enormous challenge.  So this is -- we'll talk about the new
     oversight program as just one among many areas of change.
         DR. POWERS:  When we looked in the private sector, which
     itself is going through a period of dramatic change, I would say, in the
     period of five to ten years ago, we found that they have a lot of
     lessons learned about how you change a culture, and we're talking about
     changes in culture here at the NRC.
         And some of the private sector did it extremely well, but
     most of them, by and large, hacked it up when they first tried to
     undergo sea changes to meet a more global challenge and competition.
         And what they learned, if nothing else, was that you can't
     pile change on change upon change, so that you have the flaming duck of
     the week from management coming in as what gets changed.  It takes
     effort to change is what they learned.
         And I guess one of the issues the ACRS continually brings up
     in this period of change is are we trying to change too many things all
     at once, would it be better to focus on one area of change, assimilate
     that before we go on to the next one.
         I'd like to get your views on how you think the -- it's
     twofold.  One is, can we learn anything from the private sector about
     cultural change, and the second is when we look at that private sector,
     are we doing the right thing, are we taking lessons learned from that,
      or are we making the same mistakes that a lot of those companies made
      when they went through restructuring and right-sizing and every other
      kind of program, which is not different than what we're doing now.
         It may be the forerunner of things that we may indeed need
     when we're looking at a leaner, meaner, right-sized NRC in the future. 
     That's where we're headed here.
         Can we learn anything from this private sector experience?
         MR. MILLER:  One of the advantages that we have in the
     regions, I would say the agency, really, is that we're overseeing
     industries that have been undergoing sea change and I think, again,
     that's not overstating it.
          The utility industry, of course, is being revolutionized in
      terms of how it's regulated on the economic side.  So we've
     been able to observe what works and what doesn't work and the agency has
     worked very hard on this and we in the region have worked very hard on
     this.
         And we can talk in the abstract, but I think what might be
     very useful is that when we talk later this afternoon about the new
     program, we will give you some detail on the initiatives that we have
     undertaken here in the region and as part of the overall agency effort
     in that one program area to effect change, starting with having a lot of
     discussion on the need for change, a good ventilation and airing of
     that, having people involved in the development of the changes
      themselves, and so on.
          And much of this we have stolen, I mean, much of the
     approach we have taken is just flat-out stolen from what we have
     observed licensees doing.
         DR. POWERS:  You have quite a cross-section in your region. 
     You've certainly had a chance to discuss change that's going on at Salem
     and we've certainly had a chance to discuss change going on at
      Susquehanna.  I was struck that they give you the poles; one is a theory X
      and one is a theory Y approach.  So you get quite a range of approaches
     to learn from with the cross-section of plants that you have.
         MR. MILLER:  Yes.  And some have worked and some haven't
      worked.  And, you know, the books abound.  For a period of time,
     about six months ago, I couldn't go to a plant and not have a CNO or a
     senior VP stick a book under my nose.
         DR. POWERS:  Yes, and say read this.
         MR. MILLER:  And the theories are not all that different. 
     It still comes down to a lot of hard work.  And I don't know how well
     we're doing.  I think we're doing okay.  But we can talk about it and
     try to gauge your reaction.
         Another point to make about Region I, and, again, I say this
     from having been in headquarters for many, many years, having been in
     Region III for about ten years, and having been here for about three
     years, there is no place like Region I when it comes to public
     involvement, and that's good news.  That's good news.  It says that
     we've got a very active and interested citizenry in the northeast.
         I have not been able to figure out why it is this way, but
     it is this way.  In Connecticut, really in virtually all of the places
     in Region I, there is a lot of public interest and it is a -- it's a
     certain -- it carries with it a lot of extra work.
         And I don't say that in a complaining way.  It's just a
      matter of fact.  It presents some real challenges.  So when you talk to
      Diane today, we can share with you some of the things that we do here to
      try to deal with that effectively.
         Then, lastly, in the way of significant challenges, I am
      always worried that, as we make change, as we move to this new program,
      as we spend an enormous amount of effort on development of the new
      program, we might take our eye off the ball.  Plants are operating, and
      so we're continuing to emphasize to our inspectors, in a way that
      doesn't, again, cause undue impact, that it's our job to be very alert to
      the potential for performance problems to set in.
         And it can happen, backsliding can occur, and we've seen too
     many cases of that.  Throughout this whole period of change, we must not
     take our eye off the ball.
         So that's kind of an overview of the region, what some of
     our accomplishments are and what we see as challenges.
         With that, I'm going to turn it over to -- unless there are
     more questions for me -- turn it over to Randy Blough and to Wayne
     Lanning to kind of discuss in a little more detail their divisions and
     some of the challenges that they face and a little bit about -- and
     emphasizing not just the routine things, but the things that are kind of
     unique, if you will, about how we do business, and also, importantly,
     addressing your question of consistency, how we assure that what we do
     -- every plant is unique and every -- we have a different set of
     licensees and how there is a common approach and consistent approach
     taken to the way we do business.
         MR. BLOUGH:  Thanks, Hub.  Wayne and I are just going to
     talk briefly about our organizations and then we'll launch into a
     discussion of regional programs and policy that focus on consistency.
          My name is Randy Blough.  I'm the Director of DRP, the
      Division of Reactor Projects.  Jack Crlenjak and I have
     responsibility for the inspection program management and coordination,
     assessment process coordination, and implementation for all the
     operating reactor facilities in the region, except Millstone.
         The decommissioning reactors are in our Division of Nuclear
     Materials Safety.  We made that change and that enabled our division to
     focus on the operating reactors.
         We're a lot like the other regions.  We have six project
     branches.  We try to have a logical division of facility
     responsibilities amongst them.
         Most of our branch chiefs are here.  Michelle Evans is Chief
      of Branch 1.  I think -- could you stand while I mention your name? 
     Michelle is our newest branch chief.
         John Rogge and Glen Meyer have Branches 2 and 3, and Glen is
     not here right now, but he will be, especially when we talk about the
     pilot process.  John and Glen have pilot plants within their branches
     and John Rogge also has the additional challenge of not only having
     pilot plants, but plants that have been of substantial interest to us
     that continue in the normal inspection program and so forth that we're
     implementing now.
         Curt Cowgill was with you yesterday and he is at Peach
     Bottom today.
         Cliff Anderson has Seabrook, Pilgrim and Vermont Yankee.  He
     is not here at the moment.
         Peter Eselgroth is the Chief of Branch 7 and he's also been
     involved in the transition task force, writing the detailed
     implementation details and the procedures for the new program that are
     now being exercised in the pilot process.
         We also have a small technical support team, not shown on
     your chart there.  It's really two members at this point.  We have to
     decide year to year whether to continue that or whether to somehow
     integrate it into the rest of the organization.  But it's been very
     helpful to us in terms of helping us with information management,
     special projects, and they also are involved in some of the self-audit
     and things like how we're tracking our inspection hours, are we keeping
      the inspection plans up-to-date, are we properly recording completion of
      the program -- that sort of information management activity.
         They've also been very -- working very closely with
     headquarters and the other regions and, in fact, they're at a tech
     support counterpart meeting now.
         We'll talk in more detail.  I just wanted to give you a
     thumbnail of our organization.
         Are there questions for me at this point?
         [No response.]
         MR. BLOUGH:  Wayne is going to talk about the DRS
     organization now.
         MR. LANNING:  Good morning.  Our mission is to support DRP
      in assuring that nuclear power plants are operated in a safe manner. 
     We have about 63 staff who are specialists in their assigned area of
     expertise.  This chart shows that we're arranged essentially in four
     branches, and three of those branch chiefs are here with us today and
     some of them will be talking later to you.
         We also have the senior reactor analysts reporting to me.
         As I go through the list of responsibilities, it should
     become more clear as to what these branches are doing and their
     functions.
         Our major responsibilities include the inspection and
     engineering programs and technical issues.  That includes such things as
     in-service inspection and testing, fire protection, MOVs, corrective
     actions, also.
         We've just completed the license renewal inspection at
     Calvert Cliffs.  We're also responsible for inspecting the independent
     spent fuel facilities.  We've got essentially three active sites here in
     Region I.
         Another major area of responsibility is the inspection of
      the radiation protection programs, which includes occupational exposure,
      radioactive waste and transportation, radiological effluent monitoring,
      and environmental monitoring and control.
         We have responsibility for the inspection of physical
     security, including support for the operational safeguards response
     evaluations.  We also do some license reviews in the area of security.
         Similarly, for emergency preparedness, we evaluate
     significant changes to the emergency preparedness plans and we also do
     assessment of emergency drills.
         Another major area of responsibility is that we have
     responsibility for the licensing and examination of reactor operators,
     and Rich Conte will talk to you later and give you an overview of the
     new process that we're undergoing, a recent change to Part 55.
         We inspect Y2K readiness at nuclear power plants and we
     completed that first round of inspections.
         Finally, we provide the risk insights on a variety of
     regional activities, and you've already heard that Tom Shedlosky is
     going to talk to you here about his activities.
         So our major challenge in the division is change.  The
     division was reorganized in January.  We reduced from five to four
     branches and we had a significant realignment of staff.  There have been
     a number of management changes, to further complicate the management of
     the division.
         I'm a new director, I've been there for six months.  Bill is
     an acting division director, we have a new branch chief, we have acting
     branch chiefs, and we're getting a permanent deputy early next month. 
     So we're continuing to change.
         The implementation and support for the new assessment
     program is certainly a challenge for us.  Finally, the planning for
     inspections and effective utilization of the staff is an important
     function and a challenge for us, and we'll talk to you in more detail
      about how we go about doing that, our initiatives for dealing with
     it.
         But with each of these challenges, we have a strategy for
     success.  I think we're on that path.
          Are there any questions for me regarding this division?
         MR. MILLER:  Just one comment on this, if I may.
         DR. POWERS:  Go ahead.
         MR. MILLER:  There has always been an issue about how you
     organize the reactor program in the regions.  What you see here really
      is essentially a classic matrix organization.  I have felt, for as long as
      I've been in the field, that it's vital that there be a small technical
     division, in addition to the project division.  There are a number of
     reasons.
         I mean, the senior residents and the resident inspectors are
     there day in and day out and they give you that continuity and that
     insight that only comes from being present on-site continuously to watch
     problems as they unfold.  Most technical problems don't arise and get
      solved and addressed overnight.  But they watch things over a period of
     time, and that invaluable insight comes from the resident inspectors.
         But the technical strength that comes out of the Reactor
     Safety Division is vital for us to be in the business of meaningful and
     insightful inspection, a vertical slice type of inspection, where you
     dig below the surface on some of these technical issues.
          But there is also an advantage with respect to a dialectic
     that occurs and it sets up a different perspective and we, as you will
     hear us talk about in our planning processes and our so-called plant
     performance review processes, bring in that other perspective that's a
     bit different than the perspective of the individual inspectors.
         And to the issue of consistency, it's very important, I
     think, on issues like fire protection.  You were at Susquehanna
     yesterday and, I mean, in a certain respect, it's a no-brainer; I mean,
     you can't take motor-operated valves or fire protection and expect to
     get any kind of consistency unless you've got a few people who are
     really trained and can look at that issue across all of the stations.
         So there are tremendous benefits that arise from having a
     region-based technical specialist inspection.  In the past, there have
     been some discussions about putting all your people on-site, but I have
     come to feel that this arrangement here is a very sound one.
         DR. POWERS:  Well, I guess my own personal view is that this
     technical organization becomes a more and more crucial organization as
     we move to a more risk-informed regulatory process.  Then, in fact, what
      we're going to have to do is arm our managers, for the inspection force,
     with a great deal more technical information on how they direct that
     inspection force and focus that inspection force, and that's going to
     come from this kind of an organization.
          We're seeing it in some ways in the fire protection area, where
     the FPFIs came in and said we had a core inspection program for fire
     inspection, but it didn't have the technical depth we think we needed to
      have, and they came up with results using a different approach.
         If, in fact, we move on into the NFPA kind of regime,
     certainly this technical support that you need in order to carry out an
     inspection of what they're doing in fire protection becomes much more of
     an effort than going through and inspecting against classic
     deterministic criteria.
         So as I see it, Wayne's role in your organization is growing
      in this environment, and it's not a question of whether to have it -- it
      won't work, you can't get enough people if you put them all at the
     sites.  You've got to have some sort of a matrix where they can draw
     upon them as they need them, because each plant is going to have a
     different issue in this area.
         MR. MILLER:  Dr. Powers, the record is replete with examples
     of where the specialist inspector arriving at a site sees something that
     -- you can't expect, in a sense, the resident to understand.  The
     residents have got an incredible perspective of many things that happen
     on-site, but the plants are too complex.  The issues are too complex to
     expect that they ever grasp all of them.
         Just last week, at Millstone, we had an issue arise
     regarding a recirc, one of the new systems that was installed up there,
     and I think that came out of the DRS inspector's finding.  And, again,
     we could regale you with examples, but I -- and I agree with your point,
     that not only looking backwards, but looking on a going forward basis is
     going to be critical to have this kind of an inspection capability.
         DR. POWERS:  One of the areas we're going to be particularly
     interested in, Wayne, when you discuss, especially when your SRAs
     present their program, it's going to be a question of do we have the
     tools we need to do the job we want to do and is there a need for the
      research organization to refocus its efforts in providing a different
     set of tools in this environment than what you have.
         That question -- you can just take it as underlying
     everything we ask, that we're very, very interested in whether the
     regions have the tools they need to do the job.
         One of the things that we see continuously at the sites that
     we visit is, in a deregulated environment, the licensees are bringing in
     lots of tools to help them do their job more efficiently.  And we're
     asking, are you guys getting the kinds of support you need from the
     research organizations or whomever to give you the tools to get the job
     done the way you want to do it.
         I'm perfectly content that you can get the job done, but are
     you doing the job the way you would like to see it done, the way you'd
     like to see it done maybe not this year, but, say, five years from now.
         MR. LANNING:  I would agree and we will talk to that in
     detail.  In fact, we're going to talk about how we use some of those
     tools.  We'll discuss these activities.
         MR. MILLER:  I guess before -- and we will address that
     throughout the day.  I'm trying to keep a list of things that, as we go
     through the day, I trust that we will be able to address this question
     and your earlier question regarding change management, which I just
     touched on.
         I think today, as I understand our agenda, we will be
     addressing that.
         DR. POWERS:  Good.
         MR. MILLER:  The last thing I will mention, though, before I
     move on from this broad overview of the region and the organization, is
     that I have worked in all of the headquarters offices.  I spent many
     years in waste management, before the ACRS, and putting in place waste
     management rules within Region III and out here, and there has never --
     I've never been at a place that has a higher caliber staff than this
     region.
         I remember Joe Callan, when I first came to Region I, said
     he envied me for the quality of staff, and I've not been disappointed in
     that.
         The other thing I'll mention is we have worked hard to --
     and I've tried to bring this out a little bit in making the
     introductions here.  We've got a cross-section of people who have been
     not just in Region I, fixed in one spot, but that have been -- have
     moved around.
         Randy is an example of the system.  He was Deputy Division
     Director.  He was for years in projects, but then was the Deputy
     Division Director in Reactor Safety and went and managed the Materials
     Division for about a year and a half or two years, and now is Director
     of Projects.
         Wayne was the head of the Inspection Directorate on
     Millstone for a period of time.
         So I think we are strengthened by this kind of rotation. 
     Licensees do it and they've gained immeasurably from it.  We've tried to
     do the same thing here to avoid a parochial kind of narrow view of
     things and I think it has paid off, and it would pay off a lot as we go
     forward and continue to struggle with making change.
         DR. POWERS:  You remind me that Joe Callan not only told you
     of the technical excellence of the Region I personnel, but he also told
      the ACRS, when he came to see us just after he had come back to
      headquarters, that he was leaving an awfully good organization.
         MR. MILLER:  Well, you judge today.  I can tell you this,
     but you can judge today.
         DR. POWERS:  Well, we've been told before, so.
         DR. LARKINS:  I have a quick question.  You had a
     significant turnover in the resident program.  What about in the Reactor
     Safety Division?  Has it been pretty stable?
          MR. LANNING:  We've had a similar turnover, for we do some of the
      training of new inspectors, and, of course, some of those inspectors go
     on to become resident inspectors.  So they transfer over to DRP.  We
     also provide a number of residents and staff for other regions and
     headquarters, for that matter.
         It really gets back to the point about the amount of hiring
     that we've done, where those vacancies have occurred.
         MR. MILLER:  We view that somewhat as a fact of life.  There
     is a huge domino effect, of course, associated with reactor -- with
      resident inspector rotations.
         That has slowed down some.  It went from five to seven
     years.  It has changed a bit, but still, you know, in a region, you're
     always faced with having to fill positions, much of it -- much of the
     feed, if you will, or the reservoir of talent comes from DRS.
         There is less turnover, though, over the long run.  Because
     you've got the senior people in DRS, it tends to be a more stable
     organization than with DRP, and that's not surprising.  It's a
     region-based job and there isn't a driver that there is on the resident
     side.
         But we'll talk a bit more about what we face with respect to
     staffing when Jim Joyner comes on.  It's one of the later presentations
     this morning.
         DR. POWERS:  As long as you're writing your list, we talked
     earlier that we would be very interested in understanding what the
     perspectives are on the impacts of the new cornerstone inspection
     program as far as inspector hours in the plant versus inspectors
     spending time working paperwork, administrative burden on inspectors.
         It may be too early to comment, but speculation is welcome.
         MR. MILLER:  We, in fact, have that as one of the issues we
     can talk about.
         DR. POWERS:  Good.
         MR. MILLER:  With that, Randy?
         MR. BLOUGH:  We will talk some about the admin burden on
     inspectors.  But part of what the process is going to be in this pilot
     program is to sort out what's a startup cost, which is going to be --
         DR. POWERS:  Sure.
         MR. BLOUGH:  That's part of it.
         DR. POWERS:  I mean, one of the questions we have is whether
     the six-month pilot is long enough or should we, in fact, be building a
     pilot basis for an entire cycle or some other measure of time, because
     to get the kind of information that we're looking for, one of the
     questions is what are the kinds of information that we're looking for,
      and to the extent you can tell me what information you think we
      should be looking for from the pilot, that would be very useful to
     get that insight, because, quite frankly, we're not getting it out of
     the program office itself.
         MR. BLOUGH:  Okay.  And we are aware of your June 10 letter
     commenting on the program and what we hear from NRR is they're working
     on the response.  They didn't tell me when it would be, but I think --
     so we won't be answering the questions, but you'll be getting insight
     from the interaction with us and we'll --
         DR. POWERS:  It's an insight sort of thing.
          MR. BLOUGH:  The way we're approaching it is to answer every
      question when we get there.
         DR. POWERS:  Okay.
         MR. BLOUGH:  I'm going to launch now into a discussion of
     regional programs and policies, and it will focus on efforts to assure
     consistency among the regions and consistency with the program policy.
         Now, if you want to call for a break at any time, go ahead.
         DR. POWERS:  I think my chief of protocol tells me that I
     have to keep on the existing schedule, so I think I will call a break at
     this point.  One of those rules of running the meeting.  So we will
     break for 25 minutes, I think.
         MR. SINGH:  No, we can make it 15.
         DR. POWERS:  I can do it for 15?
         MR. SINGH:  Yes.
         DR. POWERS:  Then I call for a break for 15 minutes.
         [Recess.]
         DR. POWERS:  Let's come back into session.  Here, we are up
     to the presentation by Randy Blough.  So we'll let you pick up where you
     left off.
         MR. BLOUGH:  Okay.  Thank you.  As I mentioned, we're going
     to discuss regional programs and policies, with an emphasis on measures
     to ensure consistency with program guidance across the NRC.
          In addition to the items mentioned on the slide there, we'll
      go right into a discussion of operator licensing under Part 55, which is
      responsive to your agenda.
         As Hub already mentioned, we are in a time of substantial
     change in the agency.  So as we talk through each of these areas, we'll
      really be talking about -- there will be some elements where we have
      already made recent changes from what we had done previously, areas where
     we have other changes in progress, some areas where we have interim
     processes, an example of that being the SALP process being suspended and
     we've got an enhanced PPR process that we're using in the interim,
     before the expected full implementation of the new oversight and
     assessment process.
         Also, there are elements where what we've done in the past
     will largely continue on into the future, with just minor improvements.
         So we'll try to make those clear as we go along, which we're
     talking to, which is undergoing change, which is a future change that
     we're working towards.  But, please, help us as you need to get it clear
     from us.
         Next slide, please.
         This slide really just lists the areas where regional
     consistency is desired and really it boils down to most things we do. 
     As a regulator, consistency is part of the job and it's important.
         MR. MILLER:  This is one of the biggest issues that
      licensees have, the perception that things are different from region to
      region and that, even within a region, there's not consistency.  They're
      looking for us to be consistent on these things.
         MR. BLOUGH:  And endeavoring to do so adds complexity to
     what we do.  But it's important to us.
         Under the areas we mention on the slide there, when I list
     performance indicators, on this slide, I'm talking really about
     indicators of how NRC is doing and, of course, we have our operating
     plan metrics that flow really from the strategic plan to the agency's
     performance plan to the regional operating plan and then to our metrics
     that we keep track of, and we have some consistency through that,
     through the program -- working with the program office to come up with a
     regional operating plan and metrics.
         In the area of continued program development, we in the
     region just think it's vitally important that field experience be used
      and field insights, because of their benefit, be factored in fully to all
     these program development activities that are going on.  So we consider
     it important to be involved in all phases of program development.
         I should say that this is a good point to mention that there
     are great parallels between the reactor program and the materials
      program, that we're not talking about today.  There's an equal amount and
     nature of change ongoing there and the region has, for a number of
     years, been involved in basically all of that change.
          DR. POWERS:  I think in the coming years, we'll see some fairly
      dramatic changes taking place in the materials area and among the
      materials licensees.  Dr. Kress is our representative on a joint
      committee with the ACNW on the development of risk-informed regulation
      of NMSS activities and, I don't
     know, do you have any insights on where that train is now?  Is it still
     in the station or is it leaving the station?
         DR. KRESS:  Just leaving the station.
         DR. POWERS:  Just leaving the station.
         DR. KRESS:  It doesn't know where it's going yet.
         DR. POWERS:  I see.
          MR. MILLER:  In fact, that's a good example of an area where we
      have a person from our staff working with headquarters people on the
      definition of that project.
         DR. POWERS:  That will become a -- once you've assimilated
     all the change that's come through in the recent years, there is another
     bundle of it heading down your way.
         DR. KRESS:  This is one of the significant differences with
     reactors.  There is a different definition of risk and who is at risk
     and what tools you have to deal with those things.  There is an
     incredible variety.  So it's much more complex.
         MR. BLOUGH:  On the next slide, I kind of talk about what it
     takes to continuously try to achieve consistency, and basically it boils
     down to having good guidance and then adding onto that a whole lot of
     communication and coordination.
         This slide mentions program guidance and detailed inspection
     procedures, and I've already mentioned that there are a lot of other
     areas of program guidance, including our strategic plan and our
     operating plan.
         We have detailed inspection procedures and detailed program
     guidance and they're undergoing change in lots of cases and they require
      a lot of work to make sure that it's good guidance.  We need to
      actually get better in the future at continuously improving that
     guidance as we gain experience, so that we don't create large deltas in
     the future between where we could be and where we are in terms of
     program guidance.
         I've listed some areas of inter-regional communications. 
     Certainly, it goes on, to a degree, at all levels.  I have them listed
     on the slide.  The EDO has his weekly staff meeting that involves all
     the regions, as well.  There is a senior management meeting, that's a
      very important piece of senior level coordination, and regional
      administrators meet amongst themselves and with NRR in conjunction with,
      and at other times besides, the senior management meeting.
         At the division director level, we have frequent
     communications and counterpart meetings, and we also have other areas
     of inter-regional communication going on, primarily in specialty areas.
         All these areas of communication are being enhanced during
     the new program development and the pilot.  We'll talk later about the
     transition task force weekly conference calls that occur.
         The pilot program conferences will be held monthly by
     teleconference and will involve all the resident sites and regions
     involved in the pilot.  During the pilot, we'll also have division
     director counterpart meetings; we're planning those about monthly
     and those will begin July 16.  So there is an enhanced level of
     communications during the pilot.
         DR. POWERS:  I am very interested in this entry under
     inter-regional communications called peer level, where you have SRI and
     resident inspectors.  What do you have for residents from different
     regions getting together and comparing notes, I guess is the question
     I'm asking.
         I come from a background in the DOE reactor communities
     where we found an especially effective program for operating
     aluminum-fueled reactors was simply to get people together at every
     level of the organization, not just the top guys, but the guys that did
     the maintenance, guys that operated the plants, to compare notes on what
     they were doing on a regular basis.
         It brought everybody up and it became major competition to
     see who could find the best ways to do things and share it with their
     peers.
         I wonder, do you have that sort of thing for your residents?
         MR. BLOUGH:  Primarily, at the resident level, the
     inter-regional coordination that goes directly that way is networking
     type coordination.  Our counterpart meetings that we hold with the
     residents are basically region-focused.  So we bring in all our
     inspectors, with some knowledge of what's going on in the other regions.
         So we do tend to use really the management organization more
     for the inter-regional coordination than the staff level.
         DR. POWERS:  So for the residents, you're really looking at
     intra-regional uniformity and depending on the management for the
     inter-regional uniformity.
         MR. BLOUGH:  The peer level between regions tends to be more
     just the individual networking or the inspector and their supervisor
     watching what's going on in the agency and contacting the other regions.
         We have -- not in the recent past, but in the more distant
     past, we have looked at and done, though very infrequently, a broader
     counterpart meeting of the residents.
         MR. MILLER:  Dr. Powers, there was one, I think, about two
     years ago, an all-agency resident inspector meeting.  We made some
     judgments about what kind of value was there in that.
         I mean, you can talk about a number of things being done. 
     Much of our struggle is getting consistency.  There are two things you
     can talk about being done at these meetings.
         First of all, you can talk about here is the guidance, here
     is what our expectations are, have discussion/interaction on that.  The
     second thing is a little bit along the lines of what you're talking
     about, what does it take to come up with an insightful finding
     and what techniques do you use within the broad guidance to come to
     meaningful insights, and that's the kind of inventive part that you
     might say getting people together helps on.
         I think there is an enormous amount that comes out of our
     resident inspector seminars, of which we have two a year, at a region
     level.  And I'll be honest with you, I don't think that the benefit of
     getting all four regions together at the same time, on that second
     point, in terms of people sharing ideas or sort of unique ways to come
     to an insight, really warrants those all-agency meetings.
         That's an opinion, of course, but much of our focus, again,
     as I say, in these sessions, is to just get the basic information out to
     the people so that they understand, hearing, all at the same time, the
     same things, and we make great effort to bring into those meetings Sam
     Collins and other people who are responsible for the programs, so that
     what Region I is hearing is the same as Region III and so on.
         MR. BLOUGH:  That's a good point.  When we do the regional
     seminars, we're sure to have program office participation in those. 
     It's very consistent from one to the other.
         MR. MILLER:  Just one other thing that I'll mention that
     I've seen, and, Randy, you may have seen it, as well.  I think that --
     I've been impressed with how much my Calvert Cliffs inspector, for
     example, who is out there on a CE plant, talks to his counterparts at
     Palo Verde and Palisades, and how much time my TMI inspector spends
     talking --
         DR. POWERS:  To Davis-Besse.
         MR. MILLER:  -- to Davis-Besse and with Rancho Seco.  I
     mean, you see a lot of that.
         DR. POWERS:  Interesting.
         MR. BLOUGH:  Now, for the development of the new program and
     work on the pilot, it's different and it is more an agency-wide
     coordination.  For development of -- for program development, in order
     to be able to watch the plants, the only way we've been able to do this
     is really put people on program development full-time.
         So in that case, inspectors and supervisors came together
     from all the regions into task groups, and part of the region person's
     job would be to use his or her experience and also network with the
     region, as necessary, to support what the agency has done.
         I've already mentioned that during the -- well, I haven't
     mentioned the training.  The training for the people to be involved in
     the pilot has been inter-regional training, and the conference calls
     that we'll do during the pilot involve the resident sites.  But, of
     course, it's a more manageable group, because we're talking eight sites,
     nine facilities.
         DR. POWERS:  Sure.
         MR. LANNING:  If I might jump in here, Randy.  We also get
     good cross-fertilization among the regions in our exchange of inspectors
     on team inspections, for example.  That brings different perspectives.
         DR. POWERS:  The networks get created that you're talking
     about.
         MR. BLOUGH:  You mentioned that there's some duality in
     these slides, whether you call a certain aspect inter-regional
     communications or inter-divisional, and that's the same with my next
     slide, if you'll bring that up.  There is some overlap between what you
     call inter-divisional or agency-wide communication.
         In the region here, we do a lot to coordinate amongst
     ourselves with the program office and Wayne and I have a morning reactor
     oversight coordination meeting, that's our 8:00 a.m. meeting, that
     involves the NRR on the phone, as well, and the EDO office is usually
     represented.
         We've been working to improve that meeting.  In fact, we
     just revamped the way we do it a little bit, and, along the lines of
     continuous learning and lessons learned, we've instituted a weekly
     critique now of how we're doing with that meeting and that sort of
     thing.
         We've actually just had the first weekly critique this
     morning.
         DR. POWERS:  What did you conclude?
         MR. BLOUGH:  Well, mostly what we concluded is we just
     started with the improved guidance on Monday, so some of the things that
     are in there we're not doing yet.  So the critique focuses on where
     we're trying to get with this new -- you know, relatively little change,
     but hopefully important, that we made to our morning meeting.
         DR. POWERS:  Let me encourage you, in your critique, not to
     be -- not to hesitate to pat yourself on the back, too.  You don't want
     to forget the good things you're doing.
         MR. MILLER:  But that's an example of the earlier question
     you asked about self-assessment and critique; that's the kind of
     approach that we try to take.
         MR. BLOUGH:  And we have had a fair amount of rotation
     amongst the divisions and also we use regional staff rotation to
     headquarters and back as a way of keeping tied in with -- getting really
     tied in with where the agency is at a certain point in time and bringing
     back other perspectives.
         We've already mentioned the regional inspector seminars, to
     some extent.  These are very important to us and these involve all the
     inspectors in the region.  And at times, we'll have separate sessions
     for materials and reactor folks.  At times, we have plenary sessions, as
     well, and they involve breakout groups and these are -- the staff has
     very much been involved in helping with the agenda and the planning for
     these, but they are also a good chance to bring in senior management
     guidance and to reiterate the messages that we're sending to our staff.
         DR. POWERS:  When do you hold your regional inspector
     seminars?
         MR. BLOUGH:  The last one was early May and it's usually
     late spring and in the fall.
         MR. JOYNER:  The next one is November 30, December 1 and 2.
         MR. BLOUGH:  They typically have been about three days.
         We're also involved in industry workshops; we do a fair
     amount of that, and not only participation, but presentations by
     NRC managers, as well.  It's a way of just staying connected.
         This last bullet is also an inter-regional or agency-wide
     element of coordination.  The specialists in most cases have periodic
     conference calls that involve all regions and headquarters and they're
     pretty good.  They have agendas and anyone can bring an issue forward
     and they try to decide what the right approach is, and if it needs
     study, it gets staff help with that.  If it's something that everyone
     agrees on, it can be handled there.
         And recently, the NRR Inspection Program Office is
     also planning to be involved in these conference calls, in addition to
     the technical staff, because these calls are tending to be focused on
     new program development, and especially the pilot.
         Other items not mentioned on the slide that really tend to
     keep us working together are our assessment process and our inspection
     planning process that Wayne and Jack will be talking about.  In Region
     I, we use integrated inspection reports, for the most part here.  At a
     given site, for a period of time, all the inspection effort, resident
     and specialist, will be in one integrated report and that's a
     significant element of regional cooperation, because that's kind of
     developing our product.
         Other things you'll hear about are the allegation panels and
     the enforcement panels, which are not only inter-divisional, but involve
     the Regional Administrator staff and extensive involvement from
     headquarters.  So these are inter-regional, as well as inter-divisional
     methods of trying to cooperate, communicate and stay on the same page,
     to be consistent.
         So all this is a lot of work.
         MR. SINGH:  In your seminars, do you include also the
     regional inspectors from DRS or just the DRP resident inspectors?
         MR. BLOUGH:  It's the whole region.
         MR. SINGH:  The whole region.
         MR. BLOUGH:  And we may take an opportunity during the
     seminar to have branch meetings in DRP and then DRS will do the same
     thing, but it's a matter of convenience of having everyone there at the
     same time.
         We'll have breakout groups on specific topics and typically
     when we do those, those will be DRS and DRP inspectors together.  And
     most of the reactor agenda, virtually all of the rest of the reactor
     agenda is joint DRP and DRS, because it's really -- it's so close, the
     jobs are so closely linked, it's just the way it goes.
         What we do is we do separate out the materials inspectors
     for their own portion of the agenda, but even in some cases, when we're
     talking about where the region is going or --
         DR. POWERS:  Two years ago, we made this --
         MR. SINGH:  That's the reason I asked that question.
         DR. POWERS:  It used to be just the resident inspectors and
     this was just the wrong approach.
         MR. SINGH:  Right.  Because when I was an inspector in
     Region IV, Sam Collins started the regional inspectors DRS
     counterpart meeting.  And then also, as you know, they always have a
     resident inspectors counterpart meeting.
         DR. POWERS:  And I think the point that Randy made about
     integrated reports is also an important one.  It used to be that we
     would write individual reports and we left it up to the licensee to
     integrate them.  It was a little bit almost for internal convenience, it
     seemed.
         It's a much more difficult thing for us to do it this way. 
     It takes a lot of effort and we've been trying to broker the different
     views and the like among them.  But we've come up with a product that
     has a better overall perspective.
         MR. BLOUGH:  And there is a continuing mind set you need to
     be in to make sure you're trying to integrate.  One of the issues we're
     dealing with now in the new program is where we train all the inspectors
     at once, the regional and the resident inspectors, and, of course, we're
     trying to push to make sure that we do it that way when we go to the
     full-scale training of everything, that it's all done together, because
     of the synergy you get.
         I'm spending way too much time here, I think.  The next --
         DR. POWERS:  This is a relatively important issue.
         MR. BLOUGH:  The next slide is really an over-simplified
     flowchart of the inspection and enforcement process.  It's really
     written for the old program, but it's so generic, it almost applies to
     the new program.  It doesn't show where the PIs come in and, of course,
     with the new program, as well, enforcement won't be on its own track. 
     We're endeavoring to make it closely integrated with the assessment
     process.  So they would be halves of the same block, really.
         So the elements of how we do each of these would change with
     the new program, but this is what we do in inspection assessment.  We're
     going to talk about all the elements, except we don't have agenda items
     specifically on documentation at this point, but we can answer questions
     about that, and it comes up again this afternoon when we talk about the
     new program.
         So with this simple flowchart, we'll start to get into the
     complexities of it.  Wayne Lanning will talk about planning and
     inspection, and I think he'll show you a flowchart that breaks down just
     the planning block to start to show some of the complexity involved.
         Are there any other questions for me before I turn it over
     to Wayne to talk about planning and inspection?
         DR. POWERS:  I don't see any more questions.  I'm going to
     question Wayne.  I just can't imagine a more difficult job than planning
     and inspection.  I find that when I do site inspections and whatnot, and
     I try to diligently plan, it's impossible.
         So if you can give me some insights on how to do my own
     planning, I'd sure appreciate it.
         MR. LANNING:  That's a great introduction.  It's certainly
     fundamental to what we do.
         DR. POWERS:  Yes.
         MR. LANNING:  And it's fundamental for ensuring success.
         DR. POWERS:  You're not going to find what you're looking
     for if you don't know what you're looking for.  I mean, I just can't
     imagine you being able to stumble across things by accident.
         MR. LANNING:  Right.  But inspection planning remains a
     challenge to us, because of this complexity, not to mention the demands
     that are placed on us to schedule inspections.
         This is certainly one area that the regions do differently. 
     I want to spend just a little time to give you an appreciation of the
     complexity of inspection planning by discussing some of the numerous
     parameters that affect what we inspect and where we inspect and when we
     do it.
         There are 17 sites with 26 operating units in this region. 
     The performance and issues are different for each site and they're even
     different sometimes between units at the same site.
         DR. POWERS:  A very different world.
         MR. LANNING:  Millstone and Hope Creek are examples of that.
         MR. MILLER:  And Nine Mile.
         DR. POWERS:  Nine Mile.
         MR. LANNING:  And Nine Mile is a really good example. 
     Inspections and the allocation of inspection resources are based on
     licensee performance and performance changes, which changes our strategy
     for maintaining reactor safety.
         Some inspections must be completed during outages and
     outages change without prior notice.  Even after identifying an
     inspection area, there can be demands and constraints regarding
     inspectors.  Inspectors are generally specialists, which impacts our
     ability to inspect the same issue at the sites.
         To further complicate the picture, operational events and
     other emerging issues result in unplanned reactive inspections and
     reactive inspections are the enemy of good inspection plans.
         Allegation follow-up is integrated into our ongoing
     inspections.  This region has historically received a large number of
     allegations, and the area or the significance of those is not known at
     the time of the inspection planning process.
         Other on-site activities affect the licensee's ability to
     support our inspections.  So, therefore, we have to be cognizant of INPO
     and other organizational influences and impacts on licensees.
         Of course, when NRR identifies a specific area for
     inspection, like Y2K or 50.54(f) inspections, or requests support, as
     they did in support of the new inspection process, inspection planning
     becomes unplanned.  So we have to go back.
         Since we have limited resources, effective utilization of
     our staff is necessary to accomplish our mission, which dictates that we
     plan, adjust, plan, and readjust based on these parameters that
     influence what we do.
         We have initiated an effort to develop an informal --
         MR. MILLER:  Let me make one additional comment.  We have a
     policy of notifying the licensee how many months in advance.
         MR. LANNING:  Thirty days for a team.
         MR. MILLER:  Thirty days for a normal inspection.
         MR. MARR:  Six months.
         MR. MILLER:  Six months.  We've made a commitment to tell
     licensees we're going to -- so with all of what Wayne talked about,
     there is still this obligation for us to give advance notice.  So that
     is also a significant issue; that when there is change, the
     ramifications of it are significant with respect to that need to notify.
         I'm sorry.  Go ahead.
         MR. LANNING:  We've initiated an effort on a formal
     inspection planning process, with a goal of improving the utilization
     of our staff.  We'll go to the next slide, but we'll come back to this
     slide in a few minutes.
         Could I have the next slide?
         This slide illustrates the process that we're currently
     using to plan inspections.  Now, this is going to change when we do the
     new assessment process.  So we're sort of in transition here.
         But nevertheless, this has worked pretty well for us so far. 
     So I don't want to go into a lot of detail on this chart, but I do want
     to hit some of the important blocks in this flowchart.
         Starting at the top, with the strategy, DRP initiated an
     effort to develop an inspection strategy for each site based on site
     performance.  These strategies have several applications, but I'm going
     to just limit it to the fact that these strategies provide input into
     the inspection planning process for a particular site.
         DR. POWERS:  Can you give us an example of what a strategy
     sounds like?
         MR. LANNING:  A strategy sounds like a theme that addresses
     performance at a particular site.  For example, maybe it's corrective
     actions.  There's content on how we go about addressing that in our
     inspection process, and we plan to follow up on corrective actions, for
     example.
         DR. POWERS:  When I read the PPR letters, do I get an idea
     of what the strategy is going to be?
         MR. LANNING:  You should be pretty close to what the
     strategy should be.
         DR. POWERS:  Okay.  I really admired the PPR letter on
     Susquehanna.  It gave me a good insight on that plant.
         MR. MILLER:  When you read the PPR letter, you will see
     areas of emphasis.
         DR. POWERS:  Right.
         MR. MILLER:  We try to be real clear with our licensees,
     that even within the core program -- in other words, it's not a
     straightjacket.  Within an area, there's a lot of judgment that still
     has to be used with respect to what you're going to emphasize and you
     try to identify those things.  That's what you see in the strategies
     that Randy and his people are working with DRS to develop for each site.
         So there should be a very close connection between the PPR
     letters and our strategies; they should map one-to-one.
         DR. POWERS:  Good.
         DR. UHRIG:  This is primarily for the routine inspections. 
     Is it not a response to some development, Millstone being a classical
     example, where you had problems?
         MR. MILLER:  Yes.  This is at the level of plant-by-plant.
     Another example would be that there has been a pattern of personnel
     errors at the site.  So we, in our inspections, are going to be
     emphasizing that.
         This is partly a tool to get us all on the same page within
     the region.  The resident inspectors often will know this, but how about
     the people who are visiting?  Jit was a region-based inspector and he
     knows how difficult it is to come cold to a site.  So it's sort of
     giving a heads-up to everybody, that here's what is shown at the site.
         So we're talking about issues not of the sort that -- big
     picture, Millstone was allegations, but fairly specific issues.
         DR. UHRIG:  Do you inspect all units at the site at the same
     inspection?  For instance, if there is -- at Millstone, you had three
     varieties of reactors there.  Do you inspect all three facilities on a
     given inspection or do you do those individually when they're different
     types of reactors?
         MR. MILLER:  Well, for some issues, and Randy can
     contribute, but some issues are cross-cutting.  For those things that
     are common, yes.  But some things are not.
         MR. BLOUGH:  The answer is that really it depends.  Much of
     what we do is an inspection on a site-wide basis, but if there are --
         DR. UHRIG:  If there are identical units, then there is no
     problem.
         MR. BLOUGH:  But we look at outages and we look at
     modifications and we also look at the design differences.  Of course,
     Millstone 2 and 3 are each getting their own inspection program.  So
     they're the example where they're each getting their own inspection
     program and then you've got --
         DR. UHRIG:  But common management.
         MR. BLOUGH:  Pardon?
         DR. UHRIG:  Common management.
         MR. BLOUGH:  Right.
         DR. UHRIG:  Common overall operation.
         MR. BLOUGH:  Right.  Right.  So it depends and it varies,
     depending on what inspection you're talking about.  There are
     differences amongst the sites and the programs.
         DR. UHRIG:  I noticed that in the different branches here,
     with one exception, they have both PWRs and BWRs that they're assigned
     to.  Is this just sort of the -- so that each branch gets experience in
     both areas or is there some reason for this?
         MR. BLOUGH:  Right.  We have tried a number of different
     ways of breaking it out.  We try to have, to the extent we can now, the
     same licensee within the same branch.  That's more important to us than
     PWR versus BWR.  Years ago, we tried P and B, and at least in the
     projects end of things, it seemed that working more along licensee
     organization lines was more helpful to us.
         Then, of course, there is the workload issue, too.  So, in
     fact, we're struggling with that right now, because plant performance
     changes and everything changes.
         So how they should be lined up is a continuing issue.
         DR. UHRIG:  I was just curious.
         MR. MILLER:  Corbin McNeill, and this is an interesting
     point, recently, when Commissioner Merrifield and I were visiting Peach
     Bottom, we were quizzing -- Corbin McNeill is the Chairman of PECO and
     they're obviously very active participants in this business of growing
     nuclear organizations.
         When we quizzed him on how he would group the plants as they
     acquire more plants, his feeling was that the technology differences
     are far less important than the management part of it.  And so, for
     example, they're talking about forming pods, if you will, and I don't
     want to speak for PECO, but I think it's pretty clear that Peach Bottom,
     Limerick and TMI would form a pod.  You've got two boilers and a PWR.
         His sense was that the processes need to be common, the
     management approaches need to be common, and that that's far more
     important than are the differences, and I think we share that same view.
         DR. POWERS:  It's one of the things that, in the
     probabilistic world that we find ourselves in a lot, we're really
     wrestling with, because we have a feeling that you're right, that the
     management processes may well be more important to the safety of a
     plant than the hardware is, or at least have a role that ought to be
     reflected in assessing the risk.
         But we don't know how to do it right now and we don't even
     know whether we should be doing it.  But it's one that we just
     continuously get insights from people outside the PRA community telling
     us that this is -- management systems, management safety culture that
     gets created will have a bigger bearing on your results than whether the
     pipe is intact or not.
         MR. MILLER:  There is a lot of experience on this side of
     the table.  Does anyone else see it the same way?  Speak up.  This is an
     important issue here.
         Management is far more significant and important ultimately,
     within certain bounds.
         DR. POWERS:  Within certain bounds.
         MR. MILLER:  Than the technology.
         DR. POWERS:  It's one that we're really wrestling with in
     how we're guiding the tool that we're using to guide regulation, which
     is the risk assessment technologies.  Right now, you do not have an
     element coming into your risk assessment that says this management has a
     great safety culture.  There is no way to give them credit for that in
     the risk assessment, and we're struggling with why there isn't.
         MR. BLOUGH:  We can comment on that.  The corrective action
     programs, of course, are at the heart of what we're looking at in our
     inspections today and certainly in the future, and the corrective action
     process really is collectively all of those processes that a licensee
     has for managing a station.
         DR. POWERS:  There is no question that in the new approach
     that we're pilot testing now that corrective action is one of the most
     important elements of a site's program and that's why I admired your PPR
     letter, because it put a particular emphasis on the corrective action
     programs and how they were doing.
         And the corrective action programs have -- it's not a single
     program.  It has multiple elements in it and there's lots of subtleties
     in that, and it's very complicated.  You find corrective action
     programs that are real strong in one area and real weak in another,
     and they can't be weak in the other.
         It's really interesting.
         MR. LANNING:  If I can pick up and continue at the top of
     the chart here.  When we integrate the strategies with the core program,
     the core program is that set of inspections that we do at each site.  We
     come up with a proposed inspection plan and we try to incorporate as
     many of the variables as we can in developing this initial inspection
     plan.
         We then hold a plant performance review meeting, and I think
     Jack is going to talk to you in more detail about how that is done and
     the outcome from that, but as a result of the performance assessment
     that takes place during that meeting, we modify the inspection plan.
         Then we resource-load the inspection plan with inspectors
     and we use the reactor planning system, the RPS system for doing that. 
     That is becoming a very valuable tool for us in inspection planning. 
     It's a significant improvement over the old system we had, the MIPS,
     master inspection planning system.
         For example, this system can include inspectors' vacation
     time and training time, such that a branch chief can look at an
     individual inspector and see what impact redirecting him to some
     other assignment will have on his ability to reschedule that
     inspection.
         DR. LARKINS:  Is that automatically handled or is it
     something you still have to do?
         MR. LANNING:  We still have to do it manually.  This is a
     good scheduling tool.
         DR. POWERS:  I promised to ask this question, and I'm going
     to start fulfilling my promises.  Is it good enough of a tool or would
     you like a better tool?
         MR. LANNING:  We can always wish for something better,
     different ways of cutting the data, different charts, different reports,
     but I think currently we're gaining more experience with it, we're
     really utilizing it in our day-to-day activities.  For example, every
     week, we discuss inspection planning and what's on tap this week.
         So I think it's coming into its own and I think it's a
     pretty good tool so far.
         DR. POWERS:  Understand, my question is not today, but in
     2004, is it going to be a good tool or should we have a better tool?
         MR. LANNING:  See, that's a tough one, because of the new
     assessment process.
         DR. POWERS:  You don't know, that's right.
         MR. LANNING:  I don't know.  I can guess, but I don't know
     yet.
         MR. BLOUGH:  One of the things we need to consider with this
     whole process is are we going to continue to tweak on it or are we going
     to do something a little more revolutionary, like take a lesson from
     licensees or even NRR now and go to a work control center concept.
         You mentioned at the start the pace of change and keeping
     change on top of change.  These are -- I guess I'm just saying these are
     things we need to consider.
         DR. POWERS:  Sure.
         MR. BLOUGH:  And maybe there is a revolutionary way.
         MR. MILLER:  There has been a real -- it's a fetish of mine
     and it's easy to talk about the concept, but it's very hard to do.  I
     think that there is almost a perfect analogy, and we talked a lot about
     this internally, and I'm always agitating more and more on this and
     saying it's easy to sit back and say conceptually this is what this
     process ought to do, but I think it's an exact analogy of licensees who
     are dealing with a lot of corrective maintenance, involving numerous
     different groups, mechanical engineering, electrical engineering, having
     to have scaffolders, having QC support, multiple jobs of different scope
     and reactor work on top of it.
         And this is the one area -- if you look at the improvement
     in the industry, more than anything else, improvement in the industry
     has resulted from better planning.  We're trying to steal those concepts
     and apply them here.  I would say we're just in the early stages and
     it's hard for us to comment on whether this tool is good enough or not.
         DR. POWERS:  I applaud you for stealing that, because I
     think there is an exact analogy, just as you point out, and this
     resource management issue is -- I'm not applying for your job, Wayne,
     don't worry about me.  It's a tough job.
         MR. BLOUGH:  We could do it together.  It is a tough task.
         MR. LANNING:  In spite of that, we develop detailed
     inspection schedules, we get them into RPS, and then what comes along is
     reactive inspections.
         And reactive inspection is going to take you up the
     right-hand side of that chart.  Several things comprise reactive
     inspections: operational events certainly do, emerging issues do.  As a
     result of our morning meetings and plant operating experience, we do
     apply a risk-informed methodology to deciding which of those to
     include in our inspection program and what actions are appropriate.
         Of course, I've already talked about allegations.  But all
     those things come in there as reactive inspections, which impact the
     detailed inspection schedule that we just got put into the RPS.
         So weekly, we review the inspection schedule and what
     inspections are planned for the sites, sort of a way of accommodating
     reactive inspections and knowing that some of those inspections that we
     had planned are going to be deferred and some are going to be completed.
         But once we get through that process, that feeds back into
     updating the strategies for each site.  We do this loop every six
     months.  We update the strategies quarterly.  So at any given time, we
     have a pretty good understanding of what the inspection plan is for a
     particular site.
         Now, we're still improving this process.  I'm going to go
     back to the last slide, please.
         One of the things that we're doing is benchmarking our
     process against industry practices, and Larry, the branch chief, is
     going to spend the whole day at Peach Bottom, just trying to
     understand their work control process and work planning.
         So one of the options we're considering, as Randy indicated,
     is a work control center.  Licensees generally have a pretty effective
     process for planning and doing work.  So we're really evaluating to see
     if there are some benefits that we can use based on their system.
         DR. POWERS:  My impression, from this visit to Susquehanna,
     is they continue to struggle with this particular item as well.  I mean,
     this is a big item for them.
         MR. LANNING:  I think that's true for most plants, but some
     just improve more than others.  But it's something they have to do,
     also.
         MR. MILLER:  You mentioned the corrective action programs at
     the plants having many elements.  There is no more important element
     than this element.  This is on the -- you talk about identifying
     problems and root cause.
         There is the corrective action part, which is getting work
     done.  That's why I say I don't think it's overstating it to say that
     some of the major reasons why there's been improvement in this industry
     is better performance in this arena.
         We, as an agency, have lagged behind.
         DR. POWERS:  Yes.
         MR. MILLER:  And we, this region, I think, are leaders on
     this.  I think we're trying to come to it now.
         DR. LARKINS:  Let me ask sort of a follow-on question. 
     Yesterday, we heard about a lot of increases in the amount of
     maintenance being performed on-line and planning for this.
         Do you fold any of this into RPS or is this something that
     you would expect a resident to pick up on in terms of looking --
         MR. LANNING:  Well, the resident is in the first position to
     pick up on that, and, of course, I'm sure you recognize that different
     licensees handle that differently, and you probably know that that's one
     of the forthcoming changes to the maintenance rule, how licensees
     address that maintenance, or LCO-type maintenance.
         So really the residents are closest to it.  It's really
     within their scope to follow up on and accomplish.
         DR. LARKINS:  One of the issues the Commission is struggling
     with is looking at configuration control for maintenance and assessment
     of the risk of the various configurations, and this is something that
     you're going to be picking up as part of the inspection program or is it
     just part of the normal routine for the resident or the SRI?
         MR. LANNING:  Well, we pick up some of it in terms of
     planning.  But I think generally speaking, the resident is the critical
     point for following those and they'll consult with the SRAs as needed or
     the licensee's own risk monitor process, depending on what they have in
     the IPE, that sort of thing.
         MR. BLOUGH:  And we do find -- we did find issues in that
     regard.  We're in the pilot program at Hope Creek and one of the things
     the inspector found at Hope Creek recently, since we started the pilot,
     is they were doing some on-line maintenance and it was -- service water
     is fairly high risk, and they had in their process these barriers that
     they would put in place before having anything else done.
         I guess the branch chief is not here.  And they didn't
     actually violate the barriers, but they didn't really formally put the
     barriers in place either.  So that's the type of thing that sometimes
     the residents come up with.
         So it's really -- and, of course, this case, as the pilot
     program showed, gets us into a whole bunch of interesting questions
     about how a finding like this fits into the pilot program.
         DR. LARKINS:  Exactly.  There are new issues.  The people at
     Susquehanna were talking about doing, potentially doing some
     risk-significant maintenance for a short period of time, as long as it
     averages out to be -- not to increase the total risk or the annualized
     risk for that plant over a year.
         I didn't know if this was something you were looking at,
     risk-significant maintenance.
         MR. MILLER:  Every licensee out there has a risk monitor, to
     my understanding.  Some are more robust than others.  But there is no
     one that I know of that is not monitoring risk explicitly when they plan
     to do maintenance.
         We've found problems and we've seen instances where they've
     not gotten into serious trouble, but where they've deviated a little bit
     here and there.  So we're kind of watching them.  But all of them have a
     program.
         DR. LARKINS:  This sort of goes back to Dana's question
     about having the tools also to look at what the licensee is doing and to
     be able to do your own independent assessment.  As they go through these
     on-line maintenance operations and the risk significance of these
     operations, are you able to do your own independent --
         MR. LANNING:  Let me defer that to Tom's talk.  That's a
     good issue.
         Okay.  Let me try to speed up just a little bit.  I've
     already talked about the implementation of RPS and weekly planning
     coordination meetings.
         Let's go to the next slide.
         Before each inspection, the inspector prepares an inspection
     plan.  This plan is reviewed and discussed with his or her supervisor
     and also discussed with the DRP branch chief.
         This helps to ensure some consistency among sites and ensure
     that the site strategy, allegations, and other emerging issues are
     considered at the time this inspector goes to the site.
         In addition, the results from these inspections are
     concurred in or issued by the DRP branch chief.
         As Randy indicated earlier, we issue integrated inspection
     reports and, of course, the purpose of that is to ensure that our
     assessment is consistent and integrated.
         Region-based inspectors provide their inspection results to
     DRP, who, in turn, integrate that into the integrated inspection report. 
     On the average, we issue an integrated inspection report about every six
     weeks.
         I've already covered reactive inspections and how they
     impact our planning process.  Now, I would only add that reactive
     inspections are probably the most important inspections that we do to
     maintain safety and obtain performance insights at operating plants.
         Because engineering is so important to safety and the
     assessment of licensee performance, we have initiated a lead engineer
     concept within the Division of Reactor Safety.  This is a senior reactor
     engineer who has the additional responsibility of being cognizant of
     engineering programs, engineering performance, and issues that affect
     their site.
         He consults with the resident staff and back in the
     inspection planning discussion, he is key to proposing the inspection
     plan in the area of engineering.
         So this is a concept that we're still developing.  The ideal
     model would be to have the equivalent of a lead engineer to a resident
     inspector.  He would know all there is to know about engineering at that
     particular site.
         So that's still developing and we've still got some work to
     do there.
         DR. POWERS:  Do you have this lead concept going so that you
     would have a lead guy on maintenance, a lead guy on operations, a lead
     guy on support systems?  I mean, this kind of concept has worked well in
     areas of maintenance, I know, but I'm wondering if you've got other --
     if you've got bigger dreams here than just the engineering.  Engineering
     seems like a good place to start, because it's tough.
         MR. LANNING:  It's a tough area to start in, too.  I guess I
     haven't really progressed to that point yet.  Just implementing this
     lead engineering concept is difficult.  We're still fleshing that out in
     terms of what are the expectations, how does that individual know what
     goes on at the site, how does the integration of lead engineer with the
     residents and DRP staff, how is that accomplished.
         So I guess I really haven't thought about --
         DR. POWERS:  It's an interesting innovation.  It's one I'd
     like to see how --
         MR. MILLER:  We have done this for years in the HP area.
         DR. POWERS:  Sure, that's another specialized area.
         MR. MILLER:  And those folks have done a marvelous job of
     this.  There's been little turnover in that area and the difficulty with
     respect to engineering is that you have --
         DR. POWERS:  Have this big turnover.
         MR. MILLER:  Turnover was asked about earlier, the domino
     effect of residents leaving and all that.  So the ability to have the
     continuity is like somebody said about a system engineer, it takes a
     system engineer two years before they're making a little difference on a
     system, and it's a similar thing here.
         So we've had this concept for some time, but because of the
     turnover and the complexity of the engineering and the numerous -- EQ
     and the fire protection and design and mechanical, electrical, I&C and
     so on, it's harder.  So I think we're always going to be, in this area,
     honestly, approaching the model.  But we'll always be in that mode and
     getting the best of what we can from it.
         I've seen dividends paid from the lead engineers that we've
     had so far.  I think we're getting better integration of the pieces in
     the engineering area than we have in the past.  But are we there?  No. 
     We're not at that ideal spot where you have a quasi-resident inspector
     essentially in DRS for engineering.
         I think to go to maintenance and other areas, I don't think
     it's in the cards.  It's not practical.
         MR. BLOUGH:  But the issue is when the program is -- the
     smaller you get in the program -- EP and security are an example where
     there is a lead inspector assigned, but there are three inspectors for
     the region.  So there is not this opportunity.  Every inspector has a
     half or a third of the sites in the region.  So there is not really the
     opportunity of assigning a lead inspector to a smaller number of sites
     inspecting a broader scope, to get really the benefits that you
     can with a lead inspector.
         But I guess that's kind of my thinking.
         MR. LANNING:  Risk-informed decision-making is becoming
     routine in our daily regulatory activities, and I'm going to skip over
     that to save us some time, as Tom will talk about use of risk insights
     in decision-making later.
         The last bullet talks about significant inspector
     accomplishments.  One of the ways we judge inspection effectiveness is
     by the findings and accomplishments of our inspectors.
         We have a list, we update it quarterly.  I don't plan on
     going through this list, but I'll give you a copy of our significant
     accomplishments for the first quarter of this year.
         But I do want you to know that we're very effective in the
     inspection program.  Hub mentioned in his introductory comments specific
     plants where we have made a difference.  In addition, we have not
     imposed any unauthorized backfits to plants.  We just had an audit from
     our headquarters folks and they gave us pretty good marks in the way we
     handle backfits.
         On the next slide, I'll talk a little bit about how we ensure
     quality in our assessments.  We have already talked to you somewhat
     about the senior management supervisory visits to the sites,
     meeting licensee representatives and accompanying inspectors during the
     inspections.
         They also attend exit meetings.  This helps to ensure that
     whatever was said during the inspection is reflected in the inspection
     report accurately.
         These visits do provide perspectives on what gets tested
     during the plant performance reviews and help to ensure consistency
     among sites.
         To help ensure consistency among regions, and we've already
     talked about it somewhat, there are a number of counterpart meetings
     where we discuss and compare strategies and exchange information.
         We've already noted that there are counterpart meetings
     involving projects, operator safety, licensing, examiners, HPs and so
     forth.
         In addition, Randy already mentioned that the headquarters
     staff participates in our morning meeting and that's another way that we
     get some cross-fertilization.
         We've had a practice within the region for a number of years
     where the Deputy Regional Administrator provides an independent review
     of a sample of inspection reports, and this provides a good check on how
     well reports are written and how findings are characterized across site
     issues.
         Finally, one of the most important checks on implementation
     of the inspection program is feedback from licensees.  You've already
     heard that managers -- we make a special effort to solicit feedback from
     licensee staff or managers.
         DR. POWERS:  You can say, tell me how we've been doing, and
     they say, oh, well, just great, just great.  Are they forthcoming or are
     they suffering from the Towers-Perrin phenomenon?
         MR. LANNING:  No.  I think licensees provide candid
     criticism back to us.  I don't think that there is any hesitation on
     their part to share with us their views and observations.
         MR. MILLER:  Wayne is mostly right.  There are some -- there
     are always some reservations, naturally, and I think that's -- but I
     agree with Wayne.  For the most part, and a lot of it is in how you do
     it.  We're very careful to separate out from the meeting proper, from
     the site visit proper, an opportunity for the senior person to speak
     very candidly to us, in a small setting; this is what I see or don't
     see, and that's an obligation.  It's not a matter of discretion.
         On every visit, we have an obligation to seek out that
     feedback and to write it down.
         And we are very sensitive to do it in such a way as to
     recognize human nature and their reluctance to want to come out
     honestly.  For the people who are going to be overseeing that, it would
     be real --
         DR. POWERS:  You don't mind some reluctance, because you
     don't want quixotic or --
         MR. LANNING:  Gratuitous.
         DR. POWERS:  You want things that are thoughtful and
     reflective and whatnot and that are meaningful.  The type of thing like
     the inspector has bad breath or something like that, that's not of
     interest to you.  It's whether he's doing the job the way you want him
     to do it.  That's the question.
         MR. LANNING:  And this feedback is really not limited to
     feedback on how the region is doing, but also it's feedback to the
     headquarters staff.  So it touches a number of bases here and, again, as
     Hub indicated, we document all that and share that with our headquarters
     folks.
         So in the interest of time, let me conclude here and move on
     to enforcement.
         DR. POWERS:  Thank you, Wayne.  And I am interested in
     following up on how you're doing on your lead engineer concept.  It's an
     interesting idea.  So when you think you've got some insights, do let me
     know how it's doing.
         MR. LANNING:  I'll do that.
         MR. BLOUGH:  I wanted to make one other comment on licensee
     feedback.  It's something I have to work on as Projects Director,
     because I'm coordinating the assessments of these plants.  So I think
     you do have to work at it.
         I don't have statistics, but from my being out there, it
     seems that if a licensee feels they're doing well and is confident about
     how they're doing and where they're going, you get more feedback from
     those licensees than the -- you get more feedback on how the NRC is
     doing from those licensees than from the licensees that are struggling
     themselves.
         So that kind of gives me a sense that I have to work at
     making sure that I go about it so I will get feedback from all
     licensees.
         Thanks.
         MR. HOLODY:  Next slide.  I'm Dan Holody.  I'll talk a
     little bit about enforcement.  This is one area where the agency has
     traditionally taken a strong stance in terms of trying to foster
     uniformity.  It starts with the headquarters Office of Enforcement,
     which has been designated to provide that oversight.
         They provide detailed guidance in the Commission-approved
     policy, manuals, and guidance memoranda periodically to the staff.  We
     provide frequent training on the policy, and they have, also.  We also
     look to lessons learned.  Whenever there is a problem or a mistake, we
     try to get at it pretty aggressively and then communicate the results of
     that.
         You mentioned this particular issue in the allegations area. 
     We recently had an issue with sensitive information being released and
     we immediately did a lessons learned, communicated to the staff the
     results of that, and we also cover this on a more formal basis at the
     seminars that we talked about earlier.
         In the past year, with all the changes going on in the
     enforcement area, aimed at reducing unnecessary regulatory burdens and
     with the changes that will be forthcoming with the oversight process, OE
     has provided training on four occasions, either in person or by video,
     where the director or the deputy director has provided sessions to all
     the staff in all the regions.
         We have -- I, myself, have spent a rotation in the past year
     in OE.  It turned out my boss in OE at the time was Hub Miller, also,
     since he was out on rotation.
         DR. POWERS:  He told us he's been everywhere.
         MR. HOLODY:  We have daily communications on any escalated
     cases with the Office of Enforcement, both with the director and the
     staff.
         We have panels here on a weekly basis of all cases that
     could possibly be of an escalated nature.  At a designated time, we have
     participation from NRR and OGC and OE involved in those panels.
         We have weekly conference calls with OE and the other regions
     on enforcement issues; it's an enforcement staff conference call.  We
     have counterpart meetings on an annual basis where we get together for a
     couple days.
         Basically, we share the cases, we share any lessons, share
     any mistakes.  If we have a case that's pending, we look for anything of
     a similar nature that we may be unaware of that may occur in the other
     -- may have occurred in the other regions.
         DR. POWERS:  How do you do that?  That's the mental
     integration.  That's probably pretty easy for somebody experienced, like
     yourself, but is it easy for everybody to do and can you integrate far
     enough back and far enough afield to do a good job at that?
         MR. HOLODY:  I think a lot of it is, as you say, mental. 
     You remember a lot of what you've processed here in the region.  So if
     you've been in the position for a while, as I have, then that's an
     advantage.  But for somebody that's not in a position like that, you
     basically have to rely upon others.
         And going to the other regions, we are relying principally
     on their recall, recall of people in the office, but we also have an
     enforcement action tracking system, where all the enforcement cases
     going back to the late '70s are in there and we can do word searches and
     searches for cases that may be of a similar nature.  That system has
     been very helpful.
         We have also, in the past two or three years, placed all the
     cases on the web and that allows us to find these things and utilize
     them fairly quickly.
         There is also the added benefit -- you asked previously
     about licensees being candid with us.  They get to see these cases and
     they have not been fearful, in many cases, of pointing out some
     inconsistencies that they see.
         Enforcement action may be issued on a particular case where
     you called it a certain severity level and they come back and say, well,
     why at Davis-Besse and South Texas was this a different severity level.
         DR. POWERS:  Yes, I've seen some of those.
         MR. HOLODY:  So we do have some formal mechanisms, as well
     as the informals, to address that.
         We also have multi-office involvement.  I mentioned the number
     of people that are engaged in the process and the number of offices that
     are engaged.  OE concurs in or is part of every panel for every
     escalated action that we take.  They concur in most of them.  The more
     significant ones go higher up in the organization, and some of them go
     to the Commission.  I'll mention a little later on that we have issued
     18 civil penalties in the past two years, and they were involved in all
     those.  So they get to oversee it and look for any inconsistencies that
     may crop up.
         We do periodic assessments here in the regions.  Besides the
     lessons learned I talked about, my staff and I look at a number of the
     non-escalated cases that get issued.  The inspection reports get issued
     by Randy's or Wayne's division.  We look for any inconsistencies there,
     any areas where we're not following the policy, where we may have had a
     different -- may not have had the right take on it.
         We've done three of those in the past year, in addition to
     the self-assessments that we talked about.
         Headquarters also performs audits in this area, as they do
     in the allegation area.  We recently had a very good audit by NRR in how
     we're handling allegations here in Region I.
         What have the results been?  I mentioned the penalties.  Hub
     alluded to the IP 2.1.  We had a major civil penalty at Millstone in the
     last couple years.  We've taken action, in addition to penalties, for
     wrongdoing type areas.  We've been very aggressive in that stance. 
     We've issued orders to individuals that precluded them from involvement
     in the industry because of wrongdoing type issues.
         We had a manager at a particular site that was trying to
     cover up a falsification of a surveillance test.  There was conspiracy
     involved, and we issued an action in that.  We had a couple of
     contractors at another site who tried to avoid fitness-for-duty testing
     by circumventing the computer process -- I mean, the computer codes that
     they had access to.  So we took actions in those cases.
         At the same time, though, while continuing to look for
     problems at the sites -- and we're aggressive in this area, to uncover
     problems and take aggressive actions -- we've taken to heart the need to
     reduce unnecessary burdens.
         If you look at this last slide here, the little chart at the
     bottom, we have, as you may know, four severity levels of violations. 
     One to three are those that we consider escalated.  Level four are those
     that we consider non-escalated.
         In the last six months, we've issued notices of violation
     for only 12 percent of those level four type violations; 88 percent of
     them have been called non-cited violations.  They were addressed in the
     inspection report, but because of the licensee's efforts in correcting
     the problem and putting them into the corrective action program, we
     don't cite them for those and, therefore, don't require a response.
         That does relieve a significant burden, and the licensees
     have told us this.  Whenever we issue a notice, we normally require a
     response.  That then gets made a high priority immediately, regardless
     of the safety significance of the issue.
         Now, by issuing non-cited violations, based on their
     placement in the corrective action program, we allow the licensee to
     prioritize those issues.  We might find 15, 20, 30 violations in the
     course of a year.  They find a 1,000, 2,000 issues in the course of a
     year that get put into the program.  Instead of us prioritizing and
     forcing them to prioritize issues that are at a much higher scheme than
     maybe some of the things that are already in the system, we think we
     have reduced a lot of their burden.
         DR. POWERS:  We saw peculiarities and prioritization under
     the old scheme.  Licensees would jump on something that you had found
     and move it to high priority, higher than other things that they had
     found that were more important.
         Now, the new direction that they're getting is to start
     risk-informing enforcement.
         MR. HOLODY:  Yes.
         DR. POWERS:  One of the things that comes to mind, you find
     a violation of Appendix R, how do you risk-inform that?
         MR. HOLODY:  If it's an issue that would have possibly been
     what we call a severity level one to severity level three, under our
     current scheme, anything that fit that, over the past few years, we
     get a risk analysis by Mr. Shedlosky or Jim Trapp, who is our
     other PRA analyst.
         DR. POWERS:  We're going to ask him how he does a risk --
         MR. HOLODY:  And I'm going to leave it to him.  Good.
         MR. SHEDLOSKY:  Do you want to talk about that now?
         DR. POWERS:  I'm willing to wait, but I understand.  Here is
     my problem.  I've got nothing on fire risk in my handy-dandy home-grown
     PRA and so I'm interested in what tools you have and what tools you
     ought to have.
         MR. SHEDLOSKY:  We'll discuss that.
         DR. POWERS:  I'm laying in wait for you on that one.
         MR. HOLODY:  He's our expert and will talk about that,
     although I, myself, in the past few months, attended a PRA for
     managers course, as I think a number of others here in Region I have
     either taken or plan to take.  I don't purport to be able to do the PRA
     analysis, but --
         DR. POWERS:  Do they cover fire risk assessment in those PRA
     for managers?
         MR. HOLODY:  I don't think they did cover fire protection.
         DR. POWERS:  And I don't think they cover risk during
     shutdown.  And, what, 58 percent of our AITs come from shutdown events?
         MR. MILLER:  I think that the two-week course that our
     inspectors just went through, which is more elaborate, and I just sort
     of walked in and out of that a little bit, I think they got into it, to
     some extent.  But even then, it's still limited with respect to fire
     protection.  It's at a broad level, external events, at least from what
     I saw.
         DR. POWERS:  Fire, for strictly historical reasons, is
     considered an external event.  It's actually kind of funny how that came
     about.  But the one that's being brought to our attention more and more
     often is events taking place under non-power operations, shutdown, low
     power operations, refueling operations.
         And the question that is going to get posed this afternoon,
     we'll go over it, is why aren't you yelling and screaming that I need
     tools to handle this kind of operation, because here is where I'm
     spending my time.
         Okay.  The licensees are bringing it on, they've got tools,
     but you guys don't have tools.  And how can you be expected to carry out
     these missions of risk-informed regulation if you don't look at the
     risks, if you don't have the risk information.  I don't know how you do
     it.
         MR. MILLER:  It gets into the question, and, again, we're
     getting ahead here a bit, but it gets you into the question of how
     rigorous and explicit you make them.  In the strict PRA sense, we know
     that we don't have tools for shutdown, but we do have a sense of --
         DR. POWERS:  Yes.
         MR. MILLER:  We understand the systems and we understand the
     margins there.  So we can have a good discussion on that.  Dan, why
     don't you continue?
         MR. HOLODY:  That's all I had, unless there are any
     questions.
         MR. MILLER:  Just one thing I want to mention, though, on
     this enforcement.  These numbers are, I think, real good news, because
     that is a very tangible and immediate reduction in regulatory burden on
     licensees and we've gotten very strong feedback from licensees on this.
         One bit of good news, for me -- and I wondered very, very much,
     as we went to this new program, whether there would be a dropping off in
     the number of issues that we were surfacing and identifying and finding
     in our inspection program.
         What I feared is that folks would hear a message that it's
     time to go easy, quote-unquote, on licensees, as opposed to continuing
     to do the job, which is to find the problems that are out there; of
     course, do a good job of putting them into perspective and looking at
     them from a risk perspective and otherwise, and ultimately deciding how
     you bin them in terms of violations.
         But the statistics have shown that there has been some
     drop-off.  In this region, there has not been a precipitous dropping off
     in terms of the issues, if you follow what I'm saying.  I'm talking
     about before you get to the stage of deciding whether it's enforceable
     or not, we've continued to have good inspections, making findings and
     critical observations, and I'm very pleased with that.
         We need to continue to work at that, though.
         DR. POWERS:  I think what I have definitely seen echoes your thoughts
     that when I look at PIMs and things like that, I see about the same
     number of entries now as I did before.  I mean, it's not very dramatic
     and it seems to highlight things that I used to see in the PIMs.  I
     haven't seen a big difference.
         The only difference that I've seen is that this goes into
     NCV.
         What always causes pause is a lot of times it's going NCV
     because it's not a repeated violation.  And I say, how does it ever
     become a repeated violation.  I mean, if it's always going NCV, it will
     never come up as a repeated violation because it's only found one at a
     time.  And that's caused questions, in my mind.
         MR. HOLODY:  Well, even under the new program, even if it is
     a repetitive violation, it would still be an NCV if the licensee
     identifies it, because we want to encourage that, to the extent
     possible.  If they identify it, even if it is repetitive, and they put
     it in the corrective action program and take aggressive actions to
     address it, at that point, it would still be an NCV.
         Now, that doesn't preclude us from coming behind and saying,
     well, wait a minute, they did not correct this condition adverse to
     quality and then cite them for a corrective action type of violation. 
     But if they have, upon identification the second time or third time,
     aggressively addressed it, then we have --
         DR. POWERS:  I think you're right.  Really, what you
     would rather put your focus on is not the particular issue, but whether
     the root cause analysis and what they set out to correct were done with
     sufficient breadth, rather than too narrowly focused.  That's a better
     use of your time -- your time, collectively -- than the specific
     incident.  I think I agree with you and that's a good point.
         DR. LARKINS:  It raises an interesting question that I want to
     follow up on quickly.
         DR. POWERS:  Sure.
         DR. LARKINS:  Do you, the region, go back and take a look at
     the corrective action programs in terms of the way the licensee is
     prioritizing their timely closure of issues?
         Yesterday, when we asked folks at Susquehanna what criteria
     they use for prioritizing their items in their corrective action
     program, it wasn't clear.  It seemed to be somewhat subjective.  I'm
     sure that if it's safety significant, it gets a lot of attention early
     on, but then there are other things which may not be as safety
     significant, but have some risk significance, and it wasn't clear to me
     what the criteria was.
         I was wondering if the regions are looking at the corrective
     action programs in terms of how things are prioritized, the
     effectiveness, the efficiency, timeliness issues, things along that
     line.
         MR. MILLER:  The short answer is yes.  But, like everything else
     you're going to hear us say today, it's graded.  Susquehanna is
     a good example.  Resident inspectors, on an ongoing basis, are to be
     looking at -- all of our inspectors -- are licensees identifying issues,
     are they doing a good job at getting to the bottom of those issues,
     putting them in a good sense of priority, and, thirdly, fixing them with
     a good sense of priority.
         So everybody is doing that at some level.  As we see
     licensees struggling, we will up the ante and we will increase our
     efforts, and Susquehanna is an example.  We did a special inspection at
     Susquehanna of corrective actions under our so-called 4500 inspection
     process and, among the things that we did in that assessment, we looked
     at timeliness, and we were critical of them in certain timeliness
     aspects.
         MR. LANNING:  Timeliness and breadth.
         MR. MILLER:  Of what they were doing.  So the answer is yes,
     we do look at this, but it's a graded sort of thing.  It's more intense
     at some places than others.  We hope that there is a certain minimum
     amount, or we know, I feel confident, there is a certain minimum amount.
         But if there are issues, we will tumble to those and we will
     have enough information, where there is a pattern or we're somewhat
     uncertain about how good they are, we'll bring in the people, the
     special team, if you will, or bring in the additional perspective, to
     ferret that out and determine which it is; is it, in fact, better than
     it appears or is it, in fact, as weak as it might appear.  So yes, we do
     look at it.
         And just one additional thing.  You mentioned the judgment. 
     With thousands of items being reported a year at many licensees, there
     is no way to have an algorithm that is quantitative.  There is a healthy
     -- I mean, there is a need for judgments to be made.
         Most of these things are judgments made each day as
     licensees screen the events of the day before and they make priority
     decisions.  And we use our judgment to question and quiz licensees on
     that.  But it's a judgment call on whether it's a priority one, two or
     three.
         DR. LARKINS:  It sort of goes back to Dana's questions about
     keeping track of non-cited violations to make sure these things were
     handled effectively.  They go into the CAP and then they show up six
     months, a year later.  How do you know, how do you judge the
     effectiveness of what was done in the corrective action program to
     address that issue?
         MR. MILLER:  Okay.  Jack, I guess, is next.
         MR. CRLENJAK:  Could I have the next slide, please?  Okay.
         I'm just going to refer a minute back to a slide that --
     we're not going to put it up, but I will mention it, a slide that Randy
     introduced earlier.  It's the inspection and assessment flowchart,
     somewhat simplified, as Randy mentioned, for our overall process.
         I'm going to be speaking on assessment, the assessment block
     of that chart, which is the last block in that process.  You've got
     planning, inspection, documentation, and then assessment.  It's the last
     block, but it's not the end.  As that flowchart shows, it's the point at
     which we use it to supply feedback back to the system and to start the
     process over again.
         Having said that, I just want to mention a little bit on the
     history of the PPR, the plant performance review process, and we've had
     that process for some time, but it's taken on a little bit more
     significance.
         As of about a year ago, the SALP was suspended by the
     Commission, in an SRM that was dated September 15, 1998.  At that time,
     the PPR, or the plant performance review process, became an interim
     process for licensee assessment, replacing the SALP.
         It's an interim process that will be used until we decide on
     what the permanent replacement will be for the SALP process or whether,
     in fact, we go back to a SALP type process.
         But there are two things I want to make a point of with this
     interim process.  Sometimes you look at the word
     interim as meaning it may not be as significant as it should be,
     but in this case, I want to mention two things.
         It is a very important point for us in our inspection
     program process.  It's important to provide a useful
     assessment process for licensee planning or inspection planning, and
     that's part of the feedback that goes back.  It's a key point,
     as was mentioned earlier, in our addressing and utilizing
     our resources.
         And the other point that's important about the useful
     assessment process is that the licensee is going to act on this process
     and act on our assessment.
         It's important that we have a valid assessment, one that
     hits in the right areas, so the licensee can -- so that they also have
     effective utilization of their resources.
         Then the second point with this is that the PPR letter is
     the primary means that we, as a region or agency, use to communicate
     assessment information.  It will remain the primary means until a new
     assessment process is implemented.
         What goes into the PPR process?  We start off with a plant
     issues matrix that we use as an input to our PPR meeting.  That plant
     issues matrix generally is a listing of significant issues that have
     been rolled up from the inspections that have been conducted in the
     previous six months.
         Now, for our first PPR that we conducted this past February,
     it actually included plant issues for about a 16 -- up to a 16-month
     period.  But in the subsequent ones, it would use the previous six
     months of inspection findings.
         Actually, I should correct that a little bit.  It will be
     the significant findings since the last full PPR.  I'm going to mention
     here in a minute that we also have a mid-year PPR that we conduct.
         The plant performance reviews then are derived from the
     plant-specific PIMs, or the items that make up the plant issues matrix. 
     These items, in the PPR meeting that we conduct, are used to develop
     plant insights, which then result in the assessment.
         We have an issue with the standardization of the PIM.  We've
     made significant efforts to ensure that it's a standardized product
     between the regions.  We have our tech support team that has taken
     efforts within the region to compare our PIMs amongst the branches and
     amongst the sites.
         We've also had coordination with headquarters.  We submit
     our PIMs to headquarters and they review those with the other regions to
     ensure that we have a standardized product; not only as far as the
     issues that are rising to that level, but also the format that's used
     in that product.
         Moving on to the PPRs themselves or the PPR meetings, we
     have a spring PPR, which we call the full PPR, and that's a meeting
     that's conducted, basically it's conducted over about three days.  We
     allot about an hour and a half or so to each site.  That varies a little
     bit depending on the level of concerns we have with the site, but it's
     approximately about an hour and a half.
         The participants are both DRP and DRS and the senior
     managers in the region.  Generally, the branch chief will lead the
     overall assessment, with the senior resident inspector and DRS
     specialist making presentations on the functional areas.
         Again, the spring PPR or the full PPR is the one where we
     come up with our yearly assessment, if you will.  We then, six months
     later, conduct a mid-year PPR, which Region I plans to conduct in
     September.  It can be in the August or September timeframe.
         This mid-year PPR is somewhat abbreviated from the full PPR. 
     In that mid-year PPR, we'll discuss changes or trends that have taken
     place since the last full PPR and we'll also discuss any necessary
     changes in our inspection program.
         The full PPR basically produces a product as far as the
     inspection program is concerned that covers a year.  Again, going back
     to the mid-year, we will do any adjustments at that point to that
     inspection program.
         I've mentioned, I've already covered the PPR meeting
     structure and the conduct of that meeting.
         As for the senior management packages, we basically have a
     couple of products, along with the inspection planning, if you will,
     that come out of the full PPR.  We have a screening meeting that's
     conducted with NRR headquarters, where we basically use the PPR draft
     letter at that point and the PIM to present the assessment of licensee
     performance.
         And then out of that screening meeting is also the
     determination made on the plants that will be discussed in the senior
     management meeting.
         MR. MILLER:  A bit of an anachronism.  We're not going to go
     back, as you know, to the old SALP and the senior management meeting is
     going to take on a much different task, and you have been briefed on
     that and I think you know.  So at least on that point, in the future,
     we'll be more operating from the data and the program.
         But the point that Jack is trying to make is
     that the PPR is a continuing process, a continuous process.  So it's not
     each thing done independent of the other, but rather a flow that starts
     with the PPR process and feeds the higher levels and the screening of
     plants; it just flows right through, same paperwork.
         DR. POWERS:  I found the PPR extremely useful.  I use them
     in reverse.  I read the PPR and then I go look at the PIM, because the
     PIM is kind of randomly put together and the PPR puts it in a nice
     clarified fashion.  I know what I'm looking for there and, quite
     frankly, those I've read from Region I, I have admired, and I find it
     very useful, far more useful than SALP or anything I get out of the
     senior management meeting for understanding what's going on at the
     plant.
         MR. MILLER:  We're working very hard along those lines.
         DR. POWERS:  I can believe that you work hard.
         MR. MILLER:  There were how many at one time, Jack, 17 at
     one time?
         MR. CRLENJAK:  Yes.  I'm going to --
         DR. POWERS:  I don't know how you do that.  I would have
     thought you'd spread them out or something like that.  But it's, I
     think, a very useful roll-up, as you say, of everything that's in the
     PIM and you can -- and I -- at least I have never been able to find
     anything in the letter that wasn't substantiated in the PIM and vice
     versa.
         MR. CRLENJAK:  As you mentioned, the PPR letter is the
     end product of what we've got in the PIM and it makes sense
     of what's in the PIM.  That's the main role of that product and, as a
     product, as Hub was mentioning, we are particularly proud of what we've
     done here in the region on our letters.
         DR. POWERS:  Let me just say, I think you should be.  I like
     them.
         MR. CRLENJAK:  It was a significant effort.  We really put a
     lot of thought into it.  We involved all levels of management in the
     review.  We involved even our public affairs office for their review. 
     We conducted peer reviews amongst the branches to ensure we had some
     consistency there, also.
         So we took it seriously and I think we got an outstanding
     product out of it.
         Again, I want to mention that it needs to be taken in that
     context.  It is the primary means that we use to convey the assessment
     information to the licensee and we use it to send the messages that we
     consider to be important.
         Having said that, we also get some pretty good feedback from
     the licensees on those.  We have about six PPR meetings to conduct for
     this set of letters and, of those, we've conducted two so far.  The
     licensees have generally acknowledged it's a good product, they've
     agreed with it, and, in my discussions with them, I've determined that
     they've taken to heart the issues that we pointed out to them and, in
     some cases, have already taken corrective actions for them.
         So we've had some good results out of those already.
         DR. POWERS:  Sometimes you find -- and I'm working with one
     data point right now -- that the emphasis that I inferred from the
     letter and the emphasis I derived from the licensees were different.  Do
     you see that?
         MR. CRLENJAK:  I guess I don't quite understand what you're
     saying.
         DR. POWERS:  Well, you know, you read the letter and
     sometimes from the language, you get an idea of what the emphasis is,
     and then you listen to the licensee and you get, from his language, an
     idea of where the emphasis is.  He's acknowledging the point, but in a
     particular case that I can think of, he's saying, yeah, we got to do
     something about that, whereas the letter read you've really got to do
     something about that.
         MR. CRLENJAK:  I think there are occasions that we run into
     that.  That's more subtle than if you have a complete disagreement on
     what the conclusions are.
         DR. POWERS:  Sure.
         MR. CRLENJAK:  But for those kind of issues that you just
     raised, that's why we have, I think, one reason why we have our meeting
     with the licensee.  It's our opportunity to present our assessment,
     somewhat in an abbreviated format, verbally, but it's the licensee's
     opportunity then to present their take on the message and what they plan
     to do about it.
         And it's at that point then that we can work on fine-tuning,
     if you will, their level of concern versus our level of concern.
         DR. POWERS:  Sure.
         MR. MILLER:  Dr. Powers, Jack is right, it's the subtleties
     often that become very, very important and I know it may have seemed
     strange on my first chart of accomplishments to have on there that
     business about senior management involvement in the field.  That's a
     huge part of why we put such a big emphasis about having management in
     the field.  It's hard to pick up those nuances and those differences and
     do they really agree, do they see this issue the same scope, size and
     scale that we do.
         Because at the broad level, it's often very -- it's not
     uncommon to have licensees say, oh, yeah, we agree with that.  But is
     there real agreement?  You don't know that unless you go out there.
         DR. POWERS:  Let me make very clear I did not think that was
     a trivial point on your chart.  I'm a great believer in management by
     walking around and, in fact, made a note and asked to get a copy of the
     statistics on that, because I think that's essential that you have --
     that the managers here at this table have some visibility with the
     licensees to know what you guys are about and you to have some idea of
     where -- when they use the word "sure," do they mean, "yeah, I'm behind
     this," or is it, "yes, of course, I'll get to that one of these days,
     too."
         DR. KRESS:  What happens if, in one of these meetings, the
     licensee representative says we just don't agree with that finding?
         MR. MILLER:  First of all, we're very happy and we're very
     happy because that tells us that we're doing something right, that
     permits licensees to do that.
         Now, in an enforcement context, that's one place where they
     never are inhibited.  But we discuss it with them.  I think the
     stance that this region takes, and I'm sure the other regions, as well,
     is that we don't hold ourselves out as being perfect; for
     all the reviews, all the stuff that you've heard here, the different
     things we do to try to coach people and have them understand the
     expectations and the like, this is all too complex to think that you're
     not going to have it wrong some part of the time.
         So we try to go out of our way and make it clear to people
     that we're happy to have them disagree and then we can sort that out.
         MR. CRLENJAK:  Along with that, it's very important to us
     that the message be conveyed and that the licensee send the message back
     to us as to whether they agree or disagree.
         The significance of all of this is that the licensee, again,
     has limited resources and we want to make sure that they properly focus
     management attention on things we consider important.  But if, for some
     reason, they have a conflicting view with ours, then we want to discuss
     it, because we don't want to waste our resources and we don't want them
     to waste theirs either.  So that's important that we get the right
     message and that we understand their point of view on that.
         Okay.  Then, finally, I just want to mention, on the PPR
     lessons learned, this was a relatively new -- I won't say a new process,
     because we have had the PPR around for some time, but the utilization of
     it has been somewhat different and it's taken on a different level of
     importance, if you will.
         We had a lot of changes going into this as far as the
     guidance that was provided for developing the letters, conducting the
     meetings and running through the process.  We had a lessons learned
     meeting after we conducted our meetings and we spent some time putting
     those things together in a written format.  Then headquarters also
     conducted a lessons learned effort, if you will, where all the regions,
     all four regions provided them with input.
         Their findings on that, the final results, are still in
     draft form, but they will soon have that out.
         But we learned a lot from the process and we've already made
     some changes and we'll conduct our next meetings a little bit
     differently because of it.
         That's all I've got.  Any questions?
         MR. MILLER:  We're running, I think, a little bit behind
     schedule.
         DR. POWERS:  Yes.  My manager of protocol beat me around the
     head and ears and told me to restrain myself.
         MR. SINGH:  Part of my job.
         MR. MILLER:  I wanted to suggest that we might move on to
     the next topic.
         DR. POWERS:  Sure.
         MR. MILLER:  Which is operator licensing, and Wayne Lanning.
         MR. LANNING:  Rich Conte, our branch chief, who has
     responsibility for that, will talk to that briefly.
         MR. CONTE:  Good morning.  I'm Rich Conte, Chief of Human
     Performance and Emergency Preparedness Branch.  Part of my human
     performance responsibilities are to manage the operator licensing
     program in Region I and I am the delegated licensing authority for Hub
     Miller, overseeing the reactor operator and senior operator licenses.
         What I'd like just to do today, in maybe five or ten
     minutes, is to focus on one or two specific issues from the pilot
     program.  The pilot program, of course, was the NRC allowing facilities
     to develop their own exam.  That started at the beginning of fiscal year
     '96 and is coming to a close here, with the new rule being effective
     October 20.
         Most of the discussion here, though, I'd like to focus on
     licensee and NRC staff initiatives on those two issues, primarily the
     quality of the facility-developed exams and somewhat ancillary is the
     examiner consistency issue.
         I think clearly the two issues are related, because if
     facility licensees perceive a region has a consistency problem, we're
     not going to get as many licensees to volunteer for the exam and I
     believe the agency firmly believes that the facilities are in the best
     position to develop the exams.
         Again, I believe that those two issues are my foremost
     challenge in the next six months and in the next fiscal year.  I'm
     getting very strong support from management in this area, especially
     with the acceleration of workshops, that I plan to talk about, for this
     summer.
         Again, I need to maintain my focus as we do all this, that
     the primary directive is to assure safe operators at the plants, in
     Region I, at least.
         Can I have the next slide?  I do have it, okay.
         By way of background, before we go on to the initiatives,
     very briefly, there have been some exam quality problems here in Region
     I.  The specific problem that I alluded to in the former slide was Hope
     Creek, and Hope Creek caught our attention because it was repetitive. 
     The two issues there, the poor quality of the submittal and a high
     initial failure rate, were repetitive from a February 1998 exam.
         There were other exams that had problems; in some cases they
     caused delays, and in other cases, they did not.  But I think, for the most
     part, I support what the program office conclusion is, that it is a
     viable process that the facilities can develop a good exam and I think
     we're in the process of fine-tuning that process with the workshops that
     we have planned.
         The staff did conclude, in the -- pardon me.
         DR. POWERS:  I was just curious about these workshops.  When
     do you plan those?
         MR. CONTE:  Specifically, the one on process changes is
     August 17 and 18 in the Philadelphia region here, and we have a unique
     workshop that we're planning on testing principles.  It's really the
     local organization here, called MANTG, the Mid-Atlantic Nuclear Training
     Group, that has contracted with a vendor to produce a testing
     principles workshop.  It's more of a training session, if you will, but
     it's in a workshop forum where people will practice, what have you. 
     That's scheduled for the week of July 26.
         The paperwork has been submitted, in order to get -- not an
     endorsement, but at least a review by the agency to ensure that the
     training material comports with what's in our examination standards.
         Again, for the Hope Creek exam, there was some improvement. 
     From February of '98 to December of '98, with the simulator
     scenarios and the portion of the operating test called job performance
     measures, we really did not have any problems in that area, and that
     showed some improvement.
         However, the other areas are the traditional problematic
     areas, and that's the so-called written exam and the job performance
     measure questions, along with the admin section, because they really --
     those problems -- I say traditionally they're problems because we have
     some unique standards in that area.  We have a multiple choice format
     and we strive very diligently, in open reference questions, to get that
     higher order cognitive level being tested of the operators, because we
     believe that that equates to a safe operator.
         So in light of that background, we did do a self-assessment,
     and the licensee, PSE&G, did a self-assessment on why this was
     repetitive, and clearly there were some programmatic weaknesses and
     inadequacies.  I
     don't want to go into too much of the details.  Why they were repetitive
     and why they were not found, I think, we're still looking at in the
     first case, in February of '98, and we are planning a follow-up, a
     training program review in October.
         We did a self-assessment of ourselves in that area, also,
     and that self-assessment confirmed that our exam team did identify the
     problematic nature of those exams, but there were some misapplications
     by the exam team and there were some miscommunications.
         The misapplication, for example, being a BWR, the examiners
     were surprised that none of the walk-throughs would go into the reactor
     building.  The requirement of the standard is to go into a
     radiologically controlled area.  The way they had designed this test,
     it just went into the radiologically controlled area.
         Now, given that 80 to 90 percent of the mechanical safety
     equipment is in the reactor building, the examiners felt that it was
     important to do that, and the licensee complied with that, but
     technically there was a misapplication.  All they had to do was produce
     an operating test that went into the radiation controlled area.
         So with that background, we plan on working in unison with,
     or in conjunction with, the industry.  In fact, there was a meeting on June
     3 with the Nuclear Energy Institute, where we kind of set the stage on
     how we're proceeding nationally and locally, and the major decision out
     of that meeting was to do, the second bullet there, the local workshop
     on process changes and other issues, early, at least to get them done by
     the end of August.
         And I understand Region III is having some problems
     coordinating that and we're helping Region III by getting some of their
     licensees into our workshop, and that may be of interest, also, to see
     some differences, if you will.
         As I said, the local workshop -- in fact, I just came from a
     supervisory on-site visit at Indian Point 2 and I went over to Indian
     Point 3.  The Chairman of MANTG was there and we basically settled on
     the agenda.  They're very happy with the agenda.  It's a two-day
     workshop, with breakout sessions.
         And one of the mandates from the rule change package was for
     the industry to give us some ideas on why the quality problems have
     continued to exist.  In fact, our statement in the rule change package
     was to say that quality has not substantially improved and the industry
     has been tasked with looking at that.  Part of this workshop is to
     explore underlying causes and reasons for that.
         The national workshop may occur in the first quarter of the
     year 2000.  It's dependent on what kind of results we get at the local
     workshops.  In fact, I believe Region IV is doing theirs next week.  As
     I said, Region III is partitioning themselves to one and two, and I'm
     not sure of Region II's date, but I think it's in August, also.
         So from a national effort, I think we're working
     closely with the program office and the industry to have a game plan and
     assure a consistent process in implementing the final revision of
     NUREG-1021, our examination standards.
         Could I have the next slide, please?  That is the last
     slide.
         Again, MANTG is our Mid-Atlantic Nuclear Training Group. 
     I've been working closely with them and, again, we have this unique
     aspect of a testing principles workshop the week of July 26 and we are
     getting some national attention on that and some Region III people are
     coming in to sit in on that course.
         I will be sending an examiner at least the first part of
     that week.  The week is split in two, handling about 20 to 25 people
     in one portion and then repeated for another 20 to 25 in the second
     portion, and I'll be sending an examiner there.
         Again, it's part of our mandate to participate in workshops. 
     The purpose of the examiner would be to obtain questions or comments
     where they're not directly answerable in the standards.
         One of the missions on a national level is to compile a list
     of questions and answers from the industry review of the
     final Rev. 8, and that's the first of many workshops, I think, that will
     be compiling those questions and answers, and it will be documented in
     perhaps a NUREG.
         With respect to enhancing examiner knowledge and abilities,
     I have free time on the schedule for my examiners to self-study the new
     revision.  We also have a version of a NUREG in a red-line and
     strike-out, so you can see what the changes are.  I have assigned
     certain examiners to be experts in those standards; in fact, they're
     grouped in pairs.  One is an expert in the development standard, 401,
     for example, which is the written exam, and then the paired examiner is
     an expert in the 402 and 403, which is the process.
         So those people will be available at the workshop to answer
     questions that directly relate to the standards.  Interpretation
     questions will be reserved for the program office and what have you.
         I'm also planning on conducting an in-house workshop, where
     we'll go over some of their presentations for the August workshop.  That
     in-house workshop is scheduled for July 29.
         And one of the big challenges is to make sure that our
     examiners distinguish the requirements versus the enhancements of the
     standards.
         With respect to substantial industry interface, I think I've
     mentioned a number of areas.  I have also set up with the Region I
     Professional Reactor Operators Society, in October, to talk to their
     representatives and discuss some of the changes applicable to them as
     applicants and operators in the requal program.
         On implementing lessons learned, I think one key thing is
     we've changed our scheduling practices to allow a three-week gap between
     the preparatory review process and the actual exam.  That was one of the
     lessons learned from the Hope Creek problem, that we put ourselves under
     a lot of undue pressure to get things fixed before the scheduled exam.
         With the three weeks there, I can use the supporting
     examiners in other missions; again, effective staff utilization, like
     other exams or other inspections.  Meanwhile, the chief examiner will
     maintain the focus on that particular project and act for me and assist
     me, because I have responsibilities that take me out of the office,
     either for training, supervisory reviews on-site, or other mandates on
     my time.
         So I don't think we'll lose any efficiency there with the
     chief staying in the office at that point.
         We're also using enhanced examiner comment forms in the
     examination standards and those forms will be going through me before
     they go to the licensee, so that I can understand what comments are
     being made.  One of the things I will be looking for is whether or not
     these are enhancement type questions versus requirements of the
     standards.
         And I have to say, in dealing with the licensees, they're
     not averse to enhancement type comments, because they recognize the
     examiners see things as an objective set of eyes, if you will.  They're
     very willing to do that, but when it's couched in the form of you have
     to do this, it takes on a different tone.
         DR. POWERS:  Yes.
         MR. CONTE:  And, again, I think I need -- a big lesson
     learned for me was enhanced supervisory review of examiner comments, at
     least on the self-assessment at Hope Creek.  There were a number of --
     we had working level meetings, if you will, with the licensee and I had
     only -- I reviewed enough to say that we've got a problem here, we need
     to bring them in, but, quite frankly, I allowed myself to be distracted
     and there were some comments that the licensee kind of rejected, and
     they were right in rejecting them, if you will.
         And, of course, the big thing is, the final bullet there is
     feedback to the program office.  All these workshops and any of the
     interactions, we have weekly conference calls with the program office,
     at a set time, Tuesday at 2:00, in which we discuss these issues.  In
     fact, this is one of the more effective program offices
     in the agency in terms of matrixed technical guidance to the people in
     the region.  Even though I report to the management in Region I, I have
     a strong technical line, almost dotted to solid, with the
     program office.
         MR. MILLER:  He stole my line, and that is, I think Rich
     works more for the program office.
         DR. POWERS:  I think he was telling us that.
         MR. MILLER:  And that's the way it should be.
         MR. CONTE:  I just want to add that I do have strong
     support from management, because management was giving me gentle reminders
     to get these workshops done even before the rule change package was
     approved by the Commission, which means we set these conferences up not
     knowing that October 20 was going to be the effective date.
         That concludes my presentation.
         MR. MILLER:  Thank you, Rich.  We're at the point where we
     had a topic called staff training and development and I think that's an
     area we can cover fairly rapidly.
         MR. SINGH:  You can do it now or you can do it after lunch.
         DR. POWERS:  The suggestion to me was to break for lunch now
     and come back to that.
         MR. SINGH:  That's fine.
         DR. POWERS:  I'm happy.  I'm learning so much here that I'd
     just keep going, but I think we'll go ahead and take a recess for the
     lunchtime now.  How long?
         MR. SINGH:  Let's start in 45 minutes, make it 12:35.
         DR. POWERS:  Okay.  We will recess until 12:35.
         [Whereupon, at 11:50 a.m., the meeting was recessed, to
     reconvene at 12:35 p.m., this same day.]
                        A F T E R N O O N  S E S S I O N
                                                      [12:42 p.m.]
         DR. POWERS:  We will come back into session.  I guess we're
     turning to Mr. Joyner.
MR. JOYNER:  Indeed.  Thank you.  I'm Jim Joyner.  I'm the Director of
                      the Division of Resource Management.
During the past three years, in particular, the region has had an
aggressive recruiting program.  We've been in the position of having a
high turnover, largely because of something that you've heard about
earlier today, which is we've attracted very high quality employees and
when you have high quality employees, they're also attractive to others.
In particular, in the resident inspection staff and those that we have
hired to work their way into the resident inspection program, they have
been the target of not only other regions, but headquarters and the
                                        industry, as well.
During the past three years, for example, we have hired 31 individuals
for the reactor inspection program, but, during that same period, we've
lost 30.  Sixteen of those 30 relocated to other NRC offices or to the
IAEA, several retired, and we had at least nine leave to accept industry
                                                positions.
Of the 27 -- of the 31 that we hired, 27 came to us with essentially no
NRC inspection experience and, therefore, our largest challenge was to
                   train them, qualify them as inspectors.
Our process included a lot of senior management involvement at several
steps during the process.  For example, one of our senior managers, Mr.
Crlenjak, went to the technical training center and participated with a
group of our inspectors during one of their training sessions.
By doing so, he was able to provide views of the relationship between
the training that they were receiving and the actual responsibilities of
an inspector.  He was able to establish a very good one-on-one working
relationship with the inspectors who were part of that particular class. 
   He showed management support for the newly hired staff.
In addition, on numerous occasions during the first year of employment,
the new staff participated with experienced inspectors and with senior
staff in study groups that were designed to highlight situations that
              inspectors encounter and how to handle them.
In addition, we provide a mentor for all of our new inspectors. 
Normally, a mentor is at a supervisor level or higher, but the one
characteristic that we look for is extensive NRC experience in the
                                       inspection program.
The purpose of the mentor is to help steer the employee through the maze
of processes and procedures, to provide support during the inspector
qualification process, and to offer guidance on career development.
I want to stress that the training program involves much more than
simply classroom instruction.  Almost from the beginning, inspector
trainees are assigned to accompany experienced inspectors in the field
and they take on increasing responsibility for performing parts of the
inspection in their area or areas of expertise, under the leadership of
                                the experienced inspector.
                        Can I have the next slide, please?
Each new inspector candidate is provided with an inspector qualification
journal, and this is a typical one; this particular one is for
the reactor engineering support inspection group, and it contains the
criteria and the guides that are useful in becoming an inspector.
Completion of the qualification program involves attendance at
required courses, self-study, the on-the-job training that I talked
about a moment ago, and, finally, an examination by an oral
qualification board that's comprised of managers and experienced
inspectors.
Even past qualification, we continue with the employee development. 
While I will talk more about this in a couple of moments, I'll point out
at this point that this includes periodic meetings with the supervisors;
during those meetings, staff members establish both short-term and
long-term goals and then we try to provide the experience and the
training that's necessary to achieve those goals.
There are a number of examples that validate the process that we've
established for developing our staff.  For instance, several of our
newly hired staff have made important contributions to the
identification of safety-significant findings.  One inspector, for
example, identified a problem with electromatic relief valves at a plant
    that resulted in the valves being declared inoperable.
At another plant, inadequate implementation of temporary modifications
was noted by a new inspector.  And on a third occasion, a new inspector
identified fire protection problems in a critical section of the plant.
As further examples of the quality of our new staff, ten of the 31
inspector candidates that we've hired in the last three years already
have been selected for resident inspector positions including four who
                           were selected by other regions.
So we got to do all the training and qualification and they came and
                      picked the fruit and took them away.
DR. POWERS:  What you're saying is they're really much more efficient
                                             than you are.
MR. JOYNER:  Much more efficient than when you hire from the outside. 
This was a sensitive point with Hub.
DR. POWERS:  I got a hint that maybe that was a sensitive point.
MR. JOYNER:  In addition, six of our newer hired folks were selected for
positions in headquarters or, as I indicated before, the IAEA.  We're
also proud of the fact that, as a result of our management attention to
the process of qualifying inspectors, we have been able to reduce the
qualification time from as much as two years, as it's been in the past,
to an average of about one year for this most recent group of people
that we hired.
A few moments ago, I promised to discuss further the -- our initiatives
for the staff development.  While the NRC provides a variety of courses
designed to maintain and enhance inspector skills, Region I has taken
several initiatives to further strengthen our staff's capabilities and
                                               experience.
First, as you've already heard, we refocused our former resident
inspector counterpart meetings into what we now refer to as inspector
seminars, which provide a much more intense experience and also
encompass not only the residents, but other inspectors, as well.
During these seminars, we include guest lecturers, who specifically
include our stakeholders.  We've had a representative of the Union of
Concerned Scientists, and we've had licensee representatives come and
address our group, as well.
We use these meetings to share good inspection techniques, provide
lessons learned for those instances where we haven't done things as well
as we would have liked, and we afford the opportunities to examine and
question existing and planned inspection program content.  This has been
a good way of sharing information and obtaining feedback from our
         current inspectors on the new inspection program.
The seminar, as you've already heard, is developed by a team of
inspectors and support staff, working with management representatives to
              assure that the needs of the region are met.
Further, last fall, Region I cosponsored, with PECO Energy, an American
Nuclear Society workshop.  The workshop, which was widely attended by
Region I inspection staff, provided an excellent opportunity for
co-facilitated discussions with stakeholders on the conduct of
inspections, communication of inspection findings, and on enforcement
issues, all areas that are important to NRC and its stakeholders.
As a third example, next week, Region I staff are actively participating
in the annual Health Physics Society meeting in Philadelphia.  These
staff members played key roles in planning and organizing the meeting
through the local chapter of the Health Physics Society and they'll have
opportunities for interaction with our stakeholders again throughout
that meeting.  In fact, some of our staff are teaching during the summer
school that precedes the annual meeting that's being conducted this
                                                     week.
Then, in addition, because of our continually changing environment, and
you've heard about change on several occasions already today, we're
taking the initiative to provide training on managing change to all of
our employees.  We've already had some sessions and we have further
                    sessions scheduled in the near future.
We're taking this initiative as a team to examine the issues that we all
must face, without waiting for the specifics of everything that's going
to be changed.  It's the process that we're looking at and how it
                                      affects individuals.
DR. POWERS:  I think one of the lessons that's come out of the
industrial experience is that training in managing change actually pays
for itself.  Just without even knowing what the change is, it's how to
handle it.
A lot of the guidelines, they're common sense, but they're common sense
that you have only after you think about it, and nobody ever has time to
think about it.  I think that's been one of the lessons learned.  Just
informing employees with how to handle change is a good step in changing
                                                 cultures.
MR. JOYNER:  We think it's worthwhile for us and certainly our
management team has had a facilitated session on managing change and
we're reaching out to provide this to as many of the rest of the staff
                                                as we can.
Then, lastly, as an example of some of the things that we've provided
for our staff, based on feedback from the inspectors, we saw the need to
provide current information on water chemistry to some of our folks, so
that they could better understand the issues related to reactor system
corrosion, fission and activation product transport, and fuel status
assessment.
Accordingly, we arranged a one-week BWR/PWR chemistry course, which was
taught early this year here in the Region I office, to about 20 of our
                                                    staff.
DR. POWERS:  Nothing is better for the soul than a little training in
                                                chemistry.
                                               [Laughter.]
                               MR. MILLER:  Good feedback.
MR. JOYNER:  We continue to look forward to our challenges rather than
back on our successes and one of the challenges relates to continuing to
provide management attention to assuring development of our staff,
particularly our newer staff, since, at this moment, about 15 percent of
our inspection staff has fewer than three years of experience as an
                                                inspector.
Further, the broad changes the agency is making translate into a lot
of training in new programs.  We're continually challenged to provide
this training without adversely affecting our inspection program,
while, at the same time, implementing the numerous technological
advances at the NRC that we're currently experiencing, such as the
increased emphasis on PRA and the conversion to the electronic
record-keeping system, ADAMS.
DR. POWERS:  I think that all goes to the question whether that's
                                          progress or not.
MR. JOYNER:  It's absolutely progress.  How's that?  All right.  That
concludes my presentation.  If you have any questions, I'd be delighted
                                    to try to answer them.
                    DR. LARKINS:  Just one quick question.
                                    MR. JOYNER:  Yes, sir.
DR. LARKINS:  When we were at Susquehanna, the resident, I think the
SRIs are relatively new inspectors, but they have nuclear power plant
                         experience, one was an SRO and --
                                         MR. JOYNER:  Yes.
DR. LARKINS:  Is that where you're trying to bring in people with plant
                                               experience?
MR. JOYNER:  Absolutely.  We always look for a mix, because you always
have the need to bring in some kind of entry level people that you can
develop as you go along, but clearly our emphasis in the last three
years has been on hiring experienced individuals and that's the focus.
MR. MILLER:  I'm impressed and proud that we've picked up and brought
in, in the last two years -- Alan Blamey is a good example.  He's one of
the residents up at Susquehanna.  I dealt with Alan when he was the
                system engineering manager at Quad Cities.
So we've gotten a lot of talent, top talent, and -- but it has to be a
                                                      mix.
DR. POWERS:  I take it that you're not advertising this to the other
                                              regions now.
MR. MILLER:  Well, I'll tell you, I don't want to sound as if we are
going to be completely leaving ourselves at their mercy.  We, if you
look at it, staff all of our N-plus-one positions in this region.  Now,
we are -- as those are opening up, like Limerick recently, we won't
back-fill that, because it's a top performing plant, and we know that
when we staff N-plus-one at a good plant, we're having to take those
resources and have those people do inspections at other plants.  There
                                  has been plenty of that.
We did that partly out of self-preservation, honestly, to have our
people who come in and aspire to be resident inspectors, to see that
                      they've got a path up in the region.
So I'm happy to say that the state of hiring from outside the region has
    passed.  I think we've gotten over that, to an extent.
DR. KRESS:  Is there a limit on what's to be paid to inspectors?  I
mean, is there a level, a GS level, that they can't go beyond?
MR. MILLER:  Yes, there are limits, but the agency has dealt with
this issue of pay, at least as far as resident inspectors are
concerned, again, by now giving them cost of living, which is
something that they previously did not get.
MR. BLOUGH:  They have a special scale that's about three steps
higher within a government grade for being a resident, but they didn't
have full locality pay, which is being restored.  It was restored four
days ago.
MR. MILLER:  And we give them bonuses now, fairly good size bonuses, to
make a move.  So I don't know that pay has been a big issue for us
really.  The biggest push in this area was about two years ago.  We were
really, I would say, almost extremists in the region because of what Jim
talked about, a loss of -- a lot of the people who are working on these
programs down in Washington right now are former Region I people.
  So I think we're in reasonably good shape at this point.
            DR. UHRIG:  It helps to have alumni out there.
MR. MILLER:  Yes, it does.  It comes back and pays dividends.
The next area is the area that I talked about at the beginning, the
importance of our stakeholders, the public, and how we deal with the
 public.  Diane Screnci, who works for Bill Beecher, is --
DR. POWERS:  Headquarters spy here in the organization, right?
MS. SCRENCI:  I am Diane Screnci.  I'm the Senior Public Affairs Officer
  in the region and I'm pleased to talk to you here today.
I would also like to introduce, before I begin, Duncan White.  He is our
State Agreements Officer and he is here to provide assistance in
 answering any questions you may have on state activities.
The NRC is, of course, a highly visible agency, overseeing a highly
visible industry, and while that's true across the board, this region is
the subject of intense scrutiny by the public, the press, public
interest groups, state committees, the financial community, and
                                                 Congress.
As a matter of fact, some of the financial analysts that we talk to on a
regular basis marvel at the media attention this region receives. 
Financial analysts and business reporters sometimes call more than once
a day for information on what's going on at nuclear plants in the
                                                northeast.
Just some examples, there are newspapers in New England with reporters
assigned to the nuclear beat virtually full-time.  The Day of New
London, Connecticut probably writes more inches on nuclear in a year
than any other daily newspaper in the country, except maybe the trade
                                                    press.
                                     Following the time --
DR. POWERS:  I think they even interviewed me, as a matter of fact.
                                  MS. SCRENCI:  Excuse me?
DR. POWERS:  I think they interviewed me in Albuquerque one time.
  DR. LARKINS:  Even more than the Cleveland Plain Dealer?
MS. SCRENCI:  Yes.  The Cleveland Plain Dealer probably does longer,
more in-depth stories, but the New London Day writes a story a day.
Following a Time Magazine article on Millstone, there were Connecticut
television stations calling two or three times a day, trying to ensure
that they didn't miss any development in the continuing story.
For the most part, they were playing catch-up.  They hadn't been
following all along and then the story broke, so they were trying to
   guarantee that they didn't miss anything in the future.
While that has tapered off over time, an issue at one of our plants can
bring about the same level of response rather quickly, particularly in
                                              New England.
We have an educated public in this region.  There are numerous groups
and individuals in contact with us on a routine basis.  They try to stay
on top of what's going on at the plants and try to determine the
importance of the issues.  They read inspection reports, event
notifications, preliminary notifications, and other correspondence
                       between the agency and the utility.
In an effort to make some of these documents more accessible to the
public, we put a lot of them on the NRC's web page, in addition to
putting them in the local public document rooms.  Because of the intense
interest in Millstone, a second LPDR was opened up in 1996 at the
Waterford Public Library, which is just up the road from the plant.
Some of the national public interest groups have also had a keen
                    interest in the goings on in Region I.
Congress, as I mentioned, also pays close attention to the performance
of the plants in this region and they often write to the Chairman,
                           requesting further information.
In terms of writing to the Chairman, this region gets more green tickets
                                    than any other region.
As you're probably aware, the NRC has a longstanding practice of making
available to the public, mainly through the media, information on NRC
activities that's accurate and timely.  With the internet, we can make
            this information available almost immediately.
We encourage our professionals to be open to news and public inquiries
and to try to be as helpful as possible.  It's not only good policy to
be candid, within reason, but it also serves our interest of maintaining
credibility as an agency whose primary mission is to protect the public
                                        health and safety.
DR. POWERS:  I guess I'm just stunned that they would be paying such
close attention to event reports and inspection reports and
notifications and things like that.  I mean, these are written in
incredibly arcane language.  Can they really read these things?  I mean,
I have a hard time understanding what you're talking about sometimes,
                           and I'm supposed to be able to.
MS. SCRENCI:  A lot of that information is available on the internet and
we get a lot of calls when there is an event report about a particular
plant or an inspection report and we try to explain -- if it's not
already written clearly enough to be understood, we try to help people
understand it, either by the public affairs officers answering those
questions or the technical staff talking to members of the public and
                                  responding to inquiries.
DR. POWERS:  Typically, they are in absolutely no context.  I mean, I
would -- I can read an event report and be convinced the plant has now
                          sunk three feet into the ground.
MS. SCRENCI:  And I talk about that later on, the need to put
information into context.  Event notifications, in particular,
generate a lot of interest, because there is no context to them.  It's
very preliminary information.  It's what was called in; it's
unevaluated.  We do get a lot of calls after some of those to find out
how important this was, what type of incident this was.
With the internet, a lot of that information gets circulated, too, with
editorial comments by whoever is circulating it, things like "look how
               bad this was."  So we do see a lot of that.
But with the internet and with the information available, we do get a
                 lot of calls on those types of documents.
MR. MILLER:  Dr. Powers, I think you're putting your finger on the
single biggest challenge, in a sense, that we have.  We do an
outstanding job of letting it all hang out, if you will, of putting
out information, and, in fact, it's going to get better, in many
respects, if you're an interested citizen, with respect to the new
program.  You're going to be able, almost in real time, to dial into
the internet and see what color a plant is.
But this question of how you take an event and what is its meaning, how
does the layman interpret this, it reached a peak, of course, at
Millstone.  Wayne Lanning, how many public meetings did you do over two
                                         and a half years?
             MR. LANNING:  Just about one every six weeks.
MR. MILLER:  And those meetings, and I attended a number of them, after
the region reassumed responsibility for Millstone, often, of course,
                  involved a general discussion of status.
But a number of individuals would raise -- have in their hand inspection
reports and LERs, with specific questions on specific lines in those
reports, and I don't mean just in a kind of random way, you know, just
sort of go through it.  It was clear that these people had studied these
reports and taken the time to study them, and that, you might say, is a
        bit anomalous, but not terribly so in this region.
I think it's good news and we don't say it in a complaining way, but it
is a huge burden for us.  It's a challenge.  It adds to the challenge of
                                      managing the region.
DR. POWERS:  One of the biggest challenges that I've seen going to
risk-informed regulation comes about when we compare what's
risk-significant with what's in the deterministic regulations.
Clearly, in the area of fire protection, everybody is acquainted with
fire as a threat to a facility.  It's a threat to your home, a threat
where you work.  Everyone is acquainted with the measures that are taken
                           to protect against that threat.
When we do a risk assessment, we come in and say, well, take something
that probably hasn't come up too often in this region, like fire barrier
penetration seals, when you start looking at them from a risk
perspective, you say, well, there are some of them that are important
and there are some that are not so important, and maybe we should focus
our attention on the important ones and let the unimportant ones kind of
                                                    slide.
The guy says, look, they sound pretty important to me and I don't
know cut sets and probabilities from Shinola, but I do know that fire
barrier penetration seals are pretty important; what are you guys
doing, you guys at the NRC, letting them get away with these things
that I know, from my own personal experience, are important.
It seems to me to be a very difficult communication issue, that
because you have this barrier of an arcane technology and you have
risk assessment, you have to communicate in some way in order for them
to understand the context in which you're making the regulatory
change.
Do we have a way of communicating risk information to people?  I haven't
found one yet.  The most effective one that I've found is to point out
what you get as an advantage from risk information, before you start
talking about probability and cut sets, because that is a very subtle
concept to grasp if you don't understand what you're getting from it.
I think that's a challenge we're going to have in this risk-informed
                                                     area.
                                     MR. MILLER:  I agree.
MS. SCRENCI:  Yes, I do, too.  I am going to talk a little bit about
               context of things in just a couple minutes.
Here in Region I, we have two public affairs officers.  As you know, we
report directly to the Office of Public Affairs, but we work very
closely with Region I agency managers to coordinate activities and to
ensure that they're aware of what the public is interested in.
We have a good rapport with the staff here.  We can easily find the
technical information, somebody to explain it to us, and someone to
talk to a reporter or a member of the public directly, and, likewise,
we try to provide as much assistance as we can to the staff, either by
reviewing letters or documents, returning phone calls, or offering
guidance.
In sum, our outreach to the public is a team effort here.  We have
formal and informal ways of keeping the public informed.  Probably the
most visible is press releases.  We put those out to pertinent media on
enforcement actions, meetings, other agency actions.  Typically, we fax
                        those out to the interested media.
In addition, the public affairs officers will often call reporters or
a member of the public to let them know that something newsworthy is
happening, and those press releases are also placed on the NRC's web
page.
Just some statistics, I know everybody likes statistics.  In a typical
week, the news and information page listing the latest news releases
gets more than 9,000 hits.  Specific news releases are viewed an average
of about 450 times in a week.  Just to be even more specific, for the
week ending June 12, the news and information page had 9,448 hits, and
 the top seven news releases had between 368 and 529 hits.
So there is a lot of activity on that page.  It was a Region I press
                  release that carried the lead that week.
Our efforts to communicate with the public and press certainly don't end
with press releases.  The Regional Administrator holds periodic press
briefings to talk to local media about performance at the plants and NRC
policies.  We try to schedule those in the areas where there is
                              significant public interest.
There is an enormous amount of preparation involved on the part of the
technical and the public affairs staff in ensuring that the Regional
Administrator is up to speed on all the possible issues.  Some of the
Commissioners also hold press briefings when they visit a site.
In addition, the region holds dozens of meetings with licensee
management each year that are open to public observation.  We'll put out
a note to editors on those.  There is also a complete list of those
                           meetings on the NRC's web page.
There are also meetings in which we're asking for public participation
and where we're meeting with the public to discuss a particular issue. 
For those meetings, in this region, we'll buy an advertisement to make
sure that the public is aware that the meeting is taking place.
One of the public affairs officers will attend those to assist the
members of the public in understanding what went on.  We'll bring
briefing material.  We'll set up interviews or actually do the
       interviews ourselves to help the reporters on hand.
Another important part of our meeting involvement or the most important
part is helping to prepare the staff.  We'll help them go through a dry
run, ask some of the tougher questions so that they're prepared when
they get there.  Our role for meetings is definitely a support role.  We
     do whatever we need to help the meeting run smoothly.
MR. MILLER:  Diane, I want to interrupt here at this point.  They play
an incredible role here and quite an important role in having us
internally see things as the public would see them, and that's amazing,
and we will revise something because of the questions that are asked by
                                           public affairs.
DR. POWERS:  The changes in perception that occur, especially in the
area of risk information, just astound me.  I guess my favorite story is
John Hearns' story on how people reacted to the findings of WASH-1400. 
WASH-1400 concluded the risks were about the same as riding a bicycle to
                    work every day or something like that.
The public responded, those guys think running a nuclear power plant is
              like riding a bicycle, who are these people.
I think risk communication is going to be -- it certainly is one of
the four pitfalls the ACRS has identified in the move toward
risk-informed regulation that people haven't looked for.  The
Commission has asked us to look for pitfalls and that is one of the
pitfalls.
Of course, then they asked us, well, what should we do about it, and
we're still struggling with that particular answer to it, because it is
difficult, in an arcane industry, to know how other people view you,
          because you get used to using your own language.
So I can fully believe they must realign your wheels pretty good every
                                          once in a while.
          MS. SCRENCI:  We are just too normal to mention.
                                       DR. POWERS:  I see.
MS. SCRENCI:  I do plan to talk about the plain language and improved
writing efforts in a few minutes.  Also, in an attempt to enhance
communications within the region, Neil Sheehan, our other public
affairs officer, puts together a regional newsletter that's very well
received in the region.
Our interaction with our stakeholders also obviously includes the states
and we have two people assigned full-time to state program activities,
the state liaison officer and Duncan White, our state agreements
                                                  officer.
Primarily, they work to maintain a good working relationship with the
states.  They spend most of their time communicating with state
officials in the region, explaining NRC policies and positions,
coordinating training, providing technical assistance, following up on
            incidents and coordinating emergency response.
The state agreements officer also reviews agreement state programs for
adequacy and compatibility and he's also part of the team that reviews
             applications for becoming an agreement state.
            DR. LARKINS:  Those are the IMPEP inspections?
                                        MS. SCRENCI:  Yes.
                                DR. LARKINS:  Assessments.
MS. SCRENCI:  Yes.  In this region, a number of states have their own
inspectors assigned to the reactors.  Many also have advisory panels
that report to various levels of state government.  The technical
staff puts a good deal of effort into interacting with the state
panels, either by providing technical information or attending and
participating in their meetings.
For example, in Connecticut, the state legislature set up the Nuclear
Energy Advisory Council, known as NEAC, in response to the problems at
Millstone.  Those council members range from state representatives to
                          opponents of the nuclear plants.
We have had and continue to have extensive interactions with NEAC. 
Staff members take part in meetings held by NEAC.  NEAC participated in
Commission meetings on the possible restart of Units 2 and 3, and, also,
on lifting of the order for a safety-conscious work environment.
NEAC members observed inspections.  In addition, the staff spent hours
explaining why the agency was doing what it was doing and not doing what
others may have wanted and also explaining the technical details to
                                                     NEAC.
The net effect was that as the panel became more educated, the
opinions of its members were modified.  At both Commission meetings on
restart, members spoke in favor of the staff's recommendations that
the units were ready for restart, and at the April '99 Commission
briefing on Unit 2 restart, a NEAC representative commented about the
good coordination between the NRC staff and the council.  This clearly
is a success story and it's an example of how outreach and interaction
pays off.
Similar efforts are ongoing on a smaller scale with other panels in
                                    other states, as well.
Now, I can't emphasize enough that this effort is not limited to the
public affairs and the state program staff.  That would belittle the
efforts of the rest of the staff.  We encourage the resident
inspectors to meet with local officials and to talk to local groups.
Many members of the regional staff talk to schools and community
groups, either during an initiative or at the request of an
organization.
The inspectors are also encouraged to respond to media and public
inquiries.  Members of the technical staff talk with the public, the
media, and state and public officials at meetings.  Many encourage
members of the public to contact them directly with questions and
concerns.
So we work very hard to explain what we do and why we're doing it.
                         Can I have the next slide, Chris?
Now, having talked about the various stakeholders and their interest in
regional activities, I'd like to now talk about the challenges we face
        and how we are trying to address them in Region I.
Because of the diversity of the audience, it's important to pay close
attention to how we communicate to ensure that we get our message out
to everyone, and, to that end, the region has taken several steps.
The public affairs officers provided training to the technical staff
at the inspector seminar this spring.  We focused not only on the nuts
and bolts of writing, like subject-verb agreement, commas, and the
like, but also on knowing your audience, understanding the big
picture, and accurately telling the story that you're trying to tell.
To assist those report-writers with the nuts and bolts, we've
distributed copies of The Elements of Style and the NRC Editorial Style
Guide.  Both of those are easy to use, little handy-dandy resources.
There is also a group set up in the region to improve the writing.  This
task force is putting together recommendations for improving the written
                                       work of the region.
Perhaps the biggest challenge is explaining technical issues and their
implications in plain English, in a language that the everyday man can
                                               understand.
But we must strive not only to explain the issue, but also to give it
perspective and, most importantly, to put it into context, and we need
            to do that without sounding like an apologist.
Our words and actions must mirror the significance we place on an event
or issue.  So it's a fine balancing act to explain that while a plant's
emergency diesel generators wouldn't work, even if the plant lost
off-site power, the plant could exist on station batteries.  In the
public's mind, that much defense-in-depth should never be called upon,
                     though they do expect it to be there.
So it's important that our explanations be clear and on the mark and
imperative that we not appear to be giving excuses for allowing plants
to operate despite what may appear to be, to the public, flaws that
                       should prevent them from operating.
To address this issue, we've done several things.  The public affairs
officers provide some training to the technical staff on how to
respond to media inquiries.  We sit in on media interviews to help the
technical staff explain the issues.
What's important for us is that we assure that we understand the issues
and we can explain them properly, that we've had the technical staff
give us a good explanation of why something is important and what it
                                 means in the big picture.
And as I mentioned before, we'll help the technical staff prepare for
meetings by reviewing slides and going through the dry runs.  The
region also provides some media training for its inspectors.
In closing, this region dedicates a good deal of effort to productively
interfacing with our various stakeholders, as does the rest of the
agency, and we recognize that our job is to provide information to our
stakeholders in a straightforward manner, to allow them to draw their
own conclusions, after we've put it in perspective, which is unlike
public relations which spins the information to influence the
                                               conclusion.
Despite our best efforts at maintaining communication with our
interested parties, the bottom line is that we'll only be able to
maintain public confidence by doing our jobs well, which is one of the
conclusions of the communications initiative within the agency, and
that's by inspecting, responding to events, and explaining our actions
                                      to our stakeholders.
That's the end of my presentation, if you have any questions.
      DR. POWERS:  Would you like to explain ACRS letters?
MS. SCRENCI:  No.  I'll be honest, I have never been asked.  I have been
here more than eight years and no one has ever asked me to.  I assume,
                if that happens, those go to headquarters.
  DR. LARKINS:  That's because they're written so clearly.
MS. SCRENCI:  That's what it is.  It needs no explanation.
DR. LARKINS:  A quick question.  Now that we're getting away from the
SALP meetings -- usually there was a public meeting at the end of the
SALP process to explain the ratings -- is there going to be something
to fill that void?
MR. MILLER:  There will be an annual meeting at every site, once a year.
   MS. SCRENCI:  We're doing it now with some of the PPRs.
MR. BLOUGH:  Let's be clear.  In the new program, there will be a
meeting at each site that will be annual.  Right now, in the interim
program, after having issued the PPRs, which, by the way, the PAO
reviewed all the PPR letters for us and commented, after -- associated
with the PPRs, we do conduct a public meeting with the licensee on a
                             frequency of every two years.
So right now, we're meeting with those licensees who we haven't had a
SALP meeting or a similar meeting with in the past two years, and we'll
be doing those as long as we're in the interim PPR process.
DR. POWERS:  Let me understand the specific issue.  In planning for this
meeting, we were wrestling with when you have a public meeting, how do
you set up so that the public knows about that meeting and can attend
                                             that meeting?
MS. SCRENCI:  In this region, we do several things.  We'll buy an
advertisement in the local paper, because we can't ensure that anyone
will run a story on a news release that we put out.  We'll actually buy
                                         an advertisement.
We're having a PPR meeting or a meeting on the new program at Salem next
week and we bought an ad in The Salem.  We also put out press releases.
If we don't think anybody is paying attention, we'll call those
reporters.  If we've put out a press release and a couple days go by
and we don't see any news stories and the reporters haven't called, so
it doesn't look as if they're going to come, we'll call them and just
remind them that, hey, we sent out this press release and it may be
something that you're interested in, especially the reporters who
normally cover it.
Our lists for the plants sometimes run to 40 news organizations, but
if only three people actually cover a plant on a routine basis, we'll
call those three, not the whole 40.
DR. POWERS:  Do you invite what we loosely call intervenor
                                            organizations?
MS. SCRENCI:  If there are groups that we are aware of, we'll make sure
that they know that the meeting is taking place.  It varies from site to
site.  With Millstone, there was intense interest.  There was a lot of
phone calling involved, like we called lots of people to let them know
                                     about lots of things.
So, yes, the interested public groups, the local officials, we'll call
                                                     them.
DR. POWERS:  I think you must have been very successful in connection
with Millstone, because of the rave reviews you got from the Senator
                                         from Connecticut.
MR. CRLENJAK:  Let me add one more thing on the PPR meetings that I
mentioned earlier.  We have about six plants for which we have
conducted these meetings in the current PPR cycle, and although we
sent out the meeting notices per our procedures, we still didn't have
the public turnout we would have liked to have seen.  So we're looking
at the possibility of doing a little bit extra for the next ones.
We've checked with some other regions to see what they're doing, the
possibility of a press release, the possibility of a newspaper ad for
those, things we're exploring to try and get a little bit more of a
                           turnout, public interest in it.
MR. SINGH:  Does the public meeting notice go to the Federal
Register, or just to the local news media?
                                 MR. CRLENJAK:  I'm sorry?
MS. SCRENCI:  Meeting notices, formal meeting notices, they go to the
public document room and they're posted on the web page, and then
there's a press release or a note to editors which is something
different.  I don't believe it goes into the Federal Register, but I
also don't believe members of the public read the Federal Register.
                                    MR. SINGH:  I'm sorry?
MS. SCRENCI:  I also don't believe that members of the public read the
                                         Federal Register.
  DR. POWERS:  Sure they do, you must be wrong about that.
             MS. SCRENCI:  Then all the public but my mom.
     DR. LARKINS:  It's a requirement for us, but I agree.
                                    MR. SINGH:  Thank you.
DR. POWERS:  Okay.  Fire protection.  Almost as good as chemistry.
MR. RULAND:  My name is Bill Ruland.  I'm the -- I will be the branch
chief again that covers fire protection here in the next couple weeks. 
I'm excited and a little bit apprehensive about talking about fire
protection; excited because I think fire protection is very important. 
It, in the past, has been neglected and I welcome the interest on fire
                                               protection.
Personally, I always thought it's been very important to plant safety. 
What we know about fire specifically, I think, is not up to what we know
                      about how the other systems perform.
I'm a little bit apprehensive because of the amount of guidance
provided in fire protection -- generic letters, Appendix R, general
design criteria, the fire protection codes of record -- it goes on and
on.  You really need to be a biblical scholar to really understand
fire protection.
So with that proviso, I'm going to talk here, and I don't have my
biblical scholar with me, but I'll try to answer your questions as best
                                                    I can.
    The next slide -- I'm sorry, that slide was just fine.
I'd like to talk about a few items.  First is the IPEEE, the
probabilistic evaluation for external events, and, of course, fire
specifically, which is an internal event, as you pointed out.
I will briefly mention the fire protection FPFI done at Susquehanna and
talk in general about where we see that's headed now that the Commission
has ruled on the SECY paper associated with that.  Finally, I'll try to
talk a little bit about what does all this mean to us now as regional
 inspectors, how are we dealing with all this information.
                                       Next slide, please.
As you can see, all the IPEEEs required by the generic letter have
been submitted.  The first round of requests for additional
information have all been sent out.  There are four approved IPEEEs
here in this region.
As you can see, Susquehanna, where you were yesterday, is one of
those.  It's interesting to note that the letter was finally issued
just in the April-May timeframe, and the Susquehanna fire protection
functional inspection, which was done almost two years ago, used that
information when they did the inspection.
       So that's the status of IPEEEs, and we do use them.
                                       Next slide, please.
We believe, and from what I understand that's how the ACRS feels as
well, that the plant-specific insights are what's most useful, by and
large, when we look at the IPEEEs.  It's awfully difficult to use
these to compare individual plants.  As inspectors, what we use them
for is we grab the IPEEE and go over it, we use our risk analysts, our
SRAs, to help us tease out those insights, and we go out and do that.
We've started this process, by the way, right now.  For instance, at
TMI, there was a routine fire protection inspection.  Our inspector
pulled out the IPEEE.  Frankly, in retrospect, we didn't really get
out of it the information that we would have liked.  We got the
general, you know, these are the important areas; the cable spreading
room, the relay rooms, the switch-gear rooms are important.
                              You know, I mean, clearly --
DR. POWERS:  You could have gotten that without the IPEEE.
MR. RULAND:  We could have gotten that without the IPEEE.  So we need to
go to the next level and tease out what about those areas do we really
need to take a look at, and I'll talk about that a little bit later.
Again, the different assumptions really make comparisons difficult,
those assumptions --
DR. POWERS:  You did that at Susquehanna.  They have some very cool
assumptions that they use in making their screening criteria, some
very striking assumptions that they use for screening that seem to
obviate its utility as an inspection tool.
    MR. RULAND:  They basically assume no operator errors.
DR. POWERS:  They did that and they said we'll look for -- screen out
fire areas based on combustible loading, but we won't count cables, so
you'll never go look at transient combustibles in the cable spreading
                   room.  I mean, you never look for them.
MR. RULAND:  Right.  The cable spreading room at Susquehanna is
screened out.  They're just assuming that there are no fires there.
They also assume that fires don't spread outside cabinets.  There's a
whole list of interesting assumptions, but there are also some what I
would call conservative assumptions.
We just heard, at Susquehanna, that for Thermo-Lag they're spending
millions and millions of dollars and they don't take credit for any of
that in their IPEEE.
So you're right, it begs the question, what's the use, how are we
going to use that.  I will -- I'm going to -- once again, like a
number of people have done, I'll pass the ball to Tom Shedlosky for
the SRAs.  They're going to help us do that piece in the future, and
I'll talk a little bit about our modules.
DR. POWERS:  You know, I know why you want to pass it down to Tom,
because he's a good man, but I think you ought to be yelling and
screaming, get me tools that are of some use for me, because this one is
not.  I mean, I don't know how to use that IPEEE for anything.  They've
                assumed away the things I'm interested in.
MR. RULAND:  You're right.  I mean, I couldn't argue with that.  A
couple things I'd like to -- you know, we can focus inspections and
there really has to be some judgment and, I would say, some broad
judgment given to the inspectors about what we look at and how we credit
                                                     that.
Nowhere has this agency said that we are going to allow licensees to
have their systems in non-compliance with the code of record -- not
yet -- or out of compliance with Appendix R.  All those things still
matter, and we believe they do.
It's just we hope to eventually have some tools that we could use to put
                              our findings in perspective.
Kind of an interesting thing that I'd like to mention has to do with
the new fire protection draft module in the new program.  It has the
SRAs do something called a fire risk -- basically, a risk results
report.  So somewhat like the FPFI, we would go out and do a pre-site
visit.  The SRA would come with us.  He'd take his insights, write
those down, and based on his insights and our previous inspection
results, develop a plan.
I think that's one of the interesting things we've learned about how we
can enhance our inspections.  I think that's the only place in the fire
inspection program that explicitly requires the SRAs to go out and do
                      that.  So I'm encouraged about that.
                                       Next slide, please.
And I'd just say, as we've already alluded to, there is a whole set
of different assumptions that different licensees have made about how
their cables are going to be damaged, how the fires spread, and all
these assumptions can, in fact, affect the IPEEE results.
DR. POWERS:  One of the profound assumptions -- it's not peculiar to
any one IPEEE -- is the dependence between fire severity and fire
suppression capabilities.  You treat those as independent.  I'm going
to get you some real low fire risk numbers, if you'll allow me to make
them independent.  If you want me to get you some real higher --
higher risk numbers, let me make them totally dependent.
                                       MR. RULAND:  Right.
DR. POWERS:  I mean, I can control everything with one cross correlation
number.  Is that a useful tool?  If you gave me control of one number, I
                                  can dictate the results.
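[Editor's note:  The exchange above turns on how strongly fire
severity and suppression failure are assumed to be coupled.  The
following is a minimal illustrative sketch, not part of the transcript
and not drawn from any plant's IPEEE; all values, names, and the
single "correlation" knob are hypothetical, chosen only to show how
one dependence parameter can swing a computed fire risk number.

    # Hypothetical sketch: one dependence parameter between fire
    # severity and suppression failure controls the computed result.
    def suppression_failure_prob(severe_fire, correlation):
        # correlation = 0.0 -> failure probability independent of severity
        # correlation = 1.0 -> severe fires assumed much harder to put out
        base, severe = 0.05, 0.50          # assumed failure probabilities
        return base + correlation * (severe - base) if severe_fire else base

    def fire_cdf(correlation):
        # Hypothetical fire core damage frequency (per year) for one area.
        ignition_freq = 1.0e-2             # assumed fires per year in the area
        frac_severe = 0.1                  # assumed fraction of severe fires
        p_cd_unsuppressed = 0.1            # assumed conditional core damage probability
        cdf = 0.0
        for is_severe, frac in ((False, 1 - frac_severe), (True, frac_severe)):
            cdf += (ignition_freq * frac
                    * suppression_failure_prob(is_severe, correlation)
                    * p_cd_unsuppressed)
        return cdf

    for rho in (0.0, 0.5, 1.0):
        print("correlation = %.1f  fire CDF ~ %.1e per year" % (rho, fire_cdf(rho)))

With these made-up numbers, the computed frequency roughly doubles as
the dependence parameter goes from 0 to 1, which is the point being
made about a single cross-correlation number dictating the result.]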
MR. RULAND:  If you read the caveats in the IPEEE report, they've
basically, in my view, provided bounds on what they claim is being
performed.  I believe the IPEEE succeeded --
                                         DR. POWERS:  Yes.
      MR. RULAND:  -- in doing what it was intended to do.
                                DR. POWERS:  That's right.
MR. RULAND:  The danger is you then use that information in other ways
and you've got to be very careful when you do that, and that's the
message that I've given to my -- the people that are working for me and
will work for me.  You have to be very careful how you do that.
                             Let's see, where are we here.
DR. POWERS:  And that's the problem we're facing with the whole
IPE/IPEEE thing, is that it had a specific objective and that was a very
low, limited, localized objective.  Now, when people come along and they
say now use the insights from this for some other objective, there were
                                some flaws in the process.
       MR. RULAND:  You're right and we do recognize that.
Just kind of a review.  Susquehanna did come up with some results or
changes.  Those are the changes they've made.  If you look at the -- I
mean, the first two really aren't any great shakes; you know, I mean,
ban people from smoking in the building.  But the second and the third
            -- the third and fourth one are real insights.
As a word of note, these were not vulnerabilities.  If you remember,
in the IPEEE, vulnerabilities had a specific definition.  These aren't
vulnerabilities.  The licensee really calls them weaknesses,
enhancements, and the licensee --
DR. POWERS:  Well, the splash guard was actually a fairly subtle one.
                                       MR. RULAND:  Right.
DR. POWERS:  Good integration on the part of the licensee to look at
the length and depth.  Now, am I correct in my belief that they
discovered the need for splash guards only as a result of an RAI, a
request for additional information?
                      MR. RULAND:  I couldn't answer that.
                            DR. POWERS:  I think they did.
MR. RULAND:  It turns out that this particular site was one of the
few sites that they actually went out and visited.  So Research and
NRR went out there to the site.  Again, like you said, if you look at
that, the screening process and the screening criteria were not
included in the IPEEE.  A big part of the report isn't included and
wasn't reviewed.
So our SRA, as I see it, will be able to, during these site visits,
                     examine that a little more carefully.
                                       Next slide, please.
Specifically, the Susquehanna FPFI, it was the second one that was done. 
It consisted of five experienced people, mostly colleagues of Jit, and
two contractors from Brookhaven.  From my view, these folks were some of
the most experienced people in fire protection, frankly, in the free
                                                    world.
     DR. POWERS:  I'm not going to argue with you on that.
MR. RULAND:  I'm not exaggerating.  Pat Madden was the team leader.  He
had Ken Sullivan from Brookhaven, people that had been -- you know,
basically had done these at 50 different plants.  So if you put that in
context, that the quality of the people that were here, it helps you
really understand that the findings, based on the quality, were not
                                           major findings.
I mean, really, this was a stellar group of folks that spent literally
months on this inspection and I want -- the idea of quality people
looking at it is an issue I'm going to touch on a little bit later.
Anyway, the next slide, that's just the scope of the Susquehanna FPFI. 
That was a standard scope, basically look at everything about fire
protection, both licensing basis, code compliance, and the FPFI even
looked at things, you know, what other potentialities are there for
fire, regardless of what the regulations are.  Really, an experienced
                                         group to do that.
The next slide, for findings, and this is just a couple of them.  We did
find that Susquehanna had -- you know, their overall awareness was
     increased on fire protection.  That's almost a given.
This licensee had been working long and hard with NRR directly to fix
this hot short issue, which is the second bullet there, and they had
resolved it and they had known long before this inspection that that had
                                            been resolved.
Certain equipment wasn't ready for -- available to fight a fire,
implement mitigating corrective actions.  This was something as simple
as they went to get into their fire brigade and the boots weren't
labeled and everybody had to kind of fuss around to make sure everything
                                                      fit.
I mean, they still met the timeline that they were licensed to, but
just an enhancement.  The key-fill system wasn't properly -- they
didn't stage the equipment, the key-fill.  There were CO2 testing
issues, physicals, sprinkler placement, you heard that, and
Susquehanna now has a rather extensive problem -- extensive process to
go out and really reevaluate their entire fire protection code
compliance, and I think you heard that yesterday.
                                         DR. POWERS:  Yes.
MR. RULAND:  So that was the FPFI.  And where do we go from here?
And the next slide, you may be aware that SECY-99-140 was approved by
the Commission, which basically accepted it without comment.
So the FPFI, as we knew it then, is no longer going to be explicitly
performed.  It will be part of the supplemental program and we'll be
able to call on that if people have fire protection problems in the
                                                   future.
We will use this as part of our triennial inspection and, to that
end, the triennial inspection in fire protection is really an increase
over the level of fire protection inspection that has been done
previously.
We used to do one week every three or four years and that was it and it
was a routine fire protection inspection.  That's all we could do.
Now we're going to have the residents specifically looking at -- doing
walk-downs, looking at a drill once a year, and every three years, there
is going to be three people that come in and do a focused effort, with
               the SRA's help, using the FPFI as guidance.
And, again, the IPEEE is specifically referenced, with the caveat
that we are going to exercise caution in using it.  And the final
bullet there has to do with staffing.  In fire protection, I believe,
while it's -- it goes without saying that the quality of the staff you
have doing any inspection is very important.  In fire protection,
because the breadth of the guidance and the ability to really get your
arms around all of it is such a challenge, we try to work very closely
with the headquarters folks.
DR. POWERS:  The headquarters folks are putting together this grand
general reg guide.  It's going to take, what is it, four more books that
you've given me on fire protection?  I got all these notebooks on this,
all this guidance that's come out over the years, and it's going to
                      collapse it down into one reg guide.
                             Surely, that's going to help.
MR. RULAND:  I know that effort is going on and that's my extent of it.
MR. SINGH:  I just want to make a comment.  We are having a
subcommittee meeting hopefully sometime after the draft comes out;
it's going to come out in either early September or late October.
          DR. POWERS:  It might be useful and interesting.
MR. RULAND:  I work with Pat -- Pat Madden was on a rotation assignment. 
For everybody, Pat Madden is one of the fire protection people on the
                                    staff in headquarters.
He worked with me at Brunswick on a rotational assignment for six months
and Pat and I have communicated constantly.  Jit helped us immeasurably
                                              really in --
         DR. POWERS:  Don't say that, it goes to his head.
MR. RULAND:  There is no other area in -- I believe, a technical area
where being close to headquarters and getting their technical agreement
is more important, because of the long history of all these issues, all
the license conditions are different, it's just very complicated.
So we're working very, very closely with headquarters and specifically
in the staffing issue.  We've got to make sure that we get the right
                                                    staff.
DR. LARKINS:  One quick question on that.  At one time, the agency had
lost a lot of its fire protection inspectors and they got dispersed out
through the agency and I guess the last few years, there has been some
           move to try to bring those folks back together.
Is there any implication in your last bullet there that there's maybe an
adequate level or shows that your people have the appropriate level of
                                               experience?
MR. RULAND:  A fire protection inspector for the NRC is not like any
other fire protection inspector anywhere else.  We're protecting
equipment, by and large, in addition to people.  So there's a whole
           different set of skills that's really required.
So what we're trying to do is make sure we always have two, not one, or
maybe -- and in our case, we have three people that are not fire
protection engineers, and I will acknowledge that, but these folks go on
-- they all stay abreast of the issues, they go to the different
workshops, and I think you'll see that in this region, more Region I
folks go to these workshops, stay plugged in to headquarters, than maybe
                                       our compatriots do.
I think by and large, we're lucky, we're close, and we do a lot of that. 
And I think the more people I can get involved, the more people that I
can have that experience, if I lose one, because it doesn't take a lot
to do this when you spread it out over three years, if I lose one, I can
put somebody else in line and have -- always have an experienced cadre
                                                 of folks.
But it's really being locked in with headquarters that is, I think,
                                    really very important.
DR. POWERS:  You certainly recognize one of the key features of the fire
protection within the NRC context is it's different than anything else. 
Most fire protection engineers are worried about people getting out of
buildings, protect people.  We're protecting equipment and the ability
                                   to shut down the plant.
And in that community of people, the way you become a good fire
inspection engineer is having to see lots and lots and lots of
situations, and now we're talking about moving toward a more
                          risk-informed regulatory scheme.
We're going to have team inspections, preceded by an SRA-produced
risk results report.  But we already see things we don't like about
the IPEEEs and the question is, how do we do this, in this context,
when we don't have the tools that are going to allow us to do it.
MR. RULAND:  Good question.  I wish I had an answer, because I -- if I
had an answer, I could sell that answer to somebody.  They're working
right now, headquarters, on a significance determination process for
fire protection.  It's based, in part, on defense-in-depth.
DR. POWERS:  And that's not bad, because when you look at
defense-in-depth in a fire context and you look at the structure of a
risk assessment, there is a one-to-one lineup between defense-in-depth
and risk, and fire protection is the only area where that exists.
There is a tension in internal events between defense-in-depth and
risk assessment that doesn't exist in fire protection.
The problem is that risk assessment requires us to quantify some of
those things that we've been doing on engineering judgment in the past,
but we don't have the tools to quantify it.  That's the problem we're
                                                coming to.
And to truncate the effort, we're creating these screening
technologies and approaches that I don't think have a good theoretical
basis to them, and I think it comes to a head when we look at
screening based on combustibles and we don't count cables as
combustibles.  I mean, that's crazy.
MR. RULAND:  One thing -- well, there is some good news, and I think
I can hopefully end my presentation with this.
If you look at the new draft module, it doesn't blindly say go use the
IPEEE.  What it does say is look at what you looked at last time, look
at the inspection record, look at the licensee's performance, and look
      at risk, and then come up with your inspection plan.
There is even an element where, you know, don't necessarily duplicate
what you did the last time.  And by its very nature, when you do that, I
think you can get away from being tied into a blind, you know, let's
                      follow what the risk meter tells me.
So I'm very encouraged.  I think this is the right thing to do in fire
                                 protection, by and large.
DR. POWERS:  I think we both agree that the FPFI approach that they
develop is a very good way to look at fire protection, because it's
soup-to-nuts and you don't have to do it every year on a return basis. 
You do it often enough to make sure the design basis is being preserved,
and that's why you do it.  But when you need -- and things work pretty
well as long as you've got Pat Madden, but I guarantee you he's going to
get tired of doing this after a while, you know, when he's 85.
                                      MR. RULAND:  I know.
  DR. POWERS:  He's going to run out of gas on this stuff.
MR. RULAND:  And it's -- I mean, I did -- the whole idea about
staffing this is very, very critical and all I'm trying to say is I
recognize how critical having the right people out there is, and I
think we're trying to develop some of those right people here in this
region, and time will tell if we're successful.
DR. POWERS:  Speaking of time, Bill's got us back on schedule.
                 MR. SINGH:  He's got us back on schedule.
                                       MR. RULAND:  I did?
DR. POWERS:  You did.  I thought that the 56K baud Screnci was the one
                                  that got us on schedule.
MR. BLOUGH:  Do you listen to KYW Radio?  Diane used to work for KYW
                                                    Radio.
                                MR. RULAND:  Yes, she did.
MR. BLOUGH:  And they tell everything that's going on in the world in 24
      minutes.  So she's actually slowed down quite a bit.
MR. RULAND:  Steve mentioned, well, she's probably not used to -- you
know, where's the teletypewriter in the background.  If you listen to
KYW, they have this teletypewriter in the background.  But, yes, that
was a KYW performance of hers.
DR. POWERS:  In that case, we can take a break until 2:05.
                                                 [Recess.]
DR. POWERS:  We'll come back into session, and we're going to turn to
the senior reactor analyst program.  We've been deferring question
after question, all of which, as promised, Tom is going to answer for
us.
MR. SHEDLOSKY:  Thank you very much.  That's quite a challenge.  My name
is Tom Shedlosky.  I'm one of the two senior reactor analysts in the
region.  Jim Trapp is presently attending a workshop on the new
oversight process.  Jim and I have been attending all of those workshops
and gathering information for ourselves and helping to participate in
                                          those workshops.
   Jim and I have been in this job for almost four years. 
Organizationally, there's only two of us, there's quite a bit to do, as
                you'll see going through our presentation.
We have roughly divided the reactors, informally divided the reactors
between us, each of us backing up the other 100 percent on any problems
           that occur when one of us is out of the office.
              DR. POWERS:  Do you have a PWR/BWR division?
MR. SHEDLOSKY:  Basically, we did that.  Jim had been reactor engineer
at a PWR.  I did a lot of BWR startup work, back when that was popular;
however, I had operated a PWR.  So we've divided along reactor types, but
                        we back each other up 100 percent.
The first slide is an overview of my presentation and I tried to capture
our various functions, that being to support regulatory decisions, with
risk perspective, to evaluate events and inspection findings, to provide
insights for planning to support the new oversight program, and also try
to enumerate some initiatives that we have taken that support those
                                       first four bullets.
DR. POWERS:  And I'd like to amplify your presentation or where you have
an opportunity in the presentation, tell us about what your perception
of the quality of the tools you have for doing the job and the tools you
think you would like to have to do the job the way you would like to do
                                                       it.
                     MR. SHEDLOSKY:  I have included that.
                                        DR. POWERS:  Good.
MR. SHEDLOSKY:  We have been assisting regional management in making
regulatory decisions.  We have been providing written risk assessments
for notices of enforcement discretion, NOED.  These events occur when
there is a conflict between the availability of equipment, possibly
because of an extended maintenance period, a conflict of that
                     availability with license conditions.
NOEDs have also come up on the occasion when licensees have missed a
regulatory requirement and are in conflict with a deterministic
                                   regulatory requirement.
We've also supported the enforcement process in providing a written risk
assessment to the enforcement panel, which is considering the direction
to go in with an enforcement action.  This occurs prior to the formal
                enforcement conference with the utilities.
We have been participating in the assessment process, both the process
that was in place for the old deterministic inspection program and we're
gearing up to participate in the assessment process for the new
  oversight process, the new risk-based oversight process.
That assessment process has taken on a particular form, of course, where
the performance indicator history and the risk-significant inspection
findings drive the assessment process in a more mechanical way than
happened before.
We have been involved in review of the new oversight program from the
very beginning.  Hub Miller has kept us up-to-date on the developments
in that program for going on a year now, and we have commented on
both the program and the SDP process, the significance determination
                                                  process.
We have tried to -- there was a discussion a little earlier on the
presentation of risk assessments.  We have tried to come up with a
format for a plain language risk assessment; basically, discussing the
background of the event or the issue and then formulating a risk
perspective without -- to allow the reader to come to a conclusion or
see where we come to a conclusion without getting involved in a lot of
cut set information.  That information is available as background
                                              information.
When we -- in evaluating events, we look at a wide scope of sources of
information.  The NRC publishes the daily events on the internet and
           also internally on our servers.  We scan those.
The region holds an 8:00 meeting, a morning meeting, a plant status
meeting.  We quite often take events from either one of those two
sources and start to work on them in short order that day.  We are
contacted very frequently by the inspectors, both resident inspectors
and region-based inspectors, and a lot of the residents have begun to
take PRA courses.  So they're interested in the process.  They want to
get involved and get our perspective, either look at things themselves
and then get our perspectives or get our perspectives directly.  So
there's a -- it's a very informal process to get us on the phone or by
                         e-mail to discuss various issues.
We receive management referrals based on events that cross their desks. 
Also, inspection findings or events that happen out of the region.  And,
finally, we review the licensee event reports, which are, of course, the
first -- they are the input to the NRC's accident sequence precursor
                                                  program.
We expect to be receiving the outputs of the SDP process as the new
oversight process takes hold in the field.  More inspectors will be
performing the significance determination process themselves and then
turning over to us what's referred to as the level three analysis, where
something has moved out of the green area or the person wants a second
                                   check of that analysis.
The tools that we are using, we try to use the appropriate tool,
depending on the type of issue that we're looking at.  We do pay
attention to the licensee-supplied information, the information that's
                         reflected in their IPE and IPEEE.
We have collected the risk achievement worth data that was developed by
licensees for the maintenance rule.  That data is typically newer than
the IPE data.  The IPEs are based on analysis that's going on ten years
old now, that typically was run with models that are older.
The utilities, in a lot of cases, have switched to different software,
have upgraded their models, have redone initiating event frequencies and
equipment failure frequencies.  So a lot of that data is reflected in
risk achievement worth data that they have developed and are maintaining
                                 for the maintenance rule.
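For illustration only, a minimal sketch of the kind of hand calculation that
risk achievement worth data of that sort might support; the function names
and numbers below are assumptions and are not taken from any plant's data.

    # Minimal sketch of a hand calculation using risk achievement worth (RAW)
    # data of the kind licensees keep for the maintenance rule.  All values
    # here are made up for illustration.

    def delta_cdf_from_raw(baseline_cdf: float, raw: float) -> float:
        """Increase in core damage frequency (per year) while a component is
        out of service, given CDF(component failed) = RAW * CDF(baseline)."""
        return (raw - 1.0) * baseline_cdf

    def incremental_cdp(baseline_cdf: float, raw: float, outage_hours: float) -> float:
        """Incremental core damage probability over an out-of-service window."""
        return delta_cdf_from_raw(baseline_cdf, raw) * (outage_hours / 8760.0)

    baseline = 4.0e-5   # assumed baseline CDF, per reactor-year
    raw_edg = 12.0      # assumed RAW for an emergency diesel generator
    print(f"delta CDF  = {delta_cdf_from_raw(baseline, raw_edg):.1e} per year")
    print(f"ICDP, 72 h = {incremental_cdp(baseline, raw_edg, 72.0):.1e}")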
We may use that data for a specific hand calculation.  We also use the
NRC's simplified plant analysis risk models, the SPAR models.  Rev. 2 QA
is the current version of that model.  The Rev. 3 models are under
development and I'd like to talk about that a little later under
initiatives, because we've been interacting with Research in a review of
the Rev. 3 models.
DR. POWERS:  Well, I guess the question that comes up when you talk
about the SPAR models, are they sufficiently detailed to allow you to do
                                                  the job.
MR. SHEDLOSKY:  The level two models -- excuse me -- the Rev. 2 QA model
may be, in some cases, detailed enough.  We have participated in a QA
review on-site with Research at Calvert Cliffs and Millstone Unit 2 on
two of the Rev. 3 models, and those models are much more detailed.
They will provide us with support systems, initiating events for loss of
support systems, large and medium break LOCAs, in addition to the small
break LOCA that's in the Rev. 2 model, and also uncertainty data on a
                                  lot of the basic events.
 I believe that that's going to be a big improvement in --
        DR. POWERS:  Do they have the data on any of this?
MR. SHEDLOSKY:  Yes.  Yes, they do.  There's been quite an effort made
by Research and the people at INEL to gather data from the utilities,
and we're doing this by visiting the site and going through each event
tree and each fault tree with the resident inspectors, if they are
available, and also with the utility people, the PRA staff, if they are
available, and comparing a lot of the basic assumptions in the SPAR
                  models with the utility's current model.
And there's a lot of subtleties to swing equipment, to dominant events
that are being analyzed very carefully by the authors of the Rev. 3
models to see that we have a much better tool than we've had to date
                                   with the Rev. 2 models.
              I believe that it will be a big improvement.
Just to digress a little bit, we really don't have a tool, a good tool,
as you had pointed out, for fire protection, an analysis tool.
                         DR. POWERS:  What do you use now?
MR. SHEDLOSKY:  Right now, we're using the data from the IPEEE as
                                              appropriate.
DR. POWERS:  You don't even try to use FIVE?
MR. SHEDLOSKY:  No.  We don't use FIVE directly.  We've walked through
-- I know Jim and I have walked through the utility analysis of FIVE
within the context of the IPEEE.  I have used the NRR's -- the PRA
branch within NRR is developing a fire risk tool that Bill had
referenced, and I had used this in looking at a cable wrap, a failure to
cable wrap issue with Limerick, and you basically enter the model with a
criteria for how much equipment is outside of the fire areas that you're
looking at, how much safe shutdown equipment is outside of that area,
and, therefore, would remain available if there was a problem within a
                                     particular fire area.
You look at the barrier within that fire area, separating it from the
next area, and you look at the quality of the automatic detection and
suppression systems, as well as the manual detection and suppression
                                                  systems.
So it follows a lot of the -- it's a conservative modeling or screening
tool.  It follows some of the processes that are used in FIVE.  It
basically, though, assumes, conservatively, that any equipment within a
                                        fire zone is lost.
DR. POWERS:  And people say that's conservative, and I'm not sure that
                                       it is conservative.
MR. SHEDLOSKY:  Well, it's more conservative than FIVE.
DR. POWERS:  Yes.  The problem with fire is it's no longer a matter of
you have the equipment or you don't have the equipment.  There is an
intermediate stage which can be much worse, which is the equipment
functions, but it functions badly.  It does things that you didn't --
you would not want it to do, because it exacerbates problems elsewhere.
That's one of the hardest problems in modeling fires that there is, is
 that all the intermediate states count.  It's not on-off.
MR. SHEDLOSKY:  Well, as I said, I've used it in looking at an issue at
Limerick.  Compared to the Limerick analysis that was done several
months ahead of time for the very same issue, it is conservative. 
Limerick had -- PECO had basically screened using FIVE to -- for an
issue involving a safe shutdown diesel, a cable wrap issue.
DR. POWERS:  Is the reason that the screen is simply labor-saving?
                  MR. SHEDLOSKY:  In our case, with the --
                           DR. POWERS:  In anybody's case.
                            MR. SHEDLOSKY:  The PRA model?
                                         DR. POWERS:  Yes.
MR. SHEDLOSKY:  Well, our model -- our screening tool certainly is --
it's a simple process to use.  It still is in the developmental stages,
although we've had draft versions available for several months now.  It
doesn't require an in-depth analysis of fire propagation within a zone,
so it certainly is much quicker and easier to use.  You basically
analyze the equipment within the zone and the equipment that may be lost
in an adjacent zone, depending on the condition of the three-hour
                                                  barrier.
And similar to the SDP process, it comes out and gives you a very high,
      high, medium and low ranking for that fire scenario.
So the process is useful.  I look forward to using it in the future and
                                      see how it develops.
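A hypothetical, much-simplified sketch of that kind of conservative screening
logic follows; the factor names, weights, and bins are assumptions made purely
for illustration and are not the NRR tool's actual algorithm.

    from dataclasses import dataclass

    @dataclass
    class FireAreaInput:
        frac_ssd_equipment_outside_area: float  # fraction of safe shutdown capability outside the area
        barrier_rating_hours: float             # e.g. 3.0 for an intact three-hour barrier
        auto_suppression_quality: str           # "good", "degraded", or "none"
        manual_suppression_quality: str         # "good", "degraded", or "none"

    QUALITY_SCORE = {"good": 0, "degraded": 1, "none": 2}

    def screen_fire_area(area: FireAreaInput) -> str:
        """Coarse significance ranking; conservatively assumes every piece of
        equipment inside the fire area is lost."""
        score = 0
        # Less safe shutdown capability available outside the area -> more concern.
        if area.frac_ssd_equipment_outside_area < 0.5:
            score += 2
        elif area.frac_ssd_equipment_outside_area < 0.9:
            score += 1
        # A degraded barrier lets the fire threaten the adjacent area as well.
        if area.barrier_rating_hours < 3.0:
            score += 1
        score += QUALITY_SCORE[area.auto_suppression_quality]
        score += QUALITY_SCORE[area.manual_suppression_quality]
        if score <= 1:
            return "low"
        if score <= 3:
            return "medium"
        if score <= 5:
            return "high"
        return "very high"

    area = FireAreaInput(0.6, 3.0, "degraded", "good")
    print(screen_fire_area(area))   # -> "medium" with these assumed inputs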
As far as tools, we also do not have a process to do level two PRA work. 
That -- I understand that that's underway in other work that's going on
                      with Research and their contractors.
For -- and without a level two analysis, of course, the systems like
containment spray, which protect -- affect containment integrity in a
   severe accident, don't fall into place in the analysis.
For shutdown models, there was some discussion a little earlier about
shutdown models.  Again, we have no computer tools for shutdown models. 
I understand that the SDP process is being developed for shutdown and
                                                low power.
In the past, where we have taken a look at a couple events at shutdown
at Region I plants, we basically consulted a shutdown PRA, the Seabrook
PRA, and constructed an event tree using theirs as a model and a basic
outline.
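As a rough illustration of quantifying one sequence of a simple shutdown
event tree built from a reference PRA, here is a sketch; the top events,
failure probabilities, and initiating event frequency are invented for
illustration and are not from the Seabrook PRA or any plant analysis.

    # Hypothetical shutdown event tree sequence quantification.
    initiating_event_freq = 1.0e-2   # assumed loss-of-decay-heat-removal events per outage
    top_event_failure_prob = {
        "restore_rhr": 0.1,          # operators fail to restore residual heat removal
        "alternate_cooling": 0.05,   # alternate decay heat removal path fails
        "late_injection": 0.02,      # injection before boil-off fails
    }

    # Core damage for this sequence requires every mitigating top event to fail.
    sequence_freq = initiating_event_freq
    for prob in top_event_failure_prob.values():
        sequence_freq *= prob

    print(f"sequence frequency = {sequence_freq:.1e} per outage")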
DR. POWERS:  What happens if a resident calls you up and says, gee, Tom,
these guys have done the wrong calculation for this configuration,
they're going to go into shutdown and they've got two orange categories
and three greens and a white, and I don't think it's right.  I think
they should have -- I think this is a red configuration, can you give me
                                       any advice on this.
MR. SHEDLOSKY:  I think we would talk through specifically where those
oranges were, the oranges and white were, and also talk about the
                    utility's PRA analysis for the outage.
Most utilities are having their PRA group review the outage schedule
in-depth prior to the outage to do a risk analysis of the various
timeframes of the outage.
Basically, the highest risk is prior to flooding up the reactor cavity,
                  normally at the front end of the outage.
    DR. POWERS:  You don't have a tool equivalent to ORAM.
                                MR. SHEDLOSKY:  Pardon me?
DR. POWERS:  You don't have a tool equivalent to ORAM to do an
                                   independent evaluation.
MR. SHEDLOSKY:  No, I don't.  No.  No, we don't.  We would interact with
the utility.  Region IV SRAs have an initiative where they are visiting
every site prior to an outage and going through the outage schedule with
the PRA group, doing an independent review and going through it, through
the schedule then with the PRA group to highlight risk peaks during the
outage, and it's a good initiative and we would like to try to do some
                                       of that here, also.
Continuing on with the tools that we are using, we -- as I said, we have
-- we've been using the SDP process, which is still under development
and shaking out that process, becoming more familiar with it.  We've
                                used the fire SDP process.
We also have an initiative where we have been visiting the sites, we get
into this a little bit, but we have made contact with the utility PRA
staff and as appropriate, we contact the utility PRA staff to try to
determine their feelings and the results of their analysis for an event
                                or a particular condition.
And we've found that the people at the utilities have been very
cooperative.  They want to basically share in what they know of a
particular event, and sometimes a phone call has encouraged them to get
involved in something that's happened at the site that they may not have
                         been aware of prior to that call.
I mentioned the Rev. 3 models and the work going on to develop those
models.  We have tried to involve the utility PRA people during the site
visit, ideally about a three-day visit on-site, and demonstrate the
SAPHIRE software and the Rev. 2 model and we found that a lot of
utilities are interested in that software and interested in obtaining
      the Rev. 3 models when those are available publicly.
DR. LARKINS:  The utility does the analysis of the event, PRA
assessment, and you do an assessment.  Is there reasonable consistency
                                      between the results?
MR. SHEDLOSKY:  We try to find out why there are differences and pursue
that.  It's a good quality check that I use.  I don't want their
analysis to drive my decisions, but on the other hand, I don't want to
be out in left field if obviously an event is going in the other
                                                direction.
The utilities, of course, have the advantage of being able to exercise a
large model and get hopefully a very accurate result in quantifying a
particular event, although quite often a utility will run an off-line
calculation in looking at the probabilities and frequencies of bounding
conditions instead of running the model.  It depends on the event.
But we found that interaction to be very, very helpful and it does keep
                               us -- our thinking in line.
        DR. LARKINS:  Are they usually consistent results?
MR. SHEDLOSKY:  They are not as far off, many times, as we would
suspect in looking -- in using the Rev. 2 models.  The Rev. 2 model
typically is not as conservative as the utility's models, just because
the number of cut sets it generates is far less than the utility's
large-scale models generate, and because it lacks the additional
initiating events, medium and large break LOCAs, and loss of support
systems.
The Rev. 3 models are probably approaching the detail of the NUREG-1150
models and hopefully will give us results that are fairly uniform to the
utility's models.  And when comparing the results of the models, we're
always looking at the delta CDF as opposed to an absolute value of CDF,
because their initiating event frequencies and basic event data is
gathered from different sources in the SPAR models as opposed to the
                               utility's full-scale model.
You wouldn't expect that the CDFs would be in line, the absolute values
of CDF.  But the delta CDFs fall in line surprisingly frequently and if
they're out of line, in at least one case, we found -- one or two cases,
we found errors in the Rev. 2 models, gotten back to the contractor,
INEL, through Research, and made corrections to the models right here. 
We've gone into the model in SAPHIRE and changed that and corrected
                                     some modeling errors.
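A minimal sketch of that comparison, looking at delta CDF rather than the
absolute CDF values; the numbers and the agreement factor below are
illustrative assumptions only.

    def delta_cdf(baseline_cdf: float, conditional_cdf: float) -> float:
        return conditional_cdf - baseline_cdf

    def deltas_agree(spar_delta: float, utility_delta: float, factor: float = 3.0) -> bool:
        """Treat the two estimates as consistent if they agree within an
        assumed factor of three."""
        lo, hi = sorted((abs(spar_delta), abs(utility_delta)))
        return (lo == 0.0 and hi == 0.0) or (lo > 0.0 and hi / lo <= factor)

    spar = delta_cdf(3.0e-5, 7.0e-5)      # SPAR-style result: 4.0e-5 per year
    utility = delta_cdf(1.2e-5, 3.1e-5)   # utility model result: 1.9e-5 per year
    print("consistent" if deltas_agree(spar, utility) else "investigate the difference")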
In looking at the quality and uniformity of our evaluations, we have
been filing the risk evaluations on a Region I server.  It's available
to anyone in the region to access.  We also have contacts within the PRA
branch in NRR.  They have an operations support team which either we can
contact or our regional management can contact if we're not available
                                          for some reason.
NRR has made direct contacts available for new issues, like the SDP
process and the fire SDP process, and we can bring problems and
questions directly to those groups, and they've been very responsive in
    getting back to us and working through these problems.
For instance, the issue at Limerick on the fire SDP, I spent a lot of
time with the author of that process and going through the particular
analysis.  We have biweekly conference calls.  Those involve, of course,
the SRAs in the regions and in headquarters, personnel from NRR, and
     Research also participates in those conference calls.
DR. POWERS:  Who participates in Research, is it always the same person
                                    or somebody different?
MR. SHEDLOSKY:  Ed Roderick, who has been -- and, of course, the ex-AEOD
organization is very active in those phone calls, also.  We share recent
analysis, recent problems, events of interest, and then also distribute
the risk analysis, either analysis that's done by traditional methods,
computer analysis, and lately there's been a lot of sharing the results
        of the SDP process, as that process is developing.
We have counterpart meetings once or twice a year, depending on activity
levels and schedules, although the workshops that have been along the
line of the new oversight -- for supporting the new oversight process
have given us an opportunity to get together and to share ideas.
                  MR. MILLER:  Tom, what slide are you on?
MR. SHEDLOSKY:  We're moving on now to the providing risk insights for
                                      inspection planning.
                 MR. MILLER:  Okay.  Why don't we -- okay.
MR. SHEDLOSKY:  I think I've covered most of the information on that. 
We are looking at either recommending inspection of equipment or
operations to avoid initiating events for equipment failures or looking
      at a static situation in planning a team inspection.
Moving on to the next slide.  The new oversight program, we have been
supporting the staff in the use of the SDP process and calculation of
performance indicators.  We have -- the level three analysis that falls
out of the SDP process would come to us and we've had a few of these
analyses that we have been working through.  We've also been supporting
                                   the assessment process.
Initiatives that we've been involved in, as I said, we've been assisting
Research in the review of the Rev. 3 SPAR models.  We have a
process to coordinate significant issues within the region.  We have an
outreach effort to contact the utility PRA staff, visit with them in
their office, gather up information from their latest computer models.
We have an initiative to visit the resident sites and to give them some
of the PRA tools.  Many of them have been to training classes and they
appreciate having the software and the computer model available on their
                                    own PCs in the office.
In the event of an issue, we can then walk through the computer analysis
on the phone and take them through a real analysis from here.
We have been testing the SDP and the fire SDP process in the region, and
these initiatives really have come about -- they're issues that we find
                   important to help us do our job better.
MR. MILLER:  I didn't like the idea of an SRA position six years ago, I
didn't support it, and, of course, I supported it when it was funded,
but I have been proved, I think, wrong.  I mean, I think this position
    has been invaluable in making this more than rhetoric.
DR. POWERS:  I would think your reaction is probably typical of most of
the regional managers, that there was a suspicion about the SRA position
and then they found out, first of all, they picked excellent people to
    be SRAs and the SRAs are earning their keep very well.
What I'm just continually distressed by is when I compare the levels of
computer technology available to our SRAs with what the licensee has. 
The licensee was also very suspicious about risk analyses when first
exposed to them, but he's buying into this and he's equipping his people
with very superior technology and it's getting better all the time, and
I think we're getting into a mismatch in the computational tools that we
have available, the level of detail, the ability to do uncertainty
analysis, the ability to respond to unusual configurations or unusual
situations, it's just putting --
I mean, people compensate by -- they're smart guys, they can
interpolate, they can extrapolate and things like that, but they don't
have -- so that it's routine.  And the idea is the SRA is the leading
edge of the wave, that eventually this technology now has to be
distributed out to the residents and they're going to be called on to do
                                       more of this stuff.
And if you don't have the kinds of tools that you're going to be
comfortable giving to the residents without a lot of hand-holding, I
think this is just something that the agency is going to have to
recognize.
They're getting ahead of their headlights as far as some of this
technology they're making available here.  I'm just getting very
                                     concerned about this.
MR. MILLER:  There are a couple of things.  First of all, we look to
                  these individuals to not only run the --
                                        DR. POWERS:  Sure.
MR. MILLER:  -- run the numbers, but also to be informing the rest of
us on the limitations and the uncertainties and the limits really of
this, so that we're not, any of us, making decisions based upon what
appears to be a real simple, neat answer for something.
I think that there is real danger in that.  There is another issue
that's out there, and we are predicating much of our programs on the
information that comes from licensee assessments, as well as our own. 
There are still -- there's still wide variation and variability, in my
mind, in the quality of the work that's being done by licensees who are
                                              ahead of us.
I'll give you an example, and I've asked my SRAs to look at this. 
Calvert Cliffs stands out very high.  Now, when I've been to Calvert
Cliffs and I've pressed them on this, I got an interesting answer, and
it is partly that we think our assumptions are more realistic than other
plants.  We've had a number of situations arise, enforcement being one
of them, where they stand out, and I'm always nagged a bit by whether
this is a real situation, whether the real risk at Calvert Cliffs is
higher.
I'm not talking just at the baseline, but even when you look at the
delta; and how much of this is punishment for good behavior and how much
                                     of this is real risk.
We deal with this and it's -- I hope we're not out ahead of the
headlights and I think -- I look, again, at the SRA function as a
function that gives us good insights, just for the analysis that they
do, but we just simply need this to help know where the limits are, if
                                  that makes sense to you.
DR. POWERS:  I know that I have the privilege, or the pain, of working
with some of the true PRA fanatics in this world, the real believers,
and the first thing they'll tell you is, no, do not take the numbers and
                    act on them without interpreting them.
The human component of thinking and analyzing these numbers has not
           disappeared just because you can generate them.
The other thing they will tell you is the PRA can do a lot of
integration that you can't do in your head, but it's very dependent on
sets of assumptions, success criteria being the number one assumption
that gets made, that dictate what the outcome of these numbers are.
And it doesn't take long to find out that you can pick the critical
numbers in a PRA fairly easily and if you give me dictatorial control
over a small set of them, I'll get you the answer you want; you just let
 me know, if you want ten-to-the-minus-ninth, I'll get you
ten-to-the-minus-ninth, if you give me control over probably less than a
handful of numbers that are hidden in the assumptions, and usually
                            fairly well hidden in the PRA.
So assumptions do make a difference.  I think that's why you see a lot
of emphasis in PRA land in what changes when you make this change to the
plant, rather than paying attention to the bottom line number.
I'm reminded all the time, the bottom line number, we worked damn hard
to get it, so you can't ignore it.  But it is a real problem in knowing
whether this, as you say, punishment for good behavior or real risk;
similarly, the ten-to-the-minus-ninth what we get at Susquehanna; well,
I'm pretty sure we're all convinced that that's a figment of the
                                              assumptions.
But there are real issues that have to be addressed yet in this process
and the problem that I think we're going to get into is when you get
into a disagreement with some of your licensees in risk space and he can
outgun you; he can come in and say, well, I did this analysis with this
super computer code and you don't have the ability to do your
      independent analysis, you're going to get outgunned.
MR. BLOUGH:  Right.  At the resident level, things are improving with
the SRAs available and improved training for the residents, but we're
still relying on the licensees' tools.  And when there is a clear case,
                                           we can prevail.
For example, at Ginna, they were doing work on underground cables
associated with off-site power and the licensee has a risk monitor and
the answer from the risk monitor was no change, zero change in core
damage frequency, and the resident said, no, that can't be.
                                    DR. POWERS:  Can't be.
MR. BLOUGH:  It can't be.  So it's an obvious case and, of course, the
licensee went back and found that this activity somehow just wasn't
properly modeled.
MR. SHEDLOSKY:  In the risk monitor.  As I understand it, the full PRA
model did accommodate off-site power, but the risk monitor had
                                            simplified it.
MR. BLOUGH:  Here is a case that's so clear.  I think we're making good
progress, but if we were a little further along in our own tools, we
might be able to get to the less clear cases a bit better.
MR. SHEDLOSKY:  And just a word of explanation.  The Calvert model, in
discussing their philosophy with the PRA staff last year, they seemed to
embark on a program to identify additional scenarios that do account for
a large amount of risk at Calvert Cliffs that other utilities may not
                                       have accounted for.
DR. POWERS:  Well, we see that a lot and we see this a lot with the
induced station blackout plants, that you're going to leave scenarios
out, and your risk drops down pretty good when you leave a few of the
               key scenarios out.  That's a real good one.
That's why independent analysis becomes so important in this area and
                          standards become very important.
MR. MILLER:  As I said, we have a conclusion here, I have become a true
                     believer at least in the SRA program.
DR. POWERS:  However, I think the agency can be congratulated on a
universally high quality of SRAs in the regions.  I'm just always
impressed with the kind of people we have got in those positions and how
much of an impact they're having on the rubber-meets-the-road product
that comes out of the region.  It's very exciting, actually.
MR. MILLER:  A terribly interesting job and one that I aspire to,
                                                  frankly.
DR. POWERS:  You haven't been an SRA?  You've been everywhere.
MR. MILLER:  Thank you.  The next area is the area of the new program,
                        if we're ready to move on to that.
                                       DR. POWERS:  Right.
                 MR. MILLER:  You've been briefed on this.
           DR. POWERS:  We've lived with this for a while.
MR. MILLER:  In fact, I'm just told -- in fact, Randy handed me a copy
over noontime, the SRM has finally been released, I guess.  Maybe you've
gotten a copy of it previously, but the Commission has finally sent down
                         their action on this new program.
So it continues to evolve and we've worked exceedingly hard on making
this program come out at the right spot and also very hard at doing
things to, when it's all done, be in a position to implement it in a
           timely way and to implement it with conviction.
I think Randy will walk us through a number of slides here, and then
                  I'll just kind of kibitz as we go along.
MR. BLOUGH:  I'm not sure if I'll walk you through in great detail of
these slides, so what I'll probably do is kind of give you the bottom
line for the slide, some of the bases on the slide, and I have other
amplification and, in fact, other details that I could provide, but I
                           probably won't, unless you ask.
We are embarking on a major change here and like I said, we're
endeavoring to have extensive field involvement in all aspects of
program development and evaluation.  On the slide, there are just some
bullets.  We have been involved really in everything and we've attempted
to put some of our best people on this program, and we have endeavored
             to comment every step along the way, as well.
Having said that, we are moving right along and things are moving
rapidly and developing as we speak.  I mentioned this morning that we saw
your June 10 letter and now today we're reading the June 18 SRM from the
Commission on the program, which endorses going ahead with the pilot,
says that April of 2000 appears a reasonable timeframe for full
implementation, and then provides a lot of amplifying guidance, some of
which would be very significant for us to digest based on where we are
                                                right now.
                                               Next slide.
We're -- change management has been a challenge for the agency in the
past and this is a very big change.  We've done a lot in the region to
really help people and help ourselves with the emotional, the
intellectual and the practical aspects of leading change and being
involved in change and living through change and helping change to be
                                               meaningful.
The initiatives I have listed on the slide there are incomplete.  There
are additional ones.  In addition, most of these initiatives, but not
all, are linked to what's
going on in headquarters and the things we're involved with in
headquarters.  And the initiatives we have beyond what's linked to
headquarters are all in concert and consistent with the overall program.
We've had a lot of senior management support, they have been very much
                                                 involved.
MR. MILLER:  I've got a list here, in fact, just to give this to you. 
You probably have more paper than you need, but it's all of the
different interaction sessions that we've had starting back in July of
last year, when we had an all-staff meeting, where we had, in fact, a
long discussion on why are we changing, do we need to change, and we
were working to get beyond the stage where the answer to that question
was, well, Senator Domenici and the Senate's proposed cuts have caused
us to change to discussing the fact that the industry's performance has
                                   improved significantly.
That our old programs, as good as they were at the time, for the time,
and they grew up in an ad hoc fashion, and so stepping back and looking
at things from fundamental principles is bound to lead to a better
        place, we started to get better answers like that.
So we worked initially very hard and this is a list of things that we
did early on to -- forgetting about what the exact program is, is there
a real need here to change and why, and I think fortunately it didn't
take long before the inspectors were saying enough of that, please stop
                   talking about it, let's get on with it.
MR. BLOUGH:  Yes.  Throughout this explanation of the reason for change,
we've been careful not to denigrate what we've done in the past.  The
need for change is linked in many ways to success.
                                       Next slide, please.
We're actually just into the pilot now.  We have three pilot sites in
Region I.  We got an extra one, because Salem and Hope Creek are
collocated sites under the same licensee, and it made sense.
We have a complete manual -- or a nearly complete manual, I should say, of
the guidance for the pilot program and we've got really extraordinary
measures in place to communicate, coordinate and evaluate and ensure
good feedback during the program; in part, recognizing that during the
pilot, some of these inspection procedures may be done only once per
                                  region during the pilot.
We just completed, last week, the first specialist inspections.  We have
several branch chiefs here who can amplify, if you have questions about
what happened during the inspections we've done so far, and we've got a
mechanism in place to coordinate with the other regions and with
headquarters throughout this.  So it's a rapid pace of change and we've
got really extraordinary measures to coordinate, but given the magnitude
                    of what we're doing, that's necessary.
                                               Next slide.
This slide continues to talk about the measures to apply oversight of
the program and they are many faceted.  On the previous slide, you saw
pilot oversight panel.  This is the region's effort to provide some
augmented oversight to the inspectors and first line management's effort
to make sure this program is done well and we learn everything we can
                                                  from it.
There's a number of agency-wide efforts, including the SDP panel that
Glen Meyer and our other SRA, Jim Trapp, are on.  The Regional
      Administrator is very involved agency-wide actually.
MR. MILLER:  Unfortunately, Jim couldn't be here today, but we neglected
to mention him at the beginning.  He has figured very prominently in this
process and we're dedicating a lot of his time to make sure that we're
                              staying close to this thing.
MR. BLOUGH:  NRR has assigned points of contact for each region and
subject matter experts in each inspectable area, and also within NRR,
the various specialist counterparts in emergency planning, security,
and health physics are devoting a lot of effort to this and there is, I
think, now a better link between inspection program branch and
headquarters and the specialists.  I think I mentioned that earlier.
                                               Next slide.
                        DR. POWERS:  Cross-cutting issues.
MR. BLOUGH:  Right.  These slides talk about issues and change
management challenges.  This is a huge change management challenge.  We
have listed some of the issues there.  Certainly, what we do within this
risk-informed baseline inspection program is key to it all.  We are
trying that out now and trying to get a good evaluation of the scope and
size as well as a good tryout of really what it takes in terms of
                resources to conduct the baseline program.
That's been a matter of some controversy, because early on, there were
first-guess estimates put out for what it would take to do the program. 
Those estimates took on a life of their own and the pilot program is
really to try to -- the SRM from the Commission reemphasizes that to us,
                 that we've got to find out what it takes.
A challenge as part of that is to divide out the startup effort from
what we think it will take on an ongoing basis, and we're working
                             pretty hard on that, as well.
And then, of course, the integration of performance indicators and the
verification of performance indicators is also a challenge for us.  It's
new and, just for one, verifying the performance indicators will be a
                              fairly unique effort for us.
                                   Then, of course, the --
DR. POWERS:  I'll be darned if I don't think you're going to verify the
ability of the performance indicators to accurately assess the
licensee's performance.  I'll be intrigued to see how that is -- how you
                     derive any confidence in that at all.
MR. MILLER:  I think this goes to the question of what we expect to
accomplish in the pilot and what it is, frankly, it's going to take a
longer time to make judgments about and ultimately it's performance of
                the industry and of licensees that counts.
And the proof will be in how they perform over time and I think that
most of us feel like, especially with the extension that the Commission
has given us in the pilot, that there is a lot we can learn in the
pilot.  I feel as though we will learn enough to know whether we should
   take that next step or not and broaden it to all sites.
But over the long run, you know, what's the right level of -- exact
right level of inspection and are these indicators the right set of
indicators will best be judged by a period of sustained performance
                                              observation.
DR. POWERS:  I think you're right.  I think trying to say that these
performance indicators give you the same impression that you got from
                           the past, it's a futile effort.
The question is, does it give you insight on the plant and you're going
to see how performance is holding steady, improving or degrading with
sufficient advance notice that you catch it before it becomes a problem.
                                       MR. MILLER:  Right.
DR. POWERS:  That's the telling and that's going to be decided in 36
                                          months, not six.
MR. BLOUGH:  One of the things we really need to do in this process is
we're doing the pilot program now and as we understand it, we'll gather
data for six months, the pilot program.  After the six months is over,
we'll, of course, continue in that program with the pilot sites, but
we'll also be in the process of rolling up what we've learned thus far,
presenting it to the Commission and making a decision on full
                                           implementation.
But from that point on, we still need to be incorporating the lessons
learned from the pilot sites and we need to just be lighter -- I call it
lighter on our feet in the future in terms of being better connected and
making continuous improvement to the program and getting consensus and
    moving forward on improvements when they're suggested.
                                               Next slide.
Cross-cutting issues is, of course, a big issue.  Only the corrective
action program is inspected directly.  The theory is that the
safety-conscious work environment and human performance would show up
elsewhere in the program without being directly inspected, and this is a
major change for us and it's an area that still has to be studied, and
the SRM that we just read talks about additional work needing to be done
to look at the aggregation of a number of smaller issues and that sort
                                                 of thing.
But this is a challenge both from deciding how well it works during the
pilot program and also coming to grips with it in terms of change
                                               management.
MR. MILLER:  If I could give one example.  At Indian Point 2, we talked
at the very beginning about how one of the things -- me personally, I
know, and I believe my colleagues feel as well -- something we're proud
of is our efforts to bring to light issues at that station so that the
licensee could deal with those.
And one of the things that we saw was a pattern of engineers going out
and coming up with the right answer, the right answer being one that
tended to not impact on operations.  I think back just to one example in
particular and it had to do with testing problems that the licensee was
experiencing with discharge throttle valves on aux feed, and they were
 not opening.  They were having trouble with those valves.
The rationalization at the time was, well, we're doing this testing
without pressure on the discharge of the pump and we're certain that if
there is that discharge pressure there, that those valves will stroke
                                                 properly.
They would -- after some manipulation, would retest the valves and, of
course, after some manipulation, they would work fine.
We saw numerous instances of this.  Now, that's a fairly significant
system at that plant.  It's the highest risk-significant system at that
plant.  But still, it wouldn't rise to the level of being a significant
-- an SDP hit.  But we saw examples, and I can give you numerous other
                   examples, and there is a pattern there.
Ultimately, the licensee came to this in their own SALP assessments and
saw this problem and has been working to deal with this and to change
  around that attitude to be more questioning and probing.
The issue in the new program is to what extent and how and when does NRC
interact with the licensee if we see a pattern like that.  Now, we're
expecting their corrective action program to pick up on it, and that's
very appropriate.  One of the good things in this program is it puts
the spotlight where it ought to be, which is on their corrective action
                                                 programs.
But there still lies a question of how we, from our perspective, what
level do we interact with the licensee and how do we interact, where we
       see a pattern that we don't see them picking up on.
DR. POWERS:  We've had other people tell us exactly the same thing,
classic, sloppiness in handling radioactive resins.  You see a pattern
of that.  You say there is a radiation protection problem at this site
because I see it showing up and I think -- nothing very significant,
just sloppiness and what not, and in the past, you'd interact.  Now you
don't.  Am I risking a bigger problem because now I don't interact with
them over this sloppiness that I see and just poor planning and
                                       execution of a job.
It's the identical question that comes up over and over again and in the
past, we would have written something up on this, something would have
happened to it.  Now, we don't write up anything on it.  I mean, this
one didn't even make an NCV.  But it's a question, the way NRC interacts
and brings things to the licensee's attention that they have in the
    past, and maybe the licensee has grown to depend upon.
One way to look at inspection is it's a lot of free consulting or a lot
of consulting you pay for, but you have to pay for it anyway, one way or
                                                the other.
MR. MILLER:  That's the good news in the new program.  It makes real
clear that the spotlight is on them and not on us, and I think there is
a lot of merit to upping the ante, if you will, on us to raise issues in
this area, to have a stronger burden of proof on us, because we -- I
                   think that the focus has to be on them.
MR. BLOUGH:  On the issues and challenges there, I think probably the
easiest one is the second bullet on that slide that talks about
development in the pilot of supplemental inspection procedures.
We have a baseline inspection program, but we need procedures that are
kind of a generic procedure that when a licensee gets white indicators,
for us to do inspection follow-up into their root cause analysis and
  corrective action, extended condition corrective action.
Likewise, we need procedures for a focused team inspection when
individual cornerstones turn yellow and then we also need basically a
diagnostic type procedure for red indicators, and those all are yet to
                                             be developed.
But we have assigned staff to work with the other regions and
headquarters and that work is going to happen this summer.  So that's
                    probably the easiest one on that list.
                           MR. MARR:  Can I say something?
                                        MR. BLOUGH:  Sure.
MR. MARR:  You brought up the corrective action process and the regions
have a strong voice in trying to get something addressed there, because
the headquarters people who were going to develop the supplemental
inspection procedures were going to develop them based strictly on color
                                                  changes.
Well, the corrective action inspection, it can't change to anything
other than green, because it's based on old issues.  The reason we kept
banging away on this is that we still have an opinion that we need an
inspection procedure to tell us or allow us to analyze their
corrective action process in greater detail than the baseline does if we
                                     see a weakness there.
                                        DR. POWERS:  Okay.
                                  MR. BLOUGH:  Next slide.
                         MR. MILLER:  That was Steve Marr.
MR. BLOUGH:  Continuing with the issues and change management
challenges.  The job of the inspector is changing and, in addition, the
type of expertise we need is changing to a degree, as well, and in terms of
looking at the inspector profile, the resident inspectors' jobs have
changed in nature.  We also need to look at the inspection program as a
whole and look at the type of resources and talent we have.
The program includes a vertical-slice type engineering inspection and
the regions are talking now about what sort of expertise we have in the
various regions to meet that, and certainly the fire protection
inspection is a big issue, as well.  So these all need to be dealt with.
And as the Chairman said, we all recognize outreach to -- education,
involvement and outreach to the stakeholders is a big challenge
associated with this, and we're cognizant of that and we're doing maybe
more than we have done before, but we need to keep considering whether
we're doing enough.  Maybe we need to do more in that area.
As I said, we are into the pilot and we're now starting to wrestle with
the type of issues you come up with when you're trying to do a different
             inspection than what you've done in the past.
And with Salem, the first week in the new program, they had a problem
with the containment fan cooler units and they were looking for an
extension of the LCO, which means they needed a notice -- they were
going to request from us a notice of enforcement discretion.  Of course,
our procedure for that has existed for a long time and that prescribes
a fairly intrusive NRC involvement in what the licensee is doing, to see
if the review request is justified and what they're doing in the plant
is reasonable, they have a success path, and that sort of thing.
So we really dealt with that the first -- we dealt with the issue the
first week, but we're still kind of sorting out what it means for the
new program.  And I mentioned a couple other things we saw.  At Hope
Creek, they had service water out of service and we found some problems
with the LCO maintenance plan in terms of prescribing what they would do
to keep other -- to keep the risk profile from getting worse.  But, of
course, there was, again, it was human performance and there was no --
they didn't actually take any additional equipment out of service or
violate any of their -- they didn't create any increased plant risk.
They just didn't have in place all the barriers that they should have,
   by their program, to keep it from going that direction.
So these are things we've made issues in the past, but they don't fit
real well into the significance determination process and they tend to
be human performance cross-cutting issues.  Likewise, at Fitzpatrick, we
were watching drills in the simulator and, of course, now you're in the
simulator, not the plant, and we saw a number of cases where instead of
going methodically through the emergency operating procedures, the
operators kind of jumped to the answer and they jumped to the right
              answer, but still they jumped to the answer.
But you know a lot about the new program, so you can see how we're
dealing with this within the context of the program.  It gives us all a
                      lot to think about and work through.
Do any of the branch chiefs want to add more about what we've seen in
                                         the pilot so far?
MR. MILLER:  You know, when this first came up and when this was being
developed by a number of people who were, I'll say, closeted, and that's
very close to the case, who were sequestered, a better term, in
Washington, and we had some of our people down there, they worked very
hard and then out came a product in a very short period of time, a
                phenomenal piece of work, just incredible.
And the first thing that, of course, caught people's eye was the
indicators, and one of the ones at the top was the number of scrams, and
the first reaction of virtually everyone when they saw the indicators,
with 25 and ten and four, when people are experiencing one or two per
year these days, was one of, my gosh, we're going to be completely
marginalized and we're giving away the store, and there were a lot of
reactions like that, which were clearly understandable and not
surprising.
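For illustration, one way to read those 25, ten and four figures is as
thresholds between color bands for the scram indicator; the exact band
definitions in this sketch are an assumption made only for illustration.

    def scram_indicator_color(scrams_per_year: int) -> str:
        if scrams_per_year > 25:
            return "red"      # substantial safety significance
        if scrams_per_year > 10:
            return "yellow"
        if scrams_per_year > 4:
            return "white"
        return "green"        # within the expected performance band

    print(scram_indicator_color(3))   # three scrams -> "green"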
And one of the things that we did was, again, we encouraged people to
express those concerns and questions and early on I created this
document, and I'll give this to you.  It's kind of a one-pager, and it
is something I did at the beginning of the year, and it's called "things
                           to like about the new program."
And I genuinely feel that if you just look at it superficially, you
could come away with a concern that this is going to be an inappropriate
thing.  But it frees us up, it offers great potential, of course, to
free licensees up from things that just are not providing value.
And the second bullet, and I won't walk through all of these, but the
second bullet is an important one.  It creates a strong environment for
risk-informing our inspections, and we can say, as I have said for many
years, gee, we're informing our inspections with risk perspectives.
But it's very interesting to have some of your best inspectors who
participated in this go back and early on start to, in their mind,
inspect the way the new program would cause them to inspect and these
are, again, our best inspectors, who you would assume would be top in
terms of focusing on what's most important and being aggressive about
that, come back and say, you know, I will do things differently in the
future and not just because the program tells me to, but this will lead
to a different way of doing business, one that will really lead to us
focusing more on things that count.
                                       So that's positive.
One of the early reactions I got was from one of the people who has been
with Wayne toiling in the trenches in Connecticut on Millstone, and
after Millstone Unit 3 started up, there were a number of scrams and
several of us -- and at this point, I was back in the business of having
to explain this -- in the public, there was a certain outrage over this;
that, see, this proves that it was a bad decision to permit startup of
                                               this plant.
And I was in the curious position of, on one hand, sort of beating on
the licensee and pressuring the licensee to tell us what they were going
to do to improve this performance, and, in the next breath, explain why
                           it wasn't the end of the earth.
And one of the individuals, as I've mentioned, who was involved in this,
said, boy, I don't want to go to Connecticut with those numbers in the
performance indicators.  And my response was it's the opposite.  This
will help us -- either the Commission can defend those numbers in risk
space or it can't.
If the Commission can defend those numbers, then it sure makes our job
easier as we go to Connecticut and we're pressing on the licensee and we
won't stop doing that, but it allows us to put into perspective what
      this means when there are three scrams at Millstone.
It is something that we're agitated about, but for the person on the
street who wants to understand where this lies, without understanding
cut sets or anything else, if 25 scrams is what it takes to get to
something that's really significant, where you're really worried about
safety, and now you're talking about three, it just helps.
DR. POWERS:  It's the classic statement.  I don't know what a picocurie
is, but ten of them sounds like a bunch.
                                         MR. MILLER:  Yes.
DR. POWERS:  Okay.  I don't know what a scram -- the guy probably
doesn't know what a scram is, but if somebody tells him 25 is a bunch,
then three is not.  That's what you're saying.
MR. MILLER:  Yes, and that's exactly what I'm saying.  So this -- and I
could go on about consistency and a number of other things.
We are enthusiastic about moving forward.  We think that this will lead
to overall better regulation; at the same time, we've raised issues here
not with the idea of telling you that this program is no good or that it
won't work, but rather these are the things that we think that the
agency and certainly we in the region will be focused on as we move
forward to make sure that we come out at the right spot on these issues.
DR. POWERS:  I mean, you have to understand, the history of the project
was exactly as you described it.  They spent, what, a year or a year
and a half really flailing around, and then they got a direction, they
sequestered, they built this document, came out, and it was a
fantastic, lovely document.
But it was done in a very short period of time, and when they were
done, the document says, we've left out lots of things.  And what
you're finding is lots of things that don't quite fit, where you don't
see how what you've done in the past, that seemed like a good thing,
fits into the new scenario, and it doesn't tell you -- it doesn't tell
you to drop it, it doesn't tell you to include it, it doesn't tell you
what to do with it.
And I think that the strategy here is, in fact, that they're trying to
implement quickly, and so they said let's go to the regions and try
this thing out and find all those things that we've left out, rather
than continuing to hold this document and keep rewriting it and try to
dream them up here, because we'll never get them.  We'll still have
lots of things that come out.
So I don't think you need to apologize about finding these things and
raising the question, because I think they're looking for that -- I
mean, that was the whole strategy.  Those guys could have kept
rewriting that, like 50.59, which we're rewriting over and over and
over again; to finally wring everything out would have taken a hundred
years.  This way, they're putting it out for use immediately and
they're looking for people to say, whoa, what do I do about this and
what do I do about this, and maybe without an answer.
MR. MILLER:  And I think that that was a wise strategy overall for the
Commission.  It was a fast-moving train and, as Randy said, we
commented as we went along, but honestly, a lot of the commentary was
put off until let's have this thing developed, let's get into the pilot
program, because that's where we would really be able to have a lot of
the meaningful sort of input.  That's not to say that we didn't have an
enormous amount of input as it developed; there were people sitting on
these groups.
Let me give you an example, though, of a very, very tough area for us.
It's easy to put these issues up here.  Let me go back to that first
one, Chris.  Go back to the first slides that Randy had, back three,
scope of the program.
The inspectors, for every inspection, will have a questionnaire to fill
out, and it has a number of questions, but there are two really
important questions.
One is, tell me -- give me feedback on the performance of this thing as
it's written.  And then there is another question, which is perhaps
more significant, much harder to answer, and that is, does this meet
the intent of the broader process, of the cornerstones?
And what this is going to require is for us to be extremely astute here
in the region; that's why we're investing an enormous amount of time in
the management oversight and in the oversight group that's been formed,
as the inspectors come back, because to answer that question, one has
to understand the overall framework, starting with the cornerstones,
working through and understanding the performance indicators, and
understanding the broader objectives beyond just the objective of this
narrow inspection procedure.
Because we won't be able to answer that question unless we take that
broader perspective, and that's a very challenging and difficult thing
for us, but it's one of the most significant questions we have to
answer, because the questions of what's an adequate program, what are
an adequate scope and scale, and what are adequate resources are going
to be far more driven by the answer to that question than any others
that we will be answering and addressing.
We're keenly aware of that here in the region, and I think our
counterparts are, as well.
The last point I will make to you is that this is going to require an
enormous amount of intellectual honesty and integrity as we move
forward.  My sense is that all of the senior managers, from Dr. Travers
and Sam Collins on down to the regional administrators and the people
here, are intent upon coming up with the right answers, and I trust
that, with the number of people involved and with the give-and-take
that will occur over this next six to nine months, we will be able to
come up with good answers.
Maybe at the end we'll end up with some questions and uncertainty, and
I'm hoping and trusting that we will stand up and, if there is an
uncertainty and something yet to be proved, we'll say that.  But the
last impression that I still want to come back to is that we're excited
about embarking on this new program.
DR. POWERS:  The Commission, as they deliberate, looks at it as a
linchpin in the whole strategy and, quite frankly, much of it is
coloring their view on all kinds of other issues that they're deciding
now, in anticipation that we can work it out and make it work.
It is a crucial thing and well worth your time to spend management
attention on.  It's affecting everything.  It's affecting deliberations
on the maintenance rule, affecting deliberations on 50.59, 50.55(a);
all these things are being affected, because the Commission says, well,
yeah, in the new performance oversight, we're going to have this, and
so we don't need these other things.
That puts a lot of pressure on us to do this one right.
Do other members have comments they want to make?
                                            [No response.]
DR. POWERS:  Well, I think we're going to lose a quorum here in about
four minutes.
I want to thank you for some tremendous briefings, and I guess I'm -- I
come away very excited about Region I.  I think we're going to have to
come back to you.
                         MR. SINGH:  Don't tell them that.
DR. POWERS:  I'm getting real curious about what's going on down at
Limerick.  I've heard some interesting things there.
But I really appreciate you taking the time to talk to us about your
programs.  I'm very, very much pleased to get this insight.  I think
one of the ramifications of this program is that the regions are
becoming more and more important.  I just don't see how they don't
become more important in this new program.
And so we're going to continue our efforts to get out to the regions as
often as we can.  And when we come back, and I guarantee you we'll be
back, understand this is a chance for you to tell us what we should be
doing.  I've gotten enough notes here on what I should be doing, and I
guess Tom is going to be very busy next month.
But I thank you very much.
                                   MR. MILLER:  Thank you.
DR. POWERS:  And one of these days, I'll be able to take notes as fast
as Diane talks.
DR. LARKINS:  We appreciate the hospitality.  We recognize you face a
lot of work.  Thank you.
       [Whereupon, at 3:28 p.m., the meeting was concluded.]
 
