1 Introduction
The main objective of a safety analysis is to demonstrate that all safety
requirements are met, i.e. that sufficient margins exist between the real values of
important parameters and their threshold values, at which damage to the barriers
against the release of radioactivity would occur.
Historically, the existence of a consistent safety margin with respect to the
licensing limits has been demonstrated using conservative approaches. These
approaches were introduced to circumvent uncertainties due to the limited capability
for modelling and understanding physical phenomena at the early stages of safety
analysis. However, conservative assumptions may be so conservative that
important safety issues are masked. Another drawback of the conservative
approach is that the actual safety margins cannot be assessed, resulting in
economic penalties for the owner of the plant. Therefore, it may be preferable to
use a more realistic approach together with an evaluation of the related
uncertainties for comparison with the acceptance criteria. This type of analysis is
referred to as a Best Estimate Plus Uncertainty (BEPU) approach; it provides more
realistic information about the physical behaviour, identifies the most relevant
safety issues and supplies information about the actual margins between the
results of the calculations and the acceptance criteria.
Various options exist for combining computer code types and input data for the
safety analysis. In [47], four options are identified:
• Option 1 is the “very conservative” or Appendix K (of 10 CFR 50.46, USA)
approach, in the case of a LOCA. Many regulatory bodies prescribe the
conservative models/correlations to be used for the safety analysis, as well as
the conservative assumptions on the initial and boundary conditions to be used
for the analysis.
• Option 2, called “realistic conservative”, is similar to the first
one except for the fact that best estimate computer codes are used instead of
conservative codes. However, it must be noted that in certain countries option
2 is considered a conservative analysis.
• Option 3 assumes that the initial and boundary conditions are taken as
realistic, with consideration of their uncertainties. From the point of view of the
computer codes used and of the assumptions regarding the availability of systems,
the approach is the same as option 2. In several countries, such as the USA, option
3 is the best estimate analysis with uncertainty evaluation, or Best Estimate Plus
Uncertainty (BEPU). A summary of the main methods used for the uncertainty
evaluation is given in chapter 3. More emphasis is dedicated to the GRS
methodology and to the CIAU method, considering that two applications of these
methods have been performed during this thesis work; the results are
presented in chapter 4. In practice, a mixture of option 2 and option 3 is
employed. In any case, all the options mentioned make conservative assumptions
regarding the availability of the systems.
• Option 4 is the most rigorous approach. It consists of a realistic analysis
including a quantification of the availability of the systems that are significant
from the safety point of view. The availability is usually quantified on the basis
of PSA assumptions. This option would also contribute towards risk-informed
regulation.
Recent advances in best estimate codes and the introduction of new
uncertainty evaluation methods are gradually replacing the conventional
conservative evaluation methods.
Thermal-hydraulic system code calculations are affected by unavoidable errors
arising from several causes, including the approximations in the
constitutive equations, the limited capabilities of the numerical solution methods,
the uncertainties in the knowledge of boundary and initial conditions, and errors in
setting up the nodalization. These can be characterized by the hundreds of
parameters that are typically part of the input deck for a system code calculation
suitable for predicting a transient scenario in an NPP. This happens notwithstanding
the high code performance level and the systematic qualification processes
nowadays in progress or completed. It should be recalled, in this connection,
that the user choices strongly affect the code results, through the so-called "user
effect".
The BE codes, as already mentioned, are applied to safety analysis:
• in combination with a reasonably conservative selection of input data and a
sufficient evaluation of the uncertainties of the results;
• with realistic initial and boundary conditions.
Both options are considered acceptable and are suggested by the existing IAEA Safety
Standards [47]; option 2 is still the one most typically used at present for safety
analysis in many countries. The international activity aimed at code validation,
together with the various evaluations of data uncertainties and the sensitivity
studies, helps to establish confidence in the calculated results.
This thesis work has to be considered in the framework of the development and
application of uncertainty methods for the licensing of water cooled reactors.
The objective of this work is to provide a proof of the application of uncertainty
methodology during the licensing of a nuclear reactor and to contribute to the
further development of the uncertainty analysis methods.
1.1 Historical background
The majority of the current NPPs were designed on the basis of the traditional
defence in depth philosophy, and licensed with the use of a conservative approach
(see section 2.6) for the demonstration of safety in relation to the DBAs, intended
as a minimum set of enveloping scenarios whose conservative evaluation could
ensure that an adequate level of protection is provided by the designers. The
procedures that governed this kind of analysis were established in 1974, when the
USNRC published rules for LOCA analysis in 10 CFR 50.46 and Appendix K [1].
The basic reason for developing the conservative method was the need to
circumvent the lack of knowledge of the physical phenomena. It is an approach
based on the notions of consequences (maximized) and criteria (restrictive). When
questions were raised as to whether plants could be considered safe, the usual
answer was, first, that criteria had been set up to ensure that, if they were satisfied,
nothing unacceptable could occur, and, secondly, that the plant behaviour was
evaluated with large conservatisms so as to ensure that the plant was on the right
side of those criteria. This, of course, meant that some "distance" existed between
the most severe state of the plant and the criteria. This "distance", which was the
result of the combination of all kinds of conservatisms (without any classification),
appeared as an additional margin to the criteria, which already guaranteed by
themselves the safety of the plant. The concept of safety margin was thus created.
This conceptual two-prong approach, defining a safety limit and staying under it, is
what is most commonly understood as having an “adequate safety margin” in the
nuclear industry.
The problems raised by the conservative approach are:
• there is no way to prove that the conservatisms verified on scaled-down
experiments are also valid at full reactor scale;
• due to nonlinearity, the additivity of several conservative measures cannot be
verified;
• the method is not suited for Emergency Operating Procedure (EOP) studies
(a limitation that became especially obvious after the TMI-2 accident);
• the unknown margin between the calculated result and the real value of the peak
of specific key parameters results in economic penalizations; in other words, it is
impossible to know how large the safety margin is.
Thus, neither could the ‘safety margins’ be established in a quantitative manner, nor
could the optimization of a safety solution be demonstrated. All these limitations
have been the motivation for developing best estimate codes.
Research during the ‘70s and ‘80s provided a foundation sufficient for the use of
realistic and physically based analysis methods. A large number of experimental
programmes were completed internationally. A number of advanced (BE) computer
codes were developed in parallel with the experiments to replace the evaluation
models: RELAP, TRAC, COBRA-TRAC, RETRAN, CATHARE, ATHLET, etc. As a result
of this huge effort, in September 1988 the NRC approved a revised rule for the
acceptance of ECCSs [2].
The revised ECCS rule contains three key features: the existing acceptance
criteria were retained; evaluation model methods based on Appendix K may
continue to be used as an alternative to the best estimate methodology; and an
alternative ECCS performance analysis, based on BE methods, may be used to
provide more realistic estimates of the plant safety margins, provided the licensee
quantifies the uncertainty of the estimates and includes the uncertainty when
comparing the calculated results with the prescribed acceptance limits. The use of a
BE code does not overcome the uncertainties related to the status of the plant and,
moreover, the code itself introduces errors and uncertainties (see section 2.3
“Source of Uncertainty”); consequently, a BE analysis without the quantification of
the error in the predicted results is meaningless.
To support the revised ECCS rule and illustrate its application, the USNRC and its
contractors and consultants developed and demonstrated an uncertainty
evaluation methodology called CSAU. The CSAU was demonstrated for a LBLOCA
[3]; later, in 1992, it was applied to a SBLOCA [4]. The first non-US CSAU plant
application was carried out in 1993 [5]. In the five years after the pioneering CSAU
method, several new original methods were developed. At the special OECD/NEA
workshop on uncertainty analysis methods, held in London on 1-4 March 1994 [6],
eight methods were presented: CSAU, the UMAE method (Uncertainty Methodology
based on Accuracy Extrapolation, Italy) [7], the AEA method (Atomic Energy
Authority, UK), the NE method (Nuclear Energy, UK) [8], the GRS method
(Gesellschaft für Anlagen- und Reaktorsicherheit, Germany) [9], the IPSN method
(Institut de Protection et de Sûreté Nucléaire, France), the Tractebel method
(Belgium) and the Limit Value Approach (ABB, USA).
More importantly, these methods have progressed far beyond the capabilities of
the early CSAU analysis. At present, uncertainty bands (both upper and lower) can
be calculated for any desired quantity throughout the transient of interest, in addition
to point values like the PCT. One method, namely the internal assessment of
uncertainty (CIAU, University of Pisa) [18], also includes the capability to assess
the calculation uncertainty in a code subroutine while the transient progresses.
1.2 Purpose of the thesis
The objective of this work is to provide a proof of the application of uncertainty
methodology during the licensing of a nuclear reactor and to contribute to the
further development of the uncertainty analysis methods.
The GRS method has been applied in support of an activity aiming to address
a key safety issue for the ATUCHA-2 nuclear power plant, see section 4.2. The
objective of this first activity is the quantification of the uncertainty associated
with the prediction of the average void production in the CNA-2 reactor core using
a BE code.
The second activity carried out during this thesis work is the simulation of two
experiments performed on the LOBI/MOD1 test facility with the BE code
RELAP5.33, plus the application of the CIAU method to take into account the
uncertainty.
1.3 Structure of the thesis
The present work is organized in five chapters, including the present one, which
deals with introductory remarks and the organization of the performed work.
In chapter two, the definitions of uncertainty, accuracy and sensitivity are
discussed first, in order to avoid misunderstandings. The aim of every
safety analysis is, as already stated, to verify that a safety margin exists between
the actual state of the plant and the threshold limit under every condition, so the
second section of that chapter focuses on the evolution of the safety margin
concept. The chapter ends with a discussion of the four types of analysis existing
today and accepted by the regulatory bodies for deterministic safety analysis.
A state of the art review of the uncertainty methods developed by the international
community in recent decades is provided in chapter 3. A review of the various
approaches recognized by the IAEA to perform a safety analysis, either for design
or for licensing purposes, is provided at the end of chapter 2. The salient features of
three independent approaches for estimating the uncertainties are reviewed with
respect to the relevant topics to be considered and addressed by a consistent
uncertainty methodology. The uncertainty methods used today by the industry and
the regulators are reviewed starting from the first pioneering approach, namely the
CSAU. All the methodologies developed in recent years are more or less derived
from, or connected to, this first pioneering procedure. The description focuses
mostly on the GRS method and the CIAU method, highlighting the benefits and
drawbacks of the two methods. The goal of this part is to give the information
necessary to understand the applications of these two methods performed during
this thesis work and presented in chapter 4. The chapter ends with a comparison of
the two approaches.
In chapter 4 the two activities carried out during this thesis work are reviewed.
Considering the importance of the quality of the computational tools used for a
safety analysis, the first section of the chapter deals with this topic, highlighting the
importance of a high quality level of the code, of the code user and of the
uncertainty method, and how to achieve this quality. In section 4.2 the GRS activity
is reviewed, starting with a description of the ATUCHA-2 NPP, necessary to
understand the importance of the topic addressed; the chapter continues with a
description of the objective, the procedures utilized and a discussion of the achieved
results. The activity related to the CIAU application to two tests performed on the
LOBI/MOD1 facility is described in section 4.3. The section starts with a description
of the facility: geometry and scaling criteria. The RELAP5.33 model is then
described, together with the procedure followed to achieve a validated nodalization.
The results of the simulation are compared with the experimental data, both in
steady-state and in transient conditions. The last section describes the results of the
CIAU application and discusses them.
Chapter 5 concludes the thesis, summarizing the main achievements and giving
recommendations for future work in the framework of the use of best estimate
codes for the licensing of NPPs.
2 The approach for licensing of W.C.R.
In the course of this chapter the evolution of the approach for the analysis of
NPPs is outlined. In the next sections (sections 2.1 and 2.2), some definitions and
concepts necessary to better understand the follow-up of this thesis work are
critically reviewed. Subsequently, in section 2.3 the origin of the uncertainty
associated with thermal-hydraulic system codes is discussed, underlining why an
uncertainty analysis (UA) is needed. In the last section of this chapter (section 2.4)
the benefits and drawbacks of the different approaches for the safety analysis
accepted today by the nuclear safety authorities all around the world are presented.
2.1 Useful definitions: accuracy, uncertainty and sensitivity
The definitions of accuracy, uncertainty and sensitivity are provided here, as they
are commonly accepted in the framework of nuclear safety technology and, in
particular, in the sector of deterministic accident analysis.
Accuracy is defined “as the known bias between a code prediction and the actual
transient performance of a real facility” [47], and bias is defined as a “measure of the
systematic difference between an actual or true value and a predicted or measured
mean” [47]; thus bias is the tendency of a model to over-predict or under-predict a
certain physical variable. Methods exist which extrapolate the accuracy of the
calculation results of experiments and their data to reactor conditions. Clearly, the
evaluation of the accuracy is possible only in the presence of measured data and of
an available calculation result. The experimental error is not part of the definition;
however, in the majority of the cases of interest for the safety analysis of Nuclear
Power Plants (NPPs), the error that characterizes the measurement is much lower
than the accuracy that characterizes the comparison between measured and
predicted values.
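The distinction between bias and a simple accuracy measure can be illustrated with a minimal numerical sketch; the data and function names below are invented for illustration and are not taken from any code assessment:

```python
def bias(measured, predicted):
    """Mean systematic difference between predicted and measured values;
    a positive value means the model over-predicts on average."""
    return sum(p - m for m, p in zip(measured, predicted)) / len(measured)

def mean_abs_error(measured, predicted):
    """Average magnitude of the discrepancy: a crude accuracy measure."""
    return sum(abs(p - m) for m, p in zip(measured, predicted)) / len(measured)

# Hypothetical cladding temperatures (K): test data vs. code prediction
measured  = [612.0, 655.0, 701.0, 689.0]
predicted = [620.0, 668.0, 714.0, 700.0]

print(bias(measured, predicted))            # 11.25: systematic over-prediction
print(mean_abs_error(measured, predicted))  # 11.25
```

A positive bias on a safety variable such as the cladding temperature is the conservative direction; a qualified code is one whose bias and accuracy have been characterized over the relevant range of conditions.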
Uncertainty is defined as a “measure of scatter in experimental data or
calculated values” [47]. It is expressed by an interval around the true mean of a
parameter, resulting from the inability to either measure or calculate the true value
of that parameter (scatter).
A UA is defined as “..an analysis to estimate the uncertainties and error bounds
of the quantities involved in, and the results from, the solution of a problem.
Estimation of individual modelling or overall code uncertainties, representation (i.e.
nodalization related) uncertainties, numerical inadequacies, user effects, computer
compiler effects and plant data uncertainties for the analysis of an individual event”
[47]. Depending on the method used in the UA, which is a necessary supplement to
a Best-Estimate (BE) analysis, the knowledge about an uncertain parameter is given
as a ‘bounding’ range, a ‘reasonable’ uncertainty range or a probability distribution.
Sensitivity is “..the study of how the variation in the output of a model
(numerical or otherwise) can be apportioned, qualitatively or quantitatively, to
different sources of variation, and how the given model depends upon the
information fed into it” [10]. A sensitivity analysis aims at quantifying the degree of
impact of the uncertainty of each individual input parameter of the model on the
overall model outcome; it can also be referred to as an uncertainty importance
analysis.
The three kinds of analysis outlined above are strictly connected, but there are
notable differences in the reasons why a certain analysis is performed and in the
objectives of such studies. An Accuracy Analysis (AA) is performed mainly for the
demonstration of the qualification of the computer codes; it relies on the availability
of relevant experimental data and of tools to characterize the discrepancies from the
qualitative and the quantitative point of view [54]. The result of the AA is the
demonstration of the qualification level of the code and the characterization of the
range of parameters over which the code can be considered qualified. The UA is
performed in response to the nuclear reactor safety principle of defence in depth: it
ensures that the results of the BE code prediction, supplemented by the uncertainty
bands that are the key result of the UA, are below the licensing limits and that an
adequate safety margin is preserved (the concept of safety margin is discussed in
section 2.2).
All that is needed for a meaningful SA is the model and the input values, while a
UA needs a reference value, typically not available, and an AA, on the other hand,
needs relevant experimental data. Furthermore, when performing a SA, the values
of the concerned input parameters are varied arbitrarily around the initial (or
nominal) value, to a ‘small’ or to a ‘large’ extent depending upon the scope of the
analysis; when performing a UA, whatever the method adopted, a range of variation
for the concerned input parameters must be assigned or available. A SA may be a
way to perform a UA if the input parameters are properly selected with proper
ranges of variation. Moreover, SA and UA can be considered as formal methods for
evaluating data and models, because they are associated with the computation of
specific quantitative measures that allow, in particular, the assessment of the
variability in the output variables and of the importance of the input variables.
Sensitivity and uncertainty analysis procedures can be either local or global in
scope [11]. The objective of a local analysis is to analyze the behaviour of the
system response locally around a chosen point or trajectory in the combined phase
space of parameters and state variables. The objective of a global analysis, on the
other hand, is to determine all of the system's critical points (bifurcations, turning
points, response maxima, minima, and/or saddle points) in the combined phase
space formed by the parameters and the dependent (state) variables, and
subsequently to analyze these critical points by local sensitivity and uncertainty
analysis. Once the local sensitivities become available, they can be used for the
following purposes: (i) understanding the system by highlighting important data;
(ii) eliminating unimportant data; (iii) determining the effects of parameter
variations on the system’s behaviour; (iv) designing and optimizing the system
(e.g., maximizing availability/minimizing maintenance); (v) reducing over-design;
(vi) prioritizing the improvements effected in the respective system; (vii)
prioritizing the introduction of data uncertainties; (viii) performing a local
uncertainty analysis.
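A local sensitivity coefficient of the kind referred to above can be approximated by a central finite difference around the nominal point. The sketch below uses a toy algebraic model as a stand-in for a system code; the model, its parameters and the nominal values are invented for illustration:

```python
def model(power, flow):
    # Toy response: an outlet temperature (K) that rises with power (MW)
    # and falls with mass flow (Mg/s); not a real correlation.
    return 560.0 + 0.08 * power / flow

def local_sensitivity(f, x0, index, rel_step=0.01):
    """dR/dx_i at the nominal point x0, by a central finite difference."""
    h = rel_step * x0[index]
    x_plus, x_minus = list(x0), list(x0)
    x_plus[index] += h
    x_minus[index] -= h
    return (f(*x_plus) - f(*x_minus)) / (2.0 * h)

nominal = [3000.0, 17.0]                        # invented nominal power and flow
s_power = local_sensitivity(model, nominal, 0)  # K per MW, positive
s_flow = local_sensitivity(model, nominal, 1)   # K per Mg/s, negative
```

Repeating this for every input yields the vector of local sensitivities mentioned above; ranking the inputs by these coefficients is what identifies the important data at that point of the phase space.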
Concerning the SA, it is important to underline that the result of a probabilistic
approach is not an absolute value of the influence of a certain parameter on the
total uncertainty, but a degree of correlation between that parameter and the
output quantity of interest (see section 4.2.2.6.3).
The methods used for performing UA and SA can be based either on a deterministic
or on a probabilistic approach. In practice, although both can be used for local or
global analyses, deterministic methods are used mostly for local analyses, while
statistical methods are used for both local and global studies.
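As an example of the statistical route, probabilistic methods of the GRS type rely on order statistics (Wilks' formula) to fix the number of code runs required for a given tolerance statement. The sketch below computes only that sample size and is not the full GRS procedure:

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest number of runs n such that the largest of n sampled outputs
    bounds the 'coverage' quantile with the given one-sided confidence,
    i.e. the smallest n with 1 - coverage**n >= confidence."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_sample_size())            # 59 runs for the one-sided 95%/95% case
print(wilks_sample_size(0.95, 0.99))  # 90 runs for 95% coverage, 99% confidence
```

This is why 59 code runs (93 for a two-sided statement) recur in GRS-type applications: the number of runs depends only on the tolerance statement, not on the number of uncertain input parameters.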
2.2 The Concept of Safety Margin
The objective of the safety analysis of a NPP is to verify that an adequate
“safety margin” exists, regardless of the way in which the analysis is performed
(i.e. whether BEPU or conservative).
The concept of safety margin was originally strongly linked to the Design Basis
Accident (DBA), because this kind of scenario was considered the major safety
case. After the Three Mile Island 2 (TMI-2) accident, the DBA appeared not to be
the only safety concern: transients of several types, operating procedures and severe
accidents became integral parts of the safety analysis, and the safety margin concept
continued to be used in this more general framework.
So, the concept of safety margin was extended beyond the DBA; it
represented, in a very qualitative way, some distance of the plant state from either a
safety criterion, a feared situation, or a failure limit of a system or physical barrier.
In this way the concept became more qualitative than before, without any clear
definition, and it considered limits other than the ones corresponding to the safety
criteria. For some experts the safety margin refers only to the safety criteria, while
for others it is relative to a value above the safety criterion, which accounts for the
extra “margins” introduced by setting the safety criterion in a conservative manner.
The objective of this chapter is to set a definition according to [47]. In the context of
this thesis, the safety margin concept will be used only in relation to the licensing
limit, not to the failure limit.
The general definition of margin to damage is often referred to as safety margin in
the literature [12]. In conventional industry the safety margin is related to the
probability of failure, which is why the margin emerged as the sole proxy for
reliability in many applications. Figure 2-1 shows the concept of probability of
failure (in the field of civil engineering): if the PDFs of the parameters are known,
the probability of failure is also known.
In the nuclear industry, the probability functions for the strength of the fuel or of the
containment are prohibitively expensive to obtain. Furthermore, the two-prong
approach to ensuring the safety margin adopted in the nuclear industry lends itself
much better to inspection by examining the probability of exceeding the safety limit
than by calculating the actual probability of failure. In relation to the DBA
discussions, “adequate safety margins” are linked to safety limits, i.e. limiting
values imposed on safety variables (e.g. the Peak Cladding Temperature, PCT).
Thus, when the operating conditions stay within the safety limits, the barrier or
system has a negligible probability of loss of function, and an adequate safety
margin exists. Therefore, the first prong of ensuring an adequate safety margin is to
set safety limits such that the probability of loss of function is negligible as long as
the operating conditions stay within those criteria (see Figure 2-2). The safety
variables characterizing the operating conditions are those set by the NRC in [1].
Figure 2-1 - The probability of failure in an event sequence, [12].
The second prong of ensuring an adequate safety margin is to keep the operating
conditions within the safety limits (see Figure 2-3). The objective of a safety
analysis, using either a conservative or a BEPU approach, is to estimate the upper
bound of the given variable.
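The two prongs can be sketched numerically. With the limit fixed (first prong), the analysis checks that the calculated safety variable stays below it with high probability (second prong). The toy Monte Carlo below uses an invented distribution for the calculated PCT and is illustrative only:

```python
import random

random.seed(0)

# 10 CFR 50.46 PCT limit: 2200 F, roughly 1477 K
LIMIT_K = 1477.0

# Invented spread of the calculated PCT (mean 1350 K, sigma 40 K);
# in a real BEPU analysis these samples would come from code runs.
samples = sorted(random.gauss(1350.0, 40.0) for _ in range(100_000))

p_exceed = sum(t > LIMIT_K for t in samples) / len(samples)
upper_95 = samples[int(0.95 * len(samples))]   # 95th percentile of the output

print(p_exceed < 0.01 and upper_95 < LIMIT_K)  # True: limit respected with margin
```

The "upper bound of the given variable" mentioned above corresponds here to the upper percentile of the output distribution; the safety margin is the distance between that bound and the limit.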