ETL 1110-2-556
28 May 99

Appendix A
An Overview of Probabilistic Analysis for Geotechnical
Engineering Problems
Table of Contents
Page
Purpose and Scope . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A2
Probabilistic Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A2
Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A2
Background of Corps’ Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A2
Framework . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A3
Risk Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A3
Event Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A3
Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A4
Time Basis of Reliability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A5
Hazard Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A5
Fault Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A5
Further References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A5
Random Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A6
Random Variables and Distributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A6
The Lognormal Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A7
Moments of Random Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A7
Fitting Distributions and Moments to Test Data . . . . . . . . . . . . . . . . . . . . . . . A7
Typical Coefficients of Variation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A8
Independent and Correlated Random Variables . . . . . . . . . . . . . . . . . . . . . . . . A8
Spatial Correlation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A9
First-Order Second-Moment Reliability Methods . . . . . . . . . . . . . . . . . . . . . . . A10
The Reliability Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A10
Probability of Failure or Unsatisfactory Performance . . . . . . . . . . . . . . . . . . . A11
Taylor’s Series Mean Value Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A11
Point Estimate Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A11
Hasofer-Lind Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A12
Monte Carlo Simulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A12
Some Comments on the Use and Meaning of Pr(f) or Pr(u) . . . . . . . . . . . . . . . . . A13
Potential for Overlooking Some Performance Modes . . . . . . . . . . . . . . . . . . . A13
Physical Meaning of Probability of Failure for Existing Structures . . . . . . . . A13
Lack of Time Dimension in FOSM Methods . . . . . . . . . . . . . . . . . . . . . . . . . A14
Frequency-based Reliability Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A14
Subjectively Determined Probability Values . . . . . . . . . . . . . . . . . . . . . . . . . . . A14
System Reliability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A15
Special Issues in Geotechnical Engineering . . . . . . . . . . . . . . . . . . . . . . . . . . . . A15
Some Differences Between Geotechnical and Structural Problems . . . . . . . . . A15
Strength Parameters from Triaxial Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . A16
Free-body and Critical Slip Surface Issues in Slope Stability Analysis . . . . . . A16
Application of Spatial Correlation Theory to Slope Stability and Seepage
Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A17
Application of Spatial Correlation Theory to Long Earth Structures . . . . . . . A17
Examples of Probabilistic Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A18
Wappapello Dam, St. Louis District . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A18
Shelbyville Dam, St. Louis District . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A18
Research on Navigation Structures for Guidance Development . . . . . . . . . . . A18
Research on Levees for Guidance Development . . . . . . . . . . . . . . . . . . . . . . A18
Hodges Village Dam, New England Division . . . . . . . . . . . . . . . . . . . . . . . . A18
Walter F. George Dam, Mobile District . . . . . . . . . . . . . . . . . . . . . . . . . . . . A18
Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A19
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . A19

Purpose and Scope
This appendix provides an overview of the application of probabilistic methods to
geotechnical engineering problems of interest to the Corps of Engineers, with emphasis on
methodology suitable for assessing the comparative reliability of dams, levees, and other
hydraulic structures in the context of planning studies. A number of probabilistic methods
that can and have been applied to the problems of interest are reviewed and discussed.
These are drawn from Corps guidance, literature that led to Corps guidance, literature and
methodology not yet in Corps guidance but considered state-of-the-art, case histories of
past analyses by the Corps and by others for similar problems, and recent remarks made in
state-of-the-art invited papers. The intent of this review is to introduce the reader to the
diversity of methodology and issues that are encompassed in geotechnical probabilistic
analysis, and their relationships to each other and Corps methodology, so that the relative
accuracy, advantages, and limitations of Corps’ methodology can be better understood in
this context.
Probabilistic Methods
Background. As used herein, the term probabilistic methods refers to a collection of
techniques that may be called or include reliability analysis, risk analysis, risk-based
analysis, life-data analysis, and other similar terms. Such techniques have been under
development and have seen increasing application to engineering problems for 50 years,
starting with Freudenthal (1947). Since that time, and increasingly in the last 20 years, a
significant body of literature has been published, proposing and detailing various
methodologies and applications. Application to structural engineering problems,
especially as the basis of design codes (e.g., Ellingwood et al. 1980), has generally
preceded applications in geotechnical engineering. Geotechnical problems often involve
certain complexities not found in structural problems.
Background of Corps’ Applications. As the Corps’ workload shifted from the design
of new structures to the rehabilitation of existing structures, it became necessary to
develop rational methodology to compare alternative plans for rehabilitation of Corps’
projects and prioritize expenditures for such work. The previous approach of seeking
funds on the basis that a structure does not meet current criteria is unworkable when funds
are insufficient for all desired rehabilitation projects (U.S. Army Corps of Engineers
1992). The resulting approach has been to apply risk analysis techniques. In such a risk
analysis,
• Unsatisfactory performance events are identified and the probabilities of their
occurrence over some time frame are estimated.
• Consequences of the unsatisfactory performance events are estimated.
• Changes in probability and consequences associated with alternative plans of
improvement are estimated.
• Decisions are made based on the quantified risk and costs and benefits of
reducing the risk.
Since 1992, the Corps has used probabilistic methods to evaluate engineering reliability in
the planning process for major rehabilitation projects. The methodology used by the
Corps has been selectively adapted from previously published work (e.g. Moses and
Verma 1987; Wolff and Wang 1992; Shannon and Wilson, Inc., and Wolff 1994; Wolff et
al. 1995) and a limited amount of guidance has been published (e.g., U.S. Army Corps of
Engineers 1992, 1993, 1995a, 1995b). Methodology is under development for planning
studies for levee projects, and is under consideration for dam safety evaluation.
Nevertheless, the application of probabilistic methods is an evolving technology. As the
Corps’ experience base expands and new and unique problems are considered, it will
continue to be necessary to identify suitable methodology, either drawn from outside
sources or developed within the Corps, ahead of its publication as Corps guidance.

Framework. To account for various modes of performance and estimate the required
probabilities of unsatisfactory performance within a time frame, engineers and planners
develop an event tree and engineers estimate probability values for a number of events and
conditional events leading to various performance states of the structure or component.
(Event trees are further discussed in the next section). Event trees are a convenient
pictorial method to represent complex networks of conditional probability problems. They
are not in themselves related to any single probabilistic method. The required probability
values could be estimated using one or more of three approaches:
a. Calculating the probability of unsatisfactory performance as a function of
uncertainty in parameter values and in the analytical models, typically using
first-order second-moment methods or simulation (Monte Carlo) methods.
b. Calculating the probability of occurrence of various events from timebased
probability distributions based on the study of historical records of similar
events and fitting probability functions to these data.
c. Estimating the probability of event occurrence (either within a time increment or
conditional on a preceding event) by a systematic process of eliciting expert
opinion and developing a consensus regarding the required values.
This classification of three approaches is similar to that described by Vick and Stewart
(1996). A broad treatment of probabilistic methods, including some or all of these
approaches is contained in a number of general texts. Notable among these are Ang and
Tang (1975, 1985), Benjamin and Cornell (1970), Hahn and Shapiro (1967), Harr (1987),
and Lewis (1996). The following sections further describe event trees and the above three
probabilistic approaches.
Risk Analysis
Event Trees. The framework for risk analysis in most Corps’ planning studies is an
event tree. An event tree is a pictorial representation of sequences of events that may lead
to favorable or unfavorable outcomes. A simple example of part of an event tree is shown
in Figure 1. Each node on the tree represents a situation where two or more mutually
exclusive events may occur, given that events leading to the node have already occurred.
For each branch from a node, a conditional probability of occurrence is assigned
(conditioned on reaching the node via the preceding events). The set of conditional
probability values emanating from each node must total to unity. In accordance with the
total probability theorem, multiplying values along any path through the tree gives the
probability of the outcome at the end of the path.

Figure 1. Partial event tree for slope stability given maximum water elevation in a
time increment
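In code, this path-probability computation is a simple running product. The event names and probability values below are hypothetical, chosen only to illustrate the multiplication rule:

```python
# Hypothetical event-tree path: each entry is (event, conditional probability
# given that all preceding events on the path have occurred).
path = [
    ("max pool reaches el. 499", 0.05),
    ("sudden drawdown occurs",   0.01),
    ("slope fails",              0.20),
]

# Total probability theorem: the probability of the outcome at the end of
# the path is the product of the conditional probabilities along it.
p_outcome = 1.0
for event, p in path:
    p_outcome *= p

print(f"P(outcome) = {p_outcome:.2e}")  # 1.00e-04
```

Note that the low path probability arises even though the final conditional probability (0.20) is relatively high, echoing the point made in the example above.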
Example. The example in Figure 1 considers slope stability for a range of water
elevations and illustrates how all three of the above-noted approaches may enter an event
tree. Given a probability-of-annual-exceedance function for water level, a set of water
levels can be discretized for analysis. For example, the probability that the maximum
water level in a 1-year time increment is between elevation 498 and 500 can be taken as
the difference in annual probabilities of exceedance for those elevations. For a slope
stability, seepage, or other waterleveldependent analysis, the water level can be taken at
the midpoint of the increment, i.e., 499. Probability of exceedance values for water levels
are typically obtained using the second of the three approaches cited above, i.e.,
probability distributions are fit to historical data.
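The discretization described above can be sketched as follows; the exceedance values are invented for illustration and are not taken from any project:

```python
# Hypothetical annual-exceedance function for maximum pool level:
# P(annual maximum water level exceeds the given elevation).
annual_exceedance = {496: 0.20, 498: 0.08, 500: 0.03, 502: 0.01}

# Probability that the annual maximum falls between el. 498 and el. 500 is
# the difference of the two exceedance probabilities.
p_498_500 = annual_exceedance[498] - annual_exceedance[500]   # 0.05

# For a water-level-dependent analysis, take the midpoint of the increment.
midpoint = (498 + 500) / 2                                    # 499.0

print(p_498_500, midpoint)
```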
Given that the water level reaches elevation 499, there may or may not be a sudden
drawdown event while the water level is at this elevation. The probability of this event
might be estimated using frequency analysis of historical events. On the other hand, past
events may be so sparse or dissimilar that probability can only be estimated by judgment
(the third approach). Furthermore, the probability of drawdown may be a constant value
per year, if its occurrence is totally random, or its value per year might be taken to increase
with increasing water level if the likelihood of operational problems is considered to
increase with water level.
Given a sudden drawdown event, the slope may or may not fail. As an analytical model
and some understanding of the uncertainty in the model parameters are available for
stability under a sudden drawdown condition, the conditional probability of failure given
sudden drawdown can be estimated using firstorder secondmoment methods such as the
Taylor’s series method or the point estimate method. Note from the example in Figure 1
that the conditional probability of slope failure given sudden drawdown may be relatively
high (0.20), but the preceding event of sudden drawdown might be quite low, leading to an
overall low probability for the outcome of a sudden drawdown failure.
Time Basis of Reliability. Risk analyses for economic planning generally consider
the risks in some defined time frame, typically 50 years. If the eventtree analysis is to
determine the probability of unsatisfactory performance within some time increment, one
of the underlying random variables must have a timebased definition, e.g. an annual
probability of failure or an expected value of 0.xxxx failures per year. In the example
shown, the time component is in the annual probability of exceedance function for water
level. In the case of an electrical or mechanical part, the probability may have a time
component related to time in service.
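Given a constant annual probability and assuming independence between years (a common simplifying assumption, not always appropriate), the probability of at least one occurrence within the planning time frame can be sketched as:

```python
p_annual = 0.002   # hypothetical annual probability of unsatisfactory performance
years = 50         # typical planning time frame

# Assuming independence between years, the probability of at least one
# occurrence in the time frame is the complement of no occurrence in any year.
p_timeframe = 1.0 - (1.0 - p_annual) ** years

print(f"P(at least one event in {years} yr) = {p_timeframe:.4f}")
```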
Hazard Functions. A hazard function gives the conditional probability of failure (or
probability of event occurrence) per time increment given that no failure or event has
occurred up to the considered time. Where timebased probability values are equal in each
time increment, the hazard function has a constant value. This is referred to as a Poisson
process and the lifetime is exponentially distributed. Floods and earthquakes are often
assumed to be Poisson processes, with occurrence taken to be equally likely in any year.
An increasing hazard function implies an increasing probability of failure or event
occurrence as time elapses without such an event. An example of an increasing hazard
function might be one for the formation of a window in a sheet-pile cutoff due to corrosion,
or the breakout of a seepage condition due to solutioning of limestone. An example of a
decreasing hazard function might be one for the event of an undrained slope failure as
pore pressures dissipate with time.
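The contrast between a constant hazard (Poisson process) and an increasing hazard can be sketched numerically; the Weibull form and the parameter values below are illustrative assumptions only:

```python
import math

def exponential_hazard(t, lam):
    """Constant hazard of a Poisson process: h(t) = lambda for all t."""
    return lam

def weibull_hazard(t, k, scale):
    """Weibull hazard h(t) = (k/scale) * (t/scale)**(k-1); increasing for k > 1."""
    return (k / scale) * (t / scale) ** (k - 1)

# Constant hazard: the same conditional failure rate in year 1 and year 30.
h1, h30 = exponential_hazard(1, 0.01), exponential_hazard(30, 0.01)

# Increasing hazard (e.g., progressive corrosion of a sheet-pile cutoff).
w1, w30 = weibull_hazard(1, 2.0, 40.0), weibull_hazard(30, 2.0, 40.0)

print(h1 == h30)   # True: Poisson process, exponential lifetime
print(w30 > w1)    # True: hazard grows as time elapses without failure
```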
Fault Trees. An alternative to an event tree is a fault tree. Where an event tree starts
with some initiating event (e.g. high water, or simply a year in the project life) and
attempts to consider all subsequent possibilities, in a fault tree analysis one first identifies
an outcome event of interest (e.g. loss of pool) and works backward to identify the
necessary antecedent events. An advantage of fault tree analysis is that it may save time
and be easier to accurately develop when specific and alreadyidentified outcomes are of
interest. For economic analysis of proposed rehabilitation projects, the event tree format
has been preferred. However, if and as probabilistic methods are applied to dam safety
issues, fault tree analysis may have some advantages. Vrouwenvelder (1987) uses fault
tree analysis to assess the failure probability of Dutch levees.
Further References. Ang and Tang (1985) and Lewis (1996) both provide a number
of detailed and illustrated examples of both event tree and fault tree analysis. Wu (1996)
and Whitman (1996) also provide a brief treatment of event tree methodology in a
geotechnical context.
Recent applications of eventtree analysis involving geotechnical problems at Corps
projects include the Hodges Village Dam rehabilitation report (U.S. Army Engineer
Division, New England 1995) and the Walter F. George Dam rehabilitation report (U.S.
Army Engineer District, Mobile 1997). In the Hodges Village Dam study, the initiating
and time-related event is the occurrence of one of several maximum annual pool levels,
each with some probability. Given each pool level, subsequent events are the occurrence
of uncontrolled seepage leading to failure at one or more locations. The required
conditional probabilities of seepage failure given pool level were developed using
first-order, second-moment (FOSM) reliability methods in conjunction with finite-element
seepage analyses.
In the Walter F. George study, the initiating and time-related event is the occurrence of
excessive seepage in a solutioned limestone foundation. These are taken to be Weibull
distributed with an increasing hazard function, which was fit to historical events at the site
with some measure of judgment regarding the acceleration rate. Given such a seepage
event, the event tree is filled out with conditional probabilities related to how well the seep
is connected to pool and/or tailwater, and how likely or unlikely it is that the source of the
seep will be detected and plugged before uncontrolled erosion occurs.
In both studies, the recommended remedial action was the construction of concrete cutoff
walls in the foundation.
Random Variables
Random Variables and Distributions. The fundamental building blocks of
probabilistic analyses are random variables. In mathematical terms, a random variable is
a function defined on a sample space that assigns a probability or likelihood to each
possible event within the sample space. In practical terms, a random variable is a variable
for which the precise value is uncertain, but some probability can be assigned to its
assuming any specific value (for discrete random variables) or being within any range of
values (for continuous random variables).
Discrete random variables can only assume specific values. Some examples of discrete
random variables encountered in geotechnical engineering include:
• Number of sand boils or seeps that may occur within length L in time period t.
• Number of levee overtoppings in length L in time period t.
• In general, the number of events in an increment of time or space.
Commonly employed models for discrete random variables include the binomial and
Poisson distributions.
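As a sketch, the Poisson model gives the probability of observing a given number of events in an increment of time or space; the expected number used here is hypothetical:

```python
import math

def poisson_pmf(k, mean):
    """P(N = k) for a Poisson random variable with the given mean."""
    return math.exp(-mean) * mean ** k / math.factorial(k)

# Hypothetical: on average 2 sand boils expected in reach L during flood period t.
mean_boils = 2.0
p_none = poisson_pmf(0, mean_boils)
p_two_or_fewer = sum(poisson_pmf(k, mean_boils) for k in range(3))

print(f"P(no boils) = {p_none:.3f}, P(2 or fewer boils) = {p_two_or_fewer:.3f}")
```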
Continuous random variables can assume a continuous range of values over a domain, and
probability values must be associated with some range within the domain. Some
continuous random variables include:
• Undrained strength or cohesion of a clay stratum.
• Friction angle.
• Permeability.
• Exit gradient at the toe of a levee.
• Time to occurrence of an erosive seepage or scour event.
• Time to occurrence of any event.
Commonly employed models for continuous random variables include the normal,
lognormal, and uniform distributions; however, there are a number of others, such as the
beta distribution discussed by Harr (1987). Random variables are discussed in some detail
(distributions, moments, etc.) in standard texts (Ang and Tang 1975, 1985; Benjamin and
Cornell 1970; Hahn and Shapiro 1967; Harr 1987; Lewis 1996), Corps-sponsored research
reports (Wolff and Wang 1992; Shannon and Wilson, Inc., and Wolff 1994; Wolff et al.
1995) and in Corps’ guidance (U.S. Army Corps of Engineers 1992, 1995b). An overview
of random variables in a geotechnical context has also been provided by Gilbert (1996).
It should be noted that the selection of any probability distribution (e.g. the lognormal) to
characterize a random variable (e.g., the factor of safety) is essentially an assumption,
made because certain distributions facilitate computations. It cannot in general be proved
that a random variable fits a certain distribution, although the goodness of fit between a
data set and one or more candidate distributions can be assessed by some standard
statistical tests, such as the chi-squared and Kolmogorov-Smirnov tests, found in most
statistical texts.
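A minimal sketch of the Kolmogorov-Smirnov statistic follows, computed against an assumed normal distribution; the test data and the assumed moments are hypothetical:

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(data, cdf):
    """Kolmogorov-Smirnov statistic: the maximum distance between the
    empirical CDF of the data and the candidate distribution's CDF."""
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d

# Hypothetical friction-angle measurements (degrees), tested against a
# normal distribution with assumed mean 32 and standard deviation 2.
data = [30.1, 31.5, 32.0, 32.4, 33.2, 34.0, 29.8, 31.9, 32.8, 33.5]
d = ks_statistic(data, lambda x: normal_cdf(x, 32.0, 2.0))
print(f"K-S statistic D = {d:.3f}")  # compare against a tabulated critical value
```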
The Lognormal Distribution. The lognormal distribution is of particular interest in
geotechnical reliability analysis, as it has certain properties similar to that of some
commonly encountered random variables:
• It is a continuous distribution with a zero lower bound and an infinite upper
bound.
• As the log of the value is normally distributed, rather than the value itself, it
provides a convenient model for random variables with relatively large
coefficients of variation (>30%) for which an assumption of normality would
imply a significant probability of negative values.
Some random variables often assumed to be lognormally distributed include the
coefficient of permeability, the undrained strength of clay, and the factor of safety. The
details for making the required transformations to fit lognormal distributions are given in
recent Corps’ geotechnical guidance (U.S. Army Corps of Engineers 1995b), taken from
Shannon and Wilson, Inc., and Wolff (1994).
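The standard moment transformation for a lognormal variable can be sketched as follows; the mean and coefficient of variation below are hypothetical:

```python
import math

def lognormal_params(mean, cov):
    """Moments of ln X for a lognormal X with the given mean and
    coefficient of variation (as a decimal fraction)."""
    sigma_ln = math.sqrt(math.log(1.0 + cov ** 2))   # std dev of ln X
    mean_ln = math.log(mean) - 0.5 * sigma_ln ** 2   # expected value of ln X
    return mean_ln, sigma_ln

# Hypothetical: undrained strength with mean 1000 psf and V = 40%.
mean_ln, sigma_ln = lognormal_params(1000.0, 0.40)
print(f"E[ln Su] = {mean_ln:.3f}, sigma_ln = {sigma_ln:.3f}")
```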
Moments of Random Variables. When calculating the reliability index or probability
of failure by firstorder secondmoment methods, only the moments of a random variable
are required; the exact distribution is not required. The first moment about the origin is
the mean or expected value; the second central moment is the variance. (Central moments
are calculated with respect to the mean). The square root of the variance is the standard
deviation, and the ratio of the standard deviation to the expected value is the coefficient of
variation. Calculation of moments is discussed in the references previously cited.
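These definitions can be illustrated with a small discrete example; the values and probabilities are hypothetical:

```python
# Hypothetical discrete random variable: values and their probabilities.
values = [1.2, 1.5, 1.8]
probs  = [0.3, 0.5, 0.2]

mean = sum(v * p for v, p in zip(values, probs))              # first moment (expected value)
variance = sum((v - mean) ** 2 * p for v, p in zip(values, probs))  # second central moment
std_dev = variance ** 0.5                                     # square root of the variance
coef_var = std_dev / mean                                     # coefficient of variation

print(mean, std_dev, coef_var)
```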
Fitting Distributions and Moments to Test Data. In geotechnical engineering
problems, a limited amount of test data is often available to help estimate the moments of
parameters of interest (typically strength or permeability). Using standard statistical
techniques, the mean and standard deviation of a set of test results can be used to estimate
the mean (or expected value) and standard deviation of the random variable.
The sample mean is an unbiased estimator of the true or population mean. Hence the best
estimate of a parameter mean is always the mean of a representative data set. However,
with equal likelihood, the sample mean may be greater or less than the true mean, which is
unknown. The mean value measured from a randomly selected data set is normally
distributed about the true mean with a standard deviation equal in magnitude to the
standard error of the mean. This error decreases in inverse proportion to the square root of
the sample size.
The standard deviation of the sample values is a biased estimator of the population
standard deviation. As the uncertainty or variability of values in a large or infinite
population is generally greater than that which is measured in a finite sample, estimating
the population standard deviation requires increasing the sample standard deviation by an
amount which decreases with the square root of the sample size. In other words, the
uncertainty in the value of a property at a random point is somewhat greater than the
standard deviation calculated from a finite number of tests. Sampling and parameter
estimation are further discussed by Harr (1987) and in other statistical texts.
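A sketch of these sample estimates from a hypothetical data set:

```python
import math

# Hypothetical cohesion test results (psf).
tests = [850.0, 920.0, 780.0, 1010.0, 890.0, 950.0]
n = len(tests)

sample_mean = sum(tests) / n   # unbiased estimator of the population mean

# Sample variance with the n - 1 divisor, the usual correction for the
# bias of the finite-sample estimate.
sample_var = sum((x - sample_mean) ** 2 for x in tests) / (n - 1)
sample_std = math.sqrt(sample_var)

# Standard error of the mean: the uncertainty in the estimated mean itself,
# shrinking in inverse proportion to the square root of the sample size.
std_error = sample_std / math.sqrt(n)

print(sample_mean, sample_std, std_error)
```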
Once the moments of the random variable have been estimated as described above, what
one actually has is a measure of the uncertainty in the value that would be measured if
another sample were tested from a random point in the soil. This value may be referred to
as the point value. For example, cohesion of clay samples may be measured to estimate
the mean and standard deviation of cohesion, which represents the cohesion value at a
random point within the same deposit. However, the uncertainty measure required in a
seepage or slope stability analysis is typically not the uncertainty in the value at a random
point, but rather the uncertainty in the average value over some length. This requires that
the variance be further adjusted as discussed later under the heading spatial correlation.
Accounting for spatial correlation generally leads to some reduction in variance. Hence,
estimating an appropriate variance to use in a probabilistic analysis, starting from lab or in
situ test values, involves a two-step correction procedure:
• Increasing the sample variance, to obtain the point variance.
• Decreasing the point variance, to obtain the variance of the spatially averaged
value required in the analysis.
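The two-step procedure can be sketched numerically. The step-1 inflation factor shown is one simple convention (texts differ on the exact form), and the variance reduction factor is purely hypothetical; in practice it comes from spatial correlation theory:

```python
sample_variance = 6400.0   # variance computed from a finite set of tests
n = 6                      # number of tests

# Step 1: inflate the sample variance toward the point variance (one simple
# convention shown here as an assumption; the correction shrinks with n).
point_variance = sample_variance * (1.0 + 1.0 / n)

# Step 2: reduce the point variance to the variance of the value averaged
# over the failure surface, using a variance reduction factor
# (0 < gamma <= 1) obtained from spatial correlation theory.
gamma = 0.5                # hypothetical variance reduction factor
averaged_variance = gamma * point_variance

print(point_variance, averaged_variance)
```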
Some examples of estimating moments for geotechnical parameters of interest to Corps
studies are given in Wolff and Wang (1992); Shannon and Wilson, Inc., and Wolff (1994);
Wolff (1994); and Wolff et al. (1995). However, these examples do not all include
adjustments for spatial correlation effects.
Once the mean and standard deviation of the random variable (either the point value or the
spatially averaged value) have been estimated, and perhaps some other assumptions are
made, a distribution function (e.g., normal or lognormal) can be assumed if desired and the
distribution on the point value can be plotted and visualized.
Typical Coefficients of Variation. Where sitespecific data are not available to
estimate parameters of random variables, uncertainty can be characterized by assuming
that the coefficient of variation of a parameter is similar in magnitude to that observed at
other sites. Typical values of coefficients of variation for soil properties have been
compiled and reported by Harr (1987). Some example values for parameters involved in
stability analysis of gravity monoliths are given by the U.S. Army Corps of Engineers
(1993). Compilations for soil strength, permeability, and other parameters of interest to
Corps’ studies are given in Shannon and Wilson, Inc., and Wolff (1994), and Wolff et al.
(1995). Some recent compilations by others include one for soil properties by Lacasse and
Nadim (1996) and one for in situ test results by Kulhawy and Trautman (1996).
However, care must be taken when using such typical values, as coefficients of variation
alone do not define the correlation structure of soil properties, which are defined over a
continuum and are spatially correlated. This is further described later in this report under
the heading spatial correlation.
Independent and Correlated Random Variables. Independent random variables are
those for which the likelihood of the random variable assuming a specific value does not
depend on the value of any other variable. Where the value of a random variable depends
on the value of another random variable, the two are said to be correlated. Some
examples of random variables that may be correlated are:
• Unit weight and friction angle of sand.
• Preconsolidation pressure and undrained strength of clay.
• The c and φ parameters in a consolidated-undrained strength envelope.
Where random variables are correlated, their probability distributions form a joint
distribution, and one additional moment, the covariance, is necessary to model the
parameters when using secondmoment methods. An alternative way to express the
interdependence is with the correlation coefficient, which relates the covariance to the
variances of the two variables.
Calculation of correlation coefficients is further discussed by Tang (1996), U.S. Army
Corps of Engineers (1992, 1995a, 1995b), and standard statistical texts. Some
investigation into the values of the correlation coefficient between the c and φ parameters
for various soil materials is reported by Wolff (1985), Wolff and Wang (1992), and Wolff
et al. (1995). However, the results are not so consistent as to permit the recommendation
of typical values that could be assumed without statistical analysis on specific data.
The effect of parameter correlation is to increase or decrease the total uncertainty,
depending on whether correlation is positive or negative. Although parameter correlation
can be shown to significantly affect the results of probabilistic analysis, independence of
random variables is often assumed in probabilistic analysis. This may be done for two
reasons: computational simplicity, and the fact that data are often insufficient to make
reliable estimates of the required correlation coefficients.
Spatial Correlation. Random variables that vary continuously over a space or time
domain are referred to as random fields. In a random field, the variable exhibits
autocorrelation, the tendency for values of the variable at one point to be correlated to
values at nearby points. For example, once the soil strength has been measured at some
point, the value at a nearby point (say a few feet away) becomes less uncertain, as it is
highly correlated with the value at the first point. On the other hand,
values measured at considerable distances, say a few hundred feet, may be essentially
independent. To characterize a random field, the mean and standard deviation (or
variance) are required, plus some quantification of the correlation structure. The
correlation structure typically is defined by a correlation function, which models the
reduction in autocorrelation with distance, and a characteristic length or correlation
distance, a parameter which scales the correlation function.
A classic paper introducing spatial correlation concepts to the geotechnical profession was
published by Vanmarcke (1977a). Some recent papers further summarizing the concept
include those by DeGroot (1996), Fenton (1996), Lacasse and Nadim (1996), and Phoon
and Kulhawy (1996). Some aspects of applying spatial correlation theory have been
summarized in a set of simple examples prepared by Wolff (1996c) for the St. Louis
District.
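One common (assumed) model is the exponential correlation function, sketched below with a hypothetical correlation distance:

```python
import math

def exp_correlation(distance, corr_length):
    """Exponential autocorrelation function: rho(d) = exp(-d / delta).
    One common model; the correlation distance delta scales the decay."""
    return math.exp(-distance / corr_length)

delta = 50.0   # hypothetical correlation distance, feet

# Nearby points are strongly correlated; distant points nearly independent.
rho_near = exp_correlation(5.0, delta)    # a few feet away
rho_far = exp_correlation(500.0, delta)   # a few hundred feet away

print(f"rho(5 ft) = {rho_near:.2f}, rho(500 ft) = {rho_far:.2e}")
```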
To date, spatial correlation concepts have generally not been used in Corps’ studies. The
methodology in Corps’ guidance (U.S. Army Corps of Engineers 1992, 1995b), as well as
the related research previously quoted, considers only the expected value and coefficients
of variation of random variables and neglects their spatial correlation structure. This has
been due to several factors:
• The Corps’ methodology has its origin in structural engineering applications,
where coefficients of variation alone are sufficient to model uncertainty from one
member or component to another (i.e., media are not continuous and the
correlation structure need not be quantified).
• The Corps needed to rapidly implement a practical methodology, easily
understood and applied by practitioners; consideration of more advanced
techniques was deferred pending additional research.
• Methodology needed only to be sufficient to make reasonable comparisons of
reliability rather than calculate accurate values.
As the effect of introducing spatial correlation methodology is generally to reduce
variances, it could be said that it is consistent and conservative but technically incorrect to
perform probabilistic analysis without considering spatial correlation.
First-Order Second-Moment (FOSM) Reliability Methods
The primary approach in Corps guidance to date (e.g., U.S. Army Corps of Engineers
1992, 1993, 1995b) has been the use of FOSM methods. In this approach, which is also
the basis for structural design codes, uncertainty in performance is taken to be a function of
uncertainty in model parameters or in the model itself. The expected values and standard
deviations of the random variables (and sometimes model accuracy) are used to estimate
the expected value and standard deviation of a performance function, such as the factor of
safety against slope instability.
The Reliability Index. The usual output of FOSM methods is the reliability index, β.
Given some performance function and limit state, the reliability index is the number of
standard deviations of the performance function by which the expected value of the
performance function exceeds the limit state. The concepts of FOSM methods and the
reliability index are illustrated in Figure 2.
Figure 2. Method of moments - reliability index approach (after Wolff 1996a). [The
figure shows the parameter density f(φ), with moments E[φ] and σφ, propagated by
integration through the slope stability model to the performance-function density f(ln FS),
with moments E[ln FS] and σ ln FS; the distance β·σ ln FS separates E[ln FS] from the
limit state ln(FS) = 0.]
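The computation of β for the common case in which the performance function is ln FS and FS is taken as lognormal can be sketched as follows; the moments used (E[FS] = 1.5 with a 20 percent coefficient of variation) are illustrative assumptions:

```python
import math

def beta_lognormal_fs(mean_fs, cov_fs):
    """Reliability index for performance function ln(FS) with limit state
    ln(FS) = 0, assuming FS is lognormally distributed: beta is the number
    of standard deviations by which E[ln FS] exceeds the limit state."""
    var_ln = math.log(1.0 + cov_fs ** 2)          # variance of ln FS
    mean_ln = math.log(mean_fs) - 0.5 * var_ln    # E[ln FS]
    return mean_ln / math.sqrt(var_ln)

beta = beta_lognormal_fs(1.5, 0.20)   # illustrative moments of FS
```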
The reliability index provides a measure of relative or comparative reliability without
having to assume a probability distribution for the performance function. A complete
distribution would be required to calculate the probability of failure, but its form is
generally unknown. The reliability index concept was popularized in structural code
development, to enable design of structural members to desired levels of relative
reliability, without knowing or having to assume probability distributions for the
performance functions. The concept of relative reliability is supported in early Corps
guidance (U.S. Army Corps of Engineers 1992), which states that the reliability index
values are “sufficiently accurate to rank the relative reliability of various structures and
components, but they are not absolute measures of reliability.” The same ETL suggested
that “Target reliability indices may be established for critical lock and dam components
and performance modes.”
A step-by-step description of FOSM methodology, working from random variables
through to β, is given in Corps guidance (U.S. Army Corps of Engineers 1992, 1993,
1995b) and related research reports (Wolff and Wang 1992; Wolff 1994; Shannon and
Wilson, Inc., and Wolff 1994).
Probability of Failure or Unsatisfactory Performance. Although comparative β
values would be sufficient to rank structures for repair, and target β values would provide a
decision strategy regarding what to repair, the Corps’ economic analysis methodology
requires probability values to permit full development of an event tree and probabilistic
modeling of economic consequences of unsatisfactory performance. In probabilistic
literature, the probability that the performance function is more adverse than the limit state
is termed the probability of failure Pr(f). However, some Corps guidance uses the term
probability of unsatisfactory performance Pr(U) to recognize the fact that the event under
consideration may not be catastrophic. To obtain Pr(f) or Pr(U) from β, a probability
distribution on the performance function must be assumed. A normal distribution is
generally used for ease of calculation; however, the performance function is often then
taken as ln FS (or ln capacity/demand), implying that the factor of safety is lognormally
distributed. Given this assumption and the value of β, the required probability values are
easily calculated from the properties of the assumed distribution.
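Under the normal-distribution assumption on the performance function, the conversion from β to a probability is a one-line calculation (the β value of 3.0 below is illustrative):

```python
import math

def prob_unsatisfactory(beta):
    """Pr(U) = Phi(-beta), where Phi is the standard normal cumulative
    distribution function, expressed via the error function so that no
    statistics library is needed."""
    return 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))

p_u = prob_unsatisfactory(3.0)   # roughly 1.3e-3
```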
Taylor’s Series Mean Value Method. To calculate β, the moments of the
performance function must be calculated from the moments of the parameters. The most
common method used in Corps practice is the Taylor’s series method, based on a Taylor’s
series expansion of the performance function about the expected values. The expected
value of the performance function is obtained by evaluating the function using the
expected values of the parameters. The variance is obtained by summing the products of
the squared partial derivatives of the performance function (taken at the mean parameter values)
and the variances of the corresponding parameters. The detailed equations are given in
Corps guidance (U.S. Army Corps of Engineers 1992, 1995b), Wolff and Wang (1992,
1993), Shannon and Wilson, Inc., and Wolff (1994), and Wolff et al. (1996).
In Corps practice, the required partial derivatives are calculated numerically using an
increment of plus and minus one standard deviation, centered on the expected value. This
specific increment is unique to the Corps (numerical derivatives are often calculated using
very small increments), and was chosen to capture some of the behavior of nonlinear
functions even though the Taylor’s series method is exact only for linear functions. (For a
linear function, any increment will yield the same results.) It also leads to computational
simplicity.
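The procedure can be sketched as follows; the performance function (capacity minus demand) and its parameter moments are illustrative assumptions, not values from Corps guidance:

```python
import math

def taylor_moments(g, means, sigmas):
    """Mean-value Taylor's series (FOSM) moments of a performance
    function g of independent random variables.  E[g] is g at the means;
    Var[g] sums (dg/dx_i * sigma_i)**2, with each partial derivative
    estimated numerically over an increment of plus and minus one
    standard deviation, as in Corps practice."""
    mean_g = g(means)
    var_g = 0.0
    for i, s in enumerate(sigmas):
        hi = list(means); hi[i] += s
        lo = list(means); lo[i] -= s
        dg_dx = (g(hi) - g(lo)) / (2.0 * s)
        var_g += (dg_dx * s) ** 2
    return mean_g, math.sqrt(var_g)

g = lambda x: x[0] - x[1]            # illustrative: capacity minus demand
mean_g, sigma_g = taylor_moments(g, [10.0, 6.0], [1.5, 1.0])
beta = mean_g / sigma_g              # limit state g = 0
```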
Point Estimate Method. An alternative method to the Taylor’s series method is the
point estimate method, developed by Rosenblueth (1975, 1981), and summarized by Harr
(1987). It is also discussed more briefly in Corps guidance (U.S. Army Corps of
Engineers 1992, 1995b) and the related references previously cited. In the point estimate
method, no calculations are made at the mean value, but rather the moments of the
performance function are determined by evaluating it at a set of combinations of high and
low parameter values, with the results weighted by factors. The point estimate method has
been less popular in practice because it requires more evaluations of the performance
function when the number of random variables exceeds two. However, it may better
capture the behavior of nonlinear functions. Some detailed comparisons of the two
methods for a number of real problems are given by Wolff and Wang (1992, 1993), Wolff
et al. (1996), and Wolff (1996a).
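Rosenblueth's procedure, for the simplest case of independent and symmetrically distributed random variables, can be sketched as below; the performance function and parameter moments are illustrative assumptions:

```python
import itertools
import math

def point_estimate_moments(g, means, sigmas):
    """Rosenblueth's point estimate method for independent, symmetrically
    distributed random variables: evaluate g at all 2**n combinations of
    mean plus or minus one standard deviation, weight each result by
    1 / 2**n, and recover the first two moments."""
    n = len(means)
    w = 1.0 / 2 ** n
    first = second = 0.0
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        x = [m + sg * s for m, s, sg in zip(means, sigmas, signs)]
        val = g(x)
        first += w * val
        second += w * val * val
    return first, math.sqrt(second - first ** 2)

g = lambda x: x[0] - x[1]            # illustrative performance function
mean_g, sigma_g = point_estimate_moments(g, [10.0, 6.0], [1.5, 1.0])
```

Note that for a linear performance function the point estimate moments agree exactly with the Taylor's series moments; the methods diverge only when g is nonlinear.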
Hasofer-Lind Method. A potential problem with both the Taylor’s series method
and the point estimate method is their lack of invariance for nonlinear performance
functions. If a performance function and limit state can be expressed in more than one
equivalent way (e.g., Capacity/Demand = 1 or Capacity - Demand = 0), these two
functions will yield different values for the reliability index. Related problems are
computational difficulties in determining derivatives of very nonlinear functions such as
bearing capacity. For example, an analysis in U.S. Army Corps of Engineers
(1993) uses only the mean values of rock strength parameters to circumvent this difficulty.
A more general definition of the reliability index, which is invariant and reduces to the
mean-value definition for linear functions, was developed by Hasofer and Lind (1974). In
their method, the Taylor’s series is expanded, not about the mean or expected value, but
about an unknown point termed the failure point. An iterative solution is required.
Examples of the methodology are given by Ang and Tang (1985). Many published
analyses of geotechnical problems have not used the Hasofer-Lind method, probably due
to its complexity, especially for implicit functions such as those in slope stability analysis.
The use of the mean-value Taylor’s series method or the point estimate method, and
neglect of the invariance problem, introduces error of an unknown magnitude in
probabilistic analyses. The degree of error depends on the degree of nonlinearity in the
performance function and the coefficients of variation of the random variables.
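The invariance problem can be demonstrated numerically: the two equivalent limit states C - D = 0 and C/D = 1 yield different mean-value β values for the same parameter moments (the moments below are illustrative):

```python
import math

def fosm_beta(g, means, sigmas):
    """Mean-value FOSM reliability index for limit state g = 0, with
    partial derivatives estimated over plus/minus one standard deviation."""
    var_g = 0.0
    for i, s in enumerate(sigmas):
        hi = list(means); hi[i] += s
        lo = list(means); lo[i] -= s
        var_g += (((g(hi) - g(lo)) / (2.0 * s)) * s) ** 2
    return g(means) / math.sqrt(var_g)

means, sigmas = [10.0, 6.0], [1.5, 1.0]  # illustrative moments of C and D
beta_margin = fosm_beta(lambda x: x[0] - x[1], means, sigmas)        # C - D = 0
beta_ratio = fosm_beta(lambda x: x[0] / x[1] - 1.0, means, sigmas)   # C/D = 1
# Equivalent limit states, yet the two beta values differ because the
# ratio form is nonlinear in D.
```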
Monte Carlo Simulation
An alternative means to estimate the expected value and standard deviation of the
performance function is the use of simulation methods, often referred to as Monte Carlo
methods or Monte Carlo simulation. In Monte Carlo simulation, values of the random
variables are generated in a fashion consistent with their probability distribution, and the
performance function is calculated for each generated set. The process is repeated
numerous times, typically thousands, and the expected value, standard deviation, and
probability distribution of the performance function are taken to match those of the
calculated values. Advantages of the Monte Carlo method include the following:
• It permits one to estimate the shape of the distribution on the performance
function, permitting more accurate estimation of probability values (however, see
disadvantages below).
• For explicit performance functions, it is easily programmed with simulation
software such as the Excel© add-in @RISK©.
Disadvantages include the following:
• The shapes of the distributions on the random variables must be known or
assumed; hence the distribution obtained for the performance function is only
accurate to the extent that these are accurate.
• Accuracy of the estimated values is proportional to the square root of the number
of iterations; hence doubling the accuracy requires increasing the number of
iterations fourfold.
• Implicit functions requiring special programs (such as slope stability analysis)
require additional special programming for Monte Carlo analysis.
Despite these disadvantages, Monte Carlo analysis is likely to become increasingly
common in lieu of FOSM methods as computing capabilities continue to improve.
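A minimal simulation sketch, with assumed lognormal distributions on an illustrative capacity and demand, shows the essential loop; all distributions and values are assumptions for illustration only:

```python
import math
import random
import statistics

random.seed(1)  # fixed seed so the sketch is reproducible

# Assumed distributions: lognormal capacity (median 10) and lognormal
# demand (median 6), each with sigma_ln = 0.15.  The output distribution
# is only as accurate as these assumed input distributions.
n_iter = 20000
fs_values = []
for _ in range(n_iter):
    capacity = random.lognormvariate(math.log(10.0), 0.15)
    demand = random.lognormvariate(math.log(6.0), 0.15)
    fs_values.append(capacity / demand)

mean_fs = statistics.fmean(fs_values)
sigma_fs = statistics.stdev(fs_values)
p_f = sum(1 for fs in fs_values if fs < 1.0) / n_iter  # Pr(FS < 1)
# The standard error of these estimates falls as 1/sqrt(n_iter), so
# doubling the accuracy requires four times the iterations.
```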
Some Comments on the Use and Meaning of β or Pr(U)
Potential for Overlooking Some Performance Modes. A shortcoming of using only
FOSM or Monte Carlo methods in reliability analysis is the potential for overlooking some
performance modes. Christian (1996) notes that
The analyses leading to computed values of β and pf can include
contributions from only those factors that the analyst has recognized
and incorporated into the calculations. If the analyst has ignored some
important factor, its contribution to the probability of failure will also
be ignored, and the computed value of pf will be correspondingly too
low. A great many slope failures have been found to be due to features
that were overlooked by the designers, or unanticipated factors
introduced during construction.
As FOSM or Monte Carlo methods require characterization of random variables and
selection of performance functions, emphasis may be given to those modes for which this
is easily done. The careful preparation of an event tree by a multidisciplinary team as the
first step in a risk analysis may alleviate this problem as it promotes consideration of all
possible unsatisfactory performance events, whether or not they are easily modeled by
random variables.
Physical Meaning of Probability of Failure for Existing Structures. The probability
of failure (or unsatisfactory performance) value for an existing structure presents
something of a philosophical paradox. As it is a transformation of the uncertainty in
parameter values to uncertainty in performance, its meaning for new structures could be
construed as follows:
Given that there is the specified uncertainty in parameter values before
construction, what is the probability that the value of the performance
function for the asconstructed structure will be to the adverse side of
the limit state?
Hence, the probability values from an FOSM analysis are implied to have a “per structure”
frequency. A probability of failure of 1 in 1000 could be construed to mean that, given
1000 similar structures constructed under independent, but statistically replicate
conditions, one failure would be expected upon first loading of the modeled condition.
For a still-existing structure that has been subjected to a modeled load, it can obviously be
observed that the structure has not failed. Nevertheless, a probability of failure value can
be associated with that event. Hence, the probability of failure calculated for an existing
structure should be construed not as a contradiction of fact, but as a comparative measure
of reliability, suitable for judging the reliability of the structure and considered
performance mode relative to other structures or modes.
Lack of Time Dimension in FOSM Methods. It must be reemphasized that FOSM
methods and β provide a measure of reliability with respect to a load event, but provide no
intrinsic information regarding lifetimes or time-based probabilities of failure or
unsatisfactory performance (U.S. Army Corps of Engineers 1995b). To achieve a
time-based reliability analysis, some other random variable must have a time basis, such as
the load event considered (probability of occurrence per year), pool level or earthquake
acceleration (probability of occurrence per year), or some time-random event (occurrence
of scour or initiation of a seep). FOSM methods can then be used to develop conditional
probabilities to follow the time-based antecedent event in the event tree.
Frequency-based Reliability Methods
In some circumstances, notably where data on actual lifetimes of components are more
accurate, more available, and better understood than parameter uncertainty and
performance functions, and where it is desired to construct hazard functions,
frequency-based reliability methods may be employed to advantage. This is the most common
approach used in designing mechanical, electrical, and electronic parts, for which it is
fairly easy to construct a number of replicate specimens and test them to failure. Such an
observational approach permits direct verification of the distribution of lifetimes without
resort to inferring them from more indirect approaches. For large civil engineering
structures, testing replicate specimens to failure is often out of the question, as structures
are unique and expensive.
A detailed treatment of lifetime distributions is provided by Lewis (1996), Nelson (1982),
and others. The methodology has been developed to considerable levels of sophistication,
although much is built on the Weibull distribution, which permits timevarying hazard
functions, and for which the exponential lifetime distribution of a Poisson process is a
special case.
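The Weibull hazard function, and its reduction to the constant hazard of the exponential (Poisson) case at a shape parameter of 1, can be sketched as follows (parameter values are illustrative):

```python
def weibull_hazard(t, shape, scale):
    """Weibull hazard function h(t) = (k/c) * (t/c)**(k - 1).  For
    shape k = 1 this reduces to the constant hazard 1/c of the
    exponential lifetime distribution (a Poisson process); k > 1 gives
    a hazard that increases with time."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)

h_constant = weibull_hazard(5.0, 1.0, 20.0)    # 0.05 per year at any t
h_wearout_5 = weibull_hazard(5.0, 2.0, 20.0)
h_wearout_10 = weibull_hazard(10.0, 2.0, 20.0) # larger: hazard grows
```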
The modeling of event frequency using the Weibull distribution fit to observed events was
reviewed in the methodology report prepared for the St. Louis District by Shannon and
Wilson, Inc., and Wolff (1994) and some examples were provided. An extended review of
the methodology for certain special cases was prepared by Wolff (1996b). These
techniques were used for certain aspects of the Upper Mississippi River study to develop
hazard functions for performance modes for which FOSM techniques are not easily
applied. They were also used to model the random occurrence of seepage incidents in the
Walter F. George dam study (U.S. Army Engineer District, Mobile 1997).
Subjectively Determined Probability Values
For some probability values required in an event tree, there may be neither sufficient
information (parameter variability and performance function) to employ FOSM methods
nor sufficient reliable historical data of similar events to employ frequencybased methods.
If it is necessary to develop conditional probability values for an event tree under these
circumstances, a final option is to estimate the values based solely on engineering
judgment. Although this may appear tantamount to guessing, there are established ways to
structure the estimation of such values by a panel of experts, moved toward a consensus in
an interactive and iterative exercise involving information sharing and feedback.
Although the use of expert elicitation in Corps’ studies has been limited (e.g., U.S. Army
Engineer District, Mobile 1997), some other agencies and entities owning dams have used
it more commonly than FOSM methods and β values. The application of expert elicitation
to dam safety, with some reference to the methods and problems of establishing subjective
probability values, has been discussed by Vick and Stewart (1996), who draw on more
general research on judgmental probability assessment by those in the behavioral sciences.
They state known problems with the process, such as overconfidence bias, motivational
bias, and problems with cognitive discrimination among extremely low probability values.
Both Vick and Stewart (1996) and Von Thun (1996) provide case histories of such
analyses, the former for Canadian hydropower projects and the latter for a U.S. Bureau of
Reclamation project.
System Reliability
In some cases, it is necessary to establish the reliability of a system given the reliability of
its components. Solutions for simple parallel and series systems are given in Corps
guidance (U.S. Army Corps of Engineers 1992, 1995b). Solutions for more complex
systems can sometimes be obtained by reducing the system to combinations of series and
parallel systems. For some cases of complex and redundant systems, only bounds on the
reliability values can be obtained. System reliability is discussed in more detail in many of
the standard references cited.
For comparative economic analysis for Corps’ investment decisions, the issue of complex
systems has been approached by noting that the reliability of a few critical components
often governs the system. Hence, an analysis of such identified components has generally
been used as the basis for reliability analysis.
An example of simple systems reliability is given by Wolff (1994) for flood control levees.
In that report, it is assumed that the total probability of failure for a levee exposed to a
number of risks can be modeled assuming that the performance modes form an
independent series system.
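The independent series-system assumption reduces to a one-line combination of the mode probabilities; the three mode probabilities below are illustrative, not values from Wolff (1994):

```python
def series_system_pf(mode_probs):
    """Total failure probability of an independent series system: the
    system performs unsatisfactorily if any one mode does, so
    Pf = 1 - product(1 - p_i) over the performance modes."""
    survival = 1.0
    for p in mode_probs:
        survival *= 1.0 - p
    return 1.0 - survival

# Illustrative mode probabilities at a given flood stage, e.g. slope
# instability, underseepage, and through-seepage.
p_system = series_system_pf([0.01, 0.05, 0.02])
```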
Special Issues in Geotechnical Engineering
Some Unique Aspects in Geotechnical Problems. Some geotechnical engineering
problems have a number of unique aspects. These aspects include the following:
• In geotechnical engineering, coefficients of variation are related to the
variability of natural materials, which may need to be assessed on a site-specific basis.
• Geotechnical parameters may have relatively high coefficients of variation (the
value for the coefficient of permeability may exceed 100 percent) and may be
correlated (e.g., c and φ).
• Soil strength parameters can be defined and analyses performed in either a total
stress context or an effective stress context. In the former, the uncertainty in
strength and pore pressure are lumped; in the latter, they are treated separately.
• Soils are continuous media where properties vary from point to point, requiring
consideration of spatial correlation.
• For problems such as slope stability, the location of the critical free body must
be searched out. Furthermore, its location varies with parameter values, and
varying parameter values (in an FOSM or Monte Carlo analysis) results in
different free-body locations for each set of parameter values.
• Although one slip surface may be “critical,” a slope can fail on any of an infinite
number of slip surfaces; hence a slope is a system of possible failure surfaces
which are correlated to some extent.
• Some earth structures, such as levees, may be exceedingly long, sometimes
tens of miles. These can be treated as a number of equivalent
independent structures; however, determining the appropriate length and number
is problematical, and the reliability of the system may be sensitive to the
assumptions made.
Complexities such as those cited above have slowed the adoption of probabilistic methods
in geotechnical engineering, both within and outside the Corps.
Strength Parameters from Triaxial Tests. The parameters c and φ measured from
triaxial tests are not measured uniquely on single samples, but are interpreted from the
results of several tests on replicate samples tested at different confining pressures. Hence,
the determination of probabilistic moments on c and φ from test data is not
straightforward. Ur-Rasul (1995) considered eleven methods to do so. These are
summarized with recommendations by Wolff et al. (1995) and are briefly discussed in
Wolff (1996a).
Free-body and Critical Slip Surface Issues in Slope Stability Analysis. In slope stability
analysis, a large number of free bodies are systematically considered until a critical free
body is found which minimizes the factor of safety. This critical deterministic surface
may not coincide with the critical probabilistic surface. At least three approaches can and
have been considered in assigning a reliability index to a slope:
a. Take the reliability index as that for the critical deterministic surface.
b. For each combination of strength parameters considered in an FOSM or Monte
Carlo analysis, search for the critical slip surface and use the factors of safety for this
set of mixed surfaces to calculate β.
c. Generate candidate slip surfaces, calculate β for each (varying strength
parameters while holding the surface geometry fixed), and systematically search
for the surface of minimum β.
The first approach above will not, in general, provide a reasonable indication of the
reliability of a slope, as there may be other surfaces which give lower β values.
The second approach, sometimes referred to as a floating surface approach, as β is
calculated from results for a number of different surfaces, has been used in several studies,
including Appendix B to the ETL transmitting this Appendix (Wolff 1994), and Shannon
and Wilson, Inc., and Wolff (1994), as it is computationally convenient (results of UTEXAS3
analyses for different strength inputs can be used directly to calculate β) and was
considered to provide a measure of the reliability for the entire slope as a system.
However, it raises a philosophical issue regarding its meaning, as the resulting β value is
not associated with any single free body. As programs become available for the third
approach, it is recommended that it be followed. In the meantime, sufficient surfaces
should be analyzed to ensure that the surface of minimum reliability index has been
located as well as practicable.
Limited research on the third approach, published by Wolff et al. (1995) and further
investigated by Hassan (1996), indicates that calculating β for surfaces of fixed geometry
and systematically searching for a fixed surface of minimum β may locate surfaces with
significantly lower β values than the preceding approaches.
Where the third approach is followed, the reliability index of a slope is commonly taken as
the value corresponding to the slip surface of minimum β. However, a slope is a system
comprising an infinite number of possible slip surfaces, each of which can fail, and each
with a different β. The resulting system is analogous to a large truss, which would have a
system reliability index lower than that of its critical member. The problem is further
complicated because closely spaced slip surfaces are highly correlated. The β for the slip
surface of minimum β is in fact an upper bound on the β value for the slope as a system,
which is not easily determined.
Application of Spatial Correlation Theory to Slope Stability and Seepage Analysis.
As previously noted, soils are random fields (continuous media with spatially correlated
values). Where the correlation distance is shorter than the scale of the free body or cross
section analyzed in a stability or seepage analysis, parameter variances must be reduced to
represent the uncertainty in the average property over the considered cross section. A
more refined approach is to consider that individual slices in a stability analysis or
individual finite elements in a seepage analysis each have random parameter values that
are correlated with those of adjacent slices or elements. The required correlation
coefficients are related to geometric size of the elements and correlation structure of the
media. An introduction to spatial correlation issues is provided by Vanmarcke (1977a,
1977b). A summary and examples with additional references were provided for the St.
Louis District by Wolff (1996c). Neglecting spatial correlation, as is commonly the case
for Corps’ studies, implicitly assumes that the correlation distance is larger in dimension
than the considered section.
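For the particular case of an exponential correlation function, the variance reduction factor for spatial averaging has a closed form; the sketch below assumes that form and illustrative lengths, and other correlation models lead to different reduction factors:

```python
import math

def variance_reduction(avg_length, corr_distance):
    """Variance reduction factor gamma(L) for the average of a random
    field over length L, assuming the exponential correlation model
    rho(d) = exp(-d/d0):
        gamma(L) = 2 * (d0/L)**2 * (L/d0 - 1 + exp(-L/d0)),
    so Var[average over L] = gamma(L) * Var[point value]."""
    x = avg_length / corr_distance
    return (2.0 / x ** 2) * (x - 1.0 + math.exp(-x))

gamma_short = variance_reduction(10.0, 50.0)   # L << d0: little reduction
gamma_long = variance_reduction(500.0, 50.0)   # L >> d0: strong reduction
```

Averaging over a section much longer than the correlation distance sharply reduces the variance of the averaged property, which is why neglecting spatial correlation is conservative.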
Application of Spatial Correlation Theory to Long Earth Structures. A second
consideration of spatial correlation is the natural variability of soil properties in the
direction normal to the two-dimensional cross section analyzed. A slope stability or
seepage analysis made on a two-dimensional section is assumed representative of some
unspecified length of embankment. However, a 1-mile length of levee or embankment,
even on very uniform materials, is less reliable than a 100-ft length of the same
embankment. To calculate the reliability of a long embankment as a series system,
analogous to a chain of independent links, a long section must be converted to a number of
statistically equivalent independent sections. This in turn may require more detailed
knowledge of the correlation structure than is generally available. The problem of slope
failures in long embankments has been considered by Vanmarcke (1977b).
Vrouwenvelder (1987) uses an upper and lower bound system reliability approach and a
correlation length of 500 m in an analysis of Dutch levee systems.
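The equivalent-independent-sections idea can be sketched as a simple series system; the section failure probability and the use of the correlation length as the section length are illustrative assumptions (choosing the number of sections is precisely the problematic step noted above):

```python
def long_embankment_pf(p_section, total_length, section_length):
    """Series-system model of a long embankment: divide it into
    n = total_length / section_length statistically equivalent
    independent sections, each with failure probability p_section,
    and combine them as independent links in a chain."""
    n = max(1, round(total_length / section_length))
    return 1.0 - (1.0 - p_section) ** n

# Illustrative values: p = 0.01 per equivalent section, with sections
# taken 500 ft long.
p_one_mile = long_embankment_pf(0.01, 5280.0, 500.0)
p_100_ft = long_embankment_pf(0.01, 100.0, 500.0)
# The 1-mile reach is markedly less reliable than the 100-ft reach.
```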
Examples of Probabilistic Analysis
A few examples of case histories of geotechnical probabilistic analyses and research
studies are briefly reviewed below to provide the reader a sense of the development of the
methodology and refer the reader to more detailed examples.
Wappapello Dam, St. Louis District. Wolff et al. (1988) reported an analysis by the
St. Louis District of the probability of an earthquakeinduced pool release at Wappapello
Dam in southeastern Missouri. Although the dam is in a seismic area, it also has a
relatively high normal freeboard. The assessment combined the probability of foundation
liquefaction integrated over a range of possible earthquake magnitudes, the probability of
sliding given liquefaction, the probability distribution on slide scarp elevation given
sliding, and the probability of overtopping given slide scarp elevation and pool level.
Shelbyville Dam, St. Louis District. Wolff (1991) reported the results of
comparative probabilistic slope stability analysis for conditions before and after repair of
a slide at Shelbyville Dam. Using the point estimate method, it was demonstrated that
placement of a rock berm significantly reduced the probability of failure.
Research on Navigation Structures for Guidance Development. Wolff and Wang
(1992, 1993) published a set of example analyses and methodology comparisons based on
navigation structures on the Monongahela and Tennessee-Tombigbee river systems.
These included probabilistic characterization of soil and rock strength, comparison of the
Taylor’s series and point estimate methods to calculate β, and an evaluation of the
improvement in reliability achieved by remedial action.
A later report by Shannon and Wilson, Inc., and Wolff (1994) for the St. Louis District
provided a series of examples for sliding and overturning analysis for gravity structures,
slope stability analysis, and various types of seepage analysis. In addition to providing
examples for calculating β values, the report illustrated the fitting of Weibull distributions
to historical events to obtain hazard functions.
Research on Levees for Guidance Development. Wolff (1994) provided a set of
examples for levee reliability analysis considering a variety of failure modes. Conditional
probability-of-unsatisfactory-performance functions were developed as functions of
floodwater elevations, and the resulting functions were combined assuming the various
modes form a simple series system. The complete report accompanies this ETL as
Appendix B.
Hodges Village Dam, New England Division. A probabilistic assessment of seepage
problems at the Hodges Village Dam was prepared by the New England Division (U.S.
Army Engineer Division, New England 1995). Hodges Village Dam is a normally dry
flood control dam built on very pervious sands and gravels without a cutoff. Residential
development is present adjacent to the toe of the dam. During past high-water events,
extensive seepage with damaging erosion has occurred, and the potential for a safety
problem at higher water levels was of concern. Although the nature of the problem
permitted a decision to remediate without a probabilistic assessment, one was performed
in support of the economic studies. Similar to the approach outlined in the levee research
described in the preceding paragraph, a stage-exceedance probability function was used to
develop probability values for annual high pool elevations. The conditional probability of
exit gradients in excess of critical values, given pool level, was calculated using
probabilistic seepage analyses. The random variables in the analyses were the
permeability ratios of subsurface strata. A range of permeability ratios was determined
within which the seepage model could be calibrated to match past events; a probability
distribution on the “true permeability ratio” was fit to span that range.
Walter F. George Dam, Mobile District. A second risk analysis involving seepage
problems was performed for the Walter F. George Dam by the Mobile District (U.S. Army
Engineer District, Mobile 1997). Unlike the Hodges Village Dam, for which a
finite-element analysis could be performed to calculate gradients in pervious soils, seepage
through the foundation at the Walter F. George project occurs in solutioned limestone, and
uncontrolled seepage events have occurred at seemingly random locations on random
occasions unrelated to pool level. These events have been repaired by exploring the lake
area for seepage inlets and plugging them with concrete or grout. Having no situation
readily amenable to analytical modeling, the risk assessment was performed using a
combination of frequencybased reliability methods fit to historical events and subjectively
determined probability values based on expert elicitation. Given the set of historical
events, annual probabilities of new events were taken to be increasing with time due to the
continued solutioning of the limestone. The expert panel estimated probability values for
future seeps occurring at various locations, for locating the source of the seep in sufficient
time, for being able to repair the seeps given that they are located, and for various
structural consequences of uncontrolled seepage.
Summary
Probabilistic methods are being used by the Corps of Engineers for risk analysis in support
of economic planning studies for project rehabilitation, and are being considered for other
applications. The framework of such risk analysis is an event tree, a pictorial
representation of a system of possible events and outcomes connected by conditional
probability values. The required probability values can be obtained by three approaches.
The first of these, based on parameter uncertainty and performance functions, has been the
most widely used to date. The second, based on fitting probability distributions to
historical events, has some advantages where knowledge of such events is more complete
than knowledge of parameter uncertainty and performance functions. The third approach,
subjective estimation of probability values by expert elicitation, has had only limited
application in the Corps, but has been used by some other agencies. Corps guidance, other
publications providing details of all three of these approaches, and example case histories
have been reviewed and a number of references have been provided to give the reader a
broad perspective on the state of risk analysis in geotechnical engineering.
Current Corps’ guidance for probabilistic analysis has a good experience record given the
short time it has been in use and the rapid rate at which it was put into practice.
However, geotechnical engineering problems have a number of unique aspects not yet
fully treated in such guidance. Many of these center on the fact that soils and rock are
continuous media rather than discrete members, and the fact that soils and rock are natural
materials rather than constructed or manufactured materials. Notable among these are
characterization of strength parameters, spatial correlation considerations, and system
reliability of slopes. Additional refinements to the methodology will need to be developed
in the future as the need to perform risk analyses of geotechnical problems continues and
experience with the techniques is gained.
References
Ang, A. H.-S., and Tang, W. H. (1975), Probability Concepts in Engineering Planning
and Design, Volume I: Basic Principles, John Wiley and Sons, New York.
Ang, A. H.-S., and Tang, W. H. (1985), Probability Concepts in Engineering Planning
and Design, Volume II: Decision, Risk and Reliability, John Wiley and Sons, New
York.
Benjamin, J., and Cornell, C. A. (1970), Probability, Statistics and Decision for Civil
Engineers, McGraw-Hill, New York.
Christian, J. T. (1996), “Reliability Methods for Stability of Existing Slopes,” in
Uncertainty in the Geologic Environment: From Theory to Practice, Proceedings of
Uncertainty ’96, ASCE Geotechnical Special Publication No. 58, C.D. Shackelford,
P.P. Nelson, and M.J.S. Roth, eds., pp. 409-418.
Christian, J. T., Ladd, C. C., and Baecher, G. B. (1992), “Reliability and Probability in
Stability Analysis,” Proceedings of Specialty Conference on Stability and
Performance of Slopes and Embankments-II, ASCE, Vol. 2, pp. 1071-1111.
Christian, J. T., Ladd, C. C., and Baecher, G. B. (1994), “Reliability Applied to Slope
Stability Analysis,” Journal of Geotechnical Engineering, ASCE, Vol. 120, No. 12,
pp. 2180-2207.
DeGroot, D. J. (1996), “Analyzing Spatial Variability of In-Situ Soil Properties,” in
Uncertainty in the Geologic Environment: From Theory to Practice, Proceedings of
Uncertainty ’96, ASCE Geotechnical Special Publication No. 58, C.D. Shackelford,
P.P. Nelson, and M.J.S. Roth, eds., pp. 210-238.
Ellingwood, B., Galambos, T. V., MacGregor, J., and Cornell, C. A. (1980),
“Development of a Probability-Based Load Criterion for American National
Standard A58,” NBS Special Publication 577, National Bureau of Standards,
Washington, DC.
Fenton, G. A. (1996), “Data Analysis/Geostatistics,” Chapter 4 of Probabilistic Methods
in Geotechnical Engineering, notes from workshop presented at ASCE Uncertainty
’96 Conference, Madison, WI, July 31, 1996, sponsored by ASCE Geotechnical
Safety and Reliability Committee, G. A. Fenton, ed.
Freudenthal, A. M. (1947), “The Safety of Structures,” ASCE Transactions, Vol. 112,
pp. 125-159.
Gilbert, R. B. (1996), “Basic Random Variables,” Chapter 2 of Probabilistic Methods in
Geotechnical Engineering, notes from workshop presented at ASCE Uncertainty ’96
Conference, Madison, WI, July 31, 1996, sponsored by ASCE Geotechnical Safety
and Reliability Committee, G. A. Fenton, ed.
Hahn, G. J., and Shapiro, S. S. (1967), Statistical Models in Engineering, John Wiley and
Sons, New York.
Harr, M. E. (1987), ReliabilityBased Design in Civil Engineering, McGrawHill, New
York.
Hassan, A. (1996). Personal Communication.
Hasofer, A. M., and Lind, N. C. (1974), “An Exact and Invariant Second-Moment Code
Format,” Journal of the Engineering Mechanics Division, ASCE, Vol. 100, pp. 111-121.
Kulhawy, F., and Trautman, C. H. (1996), “Estimation of In-Situ Test Uncertainty,” in
Uncertainty in the Geologic Environment: From Theory to Practice, Proceedings of
Uncertainty ’96, ASCE Geotechnical Special Publication No. 58, C.D. Shackelford,
P.P. Nelson, and M.J.S. Roth, eds., pp. 269-286.
Lacasse, S., and Nadim, F. (1996), “Uncertainties in Characterizing Soil Properties,” in
Uncertainty in the Geologic Environment: From Theory to Practice, Proceedings of
Uncertainty ’96, ASCE Geotechnical Special Publication No. 58, C.D. Shackelford,
P.P. Nelson, and M.J.S. Roth, eds., pp. 49-75.
Lewis, E. E. (1996), Introduction to Reliability Engineering, John Wiley and Sons, New
York.
Moses, F. and Verma, D. (1987), “Load Capacity Evaluation of Existing Bridges,”
National Cooperative Highway Research Program Report 310, Transportation
Research Board, National Research Council, Washington, DC.
National Research Council (1995), Probabilistic Methods in Geotechnical Engineering,
National Academy Press, Washington, DC.
Nelson, W. (1982), Applied Life Data Analysis, John Wiley and Sons, NY.
Phoon, K. K., and Kulhawy, F. H. (1996), “On Quantifying Inherent Soil Variability,” in
Uncertainty in the Geologic Environment: From Theory to Practice, Proceedings of
Uncertainty ’96, ASCE Geotechnical Special Publication No. 58, C.D. Shackelford,
P.P. Nelson, and M.J.S. Roth, eds., pp. 326-340.
Rosenblueth, E. (1975), “Point Estimates for Probability Moments,” Proceedings of the
National Academy of Sciences, USA, 72(10).
Rosenblueth, E. (1981), “Two-Point Estimates in Probabilities,” Applied Mathematical
Modelling, Vol. 5.
Shannon and Wilson, Inc., and Wolff, T. F. (1994), “Probability Models for Geotechnical
Aspects of Navigation Structures,” report to the St. Louis District, U. S. Army Corps
of Engineers.
Tang, W. H. (1996), “Correlation, Multiple RV’s and System Reliability,” Chapter 3 of
Probabilistic Methods in Geotechnical Engineering, notes from workshop presented
at ASCE Uncertainty ’96 Conference, Madison, WI, July 31, 1996, sponsored by
ASCE Geotechnical Safety and Reliability Committee, G. A. Fenton, ed.
U.S. Army Corps of Engineers (1992), “Reliability Assessment of Navigation Structures,”
ETL 1110-2-532, 1 May 1992.
U.S. Army Corps of Engineers (1993), “Reliability Assessment of Navigation Structures:
Stability Assessment of Gravity Structures,” ETL 1110-2-321, 31 December 1993.
U.S. Army Corps of Engineers (1995a), “Reliability Assessment of Pile-Founded
Navigation Structures,” ETL 1110-2-354, 31 August 1995.
U.S. Army Corps of Engineers (1995b), “Introduction to Probability and Reliability
Methods for Use in Geotechnical Engineering,” ETL 1110-2-547, 30 September
1995.
U.S. Army Engineer Division, New England (1995), “Hodges Village Dam, Major
Rehabilitation Evaluation Report,” June 1995.
U.S. Army Engineer District, Mobile (1997), “Walter F. George Lock and Dam, Major
Rehabilitation Evaluation Report, Prevention of Potential Structural Failure,” March
1997.
Ur-Rasul, I. (1995), “Characterizing Soil Strength for Probabilistic Analysis Working from
Test Results: A Practical Approach,” MS thesis, Michigan State University,
December 1995.
Vanmarcke, E. H. (1977a), “Probabilistic Modeling of Soil Profiles,” Journal of the
Geotechnical Engineering Division, ASCE, Vol. 103, No. GT11, November 1977,
pp. 1227-1246.
Vanmarcke, E. H. (1977b), “Reliability of Earth Slopes,” Journal of the Geotechnical
Engineering Division, ASCE, Vol. 103, No. GT11, November 1977, pp. 1247-1265.
Vick, S. G., and Stewart, R. A. (1996), “Risk Analysis in Dam Safety Practice,” in
Uncertainty in the Geologic Environment: From Theory to Practice, Proceedings of
Uncertainty ’96, ASCE Geotechnical Special Publication No. 58, C.D. Shackelford,
P.P. Nelson, and M.J.S. Roth, eds., pp. 586-601.
Von Thun, J. L. (1996), “Risk Assessment of Nambe Falls Dam,” in Uncertainty in the
Geologic Environment: From Theory to Practice, Proceedings of Uncertainty ’96,
ASCE Geotechnical Special Publication No. 58, C.D. Shackelford, P.P. Nelson, and
M.J.S. Roth, eds., pp. 604-622.
Vrouwenvelder, A.C.W.M. (1987), “Probabilistic Design of Flood Defenses,” Report No.
B-87-404, IBBC-TNO (Institute for Building Materials and Structures of the
Netherlands Organization for Applied Scientific Research), The Netherlands.
Whitman, R. V. (1996), “Organizing and Evaluating Uncertainty in Geotechnical
Engineering,” in Uncertainty in the Geologic Environment: From Theory to
Practice, Proceedings of Uncertainty ’96, ASCE Geotechnical Special Publication
No. 58, C.D. Shackelford, P.P. Nelson, and M.J.S. Roth, eds., pp. 1-28.
Wolff, T. F. (1985), “Analysis and Design of Embankment Dam Slopes: A Probabilistic
Approach,” thesis submitted to the faculty of Purdue University in partial fulfillment
of the requirements for the degree of Doctor of Philosophy.
Wolff, T. F. (1987), “Slope Design for Earth Dams,” in Reliability and Risk Analysis in
Civil Engineering 2, Proceedings of the Fifth International Conference on
Applications of Statistics and Probability in Soil and Structural Engineering,
Vancouver, BC, Canada, pp. 725-732.
Wolff, T. F. (1991), “Embankment Design Versus Factor of Safety: Before and After Slide
Repair,” International Journal of Numerical and Analytical Methods in
Geotechnical Engineering, Vol. 15, No. 1, pp. 41-50.
Wolff, T. F. (1994), “Evaluating the Reliability of Existing Levees,” research report
prepared for U.S. Army Engineer Waterways Experiment Station, Michigan State
University, September 1994.
Wolff, T. F. (1995), Probabilistic Methods in Engineering Analysis and Design, notes
from short course for the Jacksonville District, U.S. Army Corps of Engineers,
March 1995.
Wolff, T. F. (1996a), “Probabilistic Slope Stability in Theory and Practice,” in
Uncertainty in the Geologic Environment: From Theory to Practice, Proceedings of
Uncertainty ’96, ASCE Geotechnical Special Publication No. 58, C.D. Shackelford,
P.P. Nelson, and M.J.S. Roth, eds., pp. 419-433.
Wolff, T. F. (1996b), “Application of Time-based Reliability Analysis to Corps of
Engineers’ Geotechnical Engineering Problems,” report prepared for St. Louis
District, U.S. Army Corps of Engineers, under subcontract to Shannon and Wilson,
Inc., St. Louis, MO, November 1996.
Wolff, T. F. (1996c), “Application of Spatial Averaging and Spatial Correlation to Corps
of Engineers’ Geotechnical Engineering Problems,” report prepared for St. Louis
District, U.S. Army Corps of Engineers, under subcontract to Shannon and Wilson,
Inc., St. Louis, MO, December 1996.
Wolff, T. F., Demsky, E. C., Schauer, J., and Perry, E. (1996), “Reliability Assessment of
Dam and Levee Embankments,” in Uncertainty in the Geologic Environment: From
Theory to Practice, Proceedings of Uncertainty ’96, ASCE Geotechnical Special
Publication No. 58, C.D. Shackelford, P.P. Nelson, and M.J.S. Roth, eds., pp. 636-650.
Wolff, T. F., Hassan, A., Khan, R., Ur-Rasul, I., and Miller, M. (1995), “Geotechnical
Reliability of Dam and Levee Embankments,” Report for the U.S. Army Engineer
Waterways Experiment Station, by Michigan State University, September 1995.
Wolff, T. F., Hempen, G. L., Dirnberger, M. M., and Moore, B. H. (1988), “Probabilistic
Analysis of Earthquake-Induced Pool Release,” Second International Conference
on Case Histories in Geotechnical Engineering, University of Missouri-Rolla, pp.
787-794.
Wolff, T. F., and Wang, W. (1992), “Engineering Reliability of Navigation Structures,”
research report, Michigan State University, for U.S. Army Engineer Waterways
Experiment Station, Vicksburg, MS.
Wolff, T. F., and Wang, W. (1993), “Reliability Analysis of Navigation Structures,” in
Geotechnical Practice in Dam Rehabilitation, Specialty Conference, Geotechnical
Engineering Division, ASCE, North Carolina State University, Raleigh, North
Carolina, ASCE Geotechnical Specialty Publication No. 35, pp. 159-173.
Wu, T. H. (1996), “Events, Fault Trees, Bayes’ Theorem,” Chapter 1 of Probabilistic
Methods in Geotechnical Engineering, notes from workshop presented at ASCE
Uncertainty ’96 Conference, Madison, WI, July 31, 1996, sponsored by ASCE
Geotechnical Safety and Reliability Committee, G. A. Fenton, ed.