Discriminant Analysis Lecture

Discriminant Analysis

Discrimination and classification are multivariate techniques concerned with separating distinct sets of observations (objects) and with allocating new observations (objects) to previously defined groups. The goal of discrimination is to describe, either graphically (in three or fewer dimensions) or algebraically, the differential features of observations from several known populations. We try to find "discriminants" whose numerical values separate the collections as much as possible.

Classification into one of several populations is a multivariate problem in that, for each experimental unit, a number of possibly correlated responses have been collected. A single one of these responses, by itself, may not describe the population adequately; what is needed is a single index, or a well-defined criterion, that can be used as a classification rule. Since no such decision rule is expected to be perfect, there will always be certain probabilities of misclassification. These are either to be controlled or to be minimized.

Discrimination for two multivariate normal populations

The likelihood rule. Choose \Pi_1 when

    L(x; \mu_1, \Sigma_1) > L(x; \mu_2, \Sigma_2)

The linear discriminant function rule (Fisher's approach). Choose \Pi_1 if

    b'x - k > 0

and choose \Pi_2 otherwise.

Discriminant Analysis Lecture .doc - 1 -
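The likelihood rule above can be sketched in a few lines of numpy. All parameter values below are made up for illustration; they are not from the lecture.

```python
import numpy as np

# Hypothetical population parameters for Pi_1 and Pi_2 (illustrative only)
mu1, mu2 = np.array([0.0, 0.0]), np.array([2.0, 2.0])
Sigma1 = np.array([[1.0, 0.3], [0.3, 1.0]])
Sigma2 = np.array([[1.5, 0.0], [0.0, 1.5]])

def log_likelihood(x, mu, Sigma):
    """Log density of a multivariate normal N(mu, Sigma) at x."""
    d = np.asarray(x, dtype=float) - mu
    k = len(mu)
    _, logdet = np.linalg.slogdet(Sigma)
    return -0.5 * (k * np.log(2 * np.pi) + logdet + d @ np.linalg.solve(Sigma, d))

def likelihood_rule(x):
    """Choose Pi_1 (return 1) when L(x; mu1, Sigma1) > L(x; mu2, Sigma2)."""
    return 1 if log_likelihood(x, mu1, Sigma1) > log_likelihood(x, mu2, Sigma2) else 2

print(likelihood_rule([0.1, -0.2]))  # a point near mu1 -> 1
print(likelihood_rule([2.2, 1.9]))   # a point near mu2 -> 2
```

Comparing log-likelihoods rather than likelihoods avoids underflow and gives the same decision, since log is monotone.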
The Mahalanobis distance rule. Compute

    d_i = (x - \hat\mu_i)' \hat\Sigma^{-1} (x - \hat\mu_i)

Choose \Pi_1 when d_1 < d_2.

The posterior probability rule.

    P(\Pi_i \mid x) = \frac{\exp(-\tfrac{1}{2} d_i)}{\exp(-\tfrac{1}{2} d_1) + \exp(-\tfrac{1}{2} d_2)}

Choose \Pi_1 if P(\Pi_1 \mid x) > P(\Pi_2 \mid x).
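Both rules can be sketched directly from the formulas, again with hypothetical estimated parameters (not from the lecture) and a common covariance matrix:

```python
import numpy as np

# Hypothetical estimated means and common covariance (illustrative only)
mu1_hat, mu2_hat = np.array([0.0, 0.0]), np.array([3.0, 1.0])
Sigma_hat = np.array([[1.0, 0.2], [0.2, 1.0]])
Sigma_inv = np.linalg.inv(Sigma_hat)

def mahalanobis_sq(x, mu):
    """d_i = (x - mu_i)' Sigma^{-1} (x - mu_i)."""
    d = np.asarray(x, dtype=float) - mu
    return float(d @ Sigma_inv @ d)

def posterior(x):
    """P(Pi_i | x) built from the two Mahalanobis distances."""
    d1, d2 = mahalanobis_sq(x, mu1_hat), mahalanobis_sq(x, mu2_hat)
    e1, e2 = np.exp(-0.5 * d1), np.exp(-0.5 * d2)
    return e1 / (e1 + e2), e2 / (e1 + e2)

p1, p2 = posterior([0.5, 0.2])
print(p1 > p2)  # the point is closer to mu1_hat, so P(Pi_1 | x) wins
```

With equal covariances and equal priors, the distance rule and the posterior rule give the same allocation: the smaller d_i always yields the larger posterior.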
Sample Discriminant Rules

Let's assume two populations \Pi_1 and \Pi_2 with unknown parameters \mu_1, \mu_2, \Sigma_1, \Sigma_2 estimated by \hat\mu_1, \hat\mu_2, \hat\Sigma_1, \hat\Sigma_2. The pooled covariance estimate is

    \hat\Sigma = \frac{(N_1 - 1)\hat\Sigma_1 + (N_2 - 1)\hat\Sigma_2}{N_1 + N_2 - 2}

The Linear Discriminant Function Rule (Fisher's approach): The function \hat b' x is a linear decision rule in x (i.e., a linear combination of the components of x). This linear combination of the variables produces "maximally different" discriminant scores among groups. Also let's say

    \hat y = \hat b' x, \quad \text{where } \hat b = \hat\Sigma^{-1}(\hat\mu_1 - \hat\mu_2)

Allocate to \Pi_1 if \hat b' x - \hat k > 0 and choose \Pi_2 otherwise, where

    \hat k = \frac{1}{2}\,\hat b'(\hat\mu_1 + \hat\mu_2)
           = \frac{1}{2}(\hat\mu_1 - \hat\mu_2)'\hat\Sigma^{-1}(\hat\mu_1 + \hat\mu_2)
           = \frac{1}{2}(\bar y_1 + \bar y_2)
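The full sample rule (estimate the means, pool the covariances, form \hat b and \hat k, then allocate) can be sketched end to end. The training data below are simulated for illustration; the group means and sample sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated training samples for Pi_1 and Pi_2 (illustrative data)
X1 = rng.multivariate_normal([0.0, 0.0], np.eye(2), size=50)
X2 = rng.multivariate_normal([3.0, 1.0], np.eye(2), size=60)

# Sample estimates of the means and covariances
mu1_hat, mu2_hat = X1.mean(axis=0), X2.mean(axis=0)
N1, N2 = len(X1), len(X2)
S1, S2 = np.cov(X1, rowvar=False), np.cov(X2, rowvar=False)

# Pooled covariance: ((N1-1) S1 + (N2-1) S2) / (N1 + N2 - 2)
Sigma_pooled = ((N1 - 1) * S1 + (N2 - 1) * S2) / (N1 + N2 - 2)

# Fisher's coefficients and cutoff
b_hat = np.linalg.solve(Sigma_pooled, mu1_hat - mu2_hat)  # Sigma^{-1}(mu1 - mu2)
k_hat = 0.5 * b_hat @ (mu1_hat + mu2_hat)                 # midpoint of the mean scores

def classify(x):
    """Allocate to Pi_1 (1) if b'x - k > 0, else Pi_2 (2)."""
    return 1 if b_hat @ np.asarray(x, dtype=float) > k_hat else 2

print(classify([0.0, 0.0]), classify([3.0, 1.0]))
```

Note that \hat k is just the midpoint of the two mean discriminant scores \bar y_1 and \bar y_2, so the rule allocates a point to whichever group's mean score it is closer to.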
To find the linear combination that produces "maximally different" scores y, Fisher proposed choosing the linear combination that maximizes the ratio of the between-group sum of squares (between-samples variability) to the within-group sum of squares (within-sample variability) of the discriminant scores.
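The ratio Fisher maximizes can be computed directly: project both groups onto a direction b, then divide the between-group sum of squares of the scores by the within-group sum of squares. The sketch below, on simulated two-group data, compares the ratio along Fisher's direction \hat\Sigma^{-1}(\bar x_1 - \bar x_2) with the ratio along an arbitrary coordinate axis.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two simulated groups sharing a common covariance (illustrative data)
X1 = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=40)
X2 = rng.multivariate_normal([2.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=40)

def bw_ratio(b):
    """Between-group SS / within-group SS of the scores y = b'x."""
    y1, y2 = X1 @ b, X2 @ b
    grand = np.concatenate([y1, y2]).mean()
    between = len(y1) * (y1.mean() - grand) ** 2 + len(y2) * (y2.mean() - grand) ** 2
    within = ((y1 - y1.mean()) ** 2).sum() + ((y2 - y2.mean()) ** 2).sum()
    return between / within

# Fisher's direction: pooled covariance inverse times the mean difference
S1, S2 = np.cov(X1, rowvar=False), np.cov(X2, rowvar=False)
Sp = ((len(X1) - 1) * S1 + (len(X2) - 1) * S2) / (len(X1) + len(X2) - 2)
b_fisher = np.linalg.solve(Sp, X1.mean(axis=0) - X2.mean(axis=0))

# Fisher's b attains the maximum ratio over all directions
print(round(bw_ratio(b_fisher), 3), round(bw_ratio(np.array([1.0, 0.0])), 3))
```

Because the ratio is invariant to rescaling b, only the direction matters; for two groups the maximizing direction is proportional to W^{-1}(\bar x_1 - \bar x_2), which is the same direction as \hat\Sigma^{-1}(\bar x_1 - \bar x_2).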

This note was uploaded on 12/26/2010 for the course CPSC 499 taught by Professor Staff during the Spring '08 term at University of Illinois, Urbana Champaign.
