DATA MINING
Susan Holmes © Stats202, Lecture 15, Fall 2010

[Title-slide illustration: two groups labelled A and B containing points a through l]
Special Announcements

- All requests should be sent to [email protected].
- Homework: the deadline is Tuesday 5:00pm; homework not submitted by the deadline is rejected (we have an automatic system). Please don't forget to add your SUNet ID to your homework file name (at the end).
- Midterm: you may bring a one-page cheat sheet; no cellphones, no laptops.
Last Time: Alternative Classification Methods

- Rule-based methods.
- Instance-based methods and nearest neighbors (kNN).

Today: Discriminant Analysis, for continuous explanatory variables only.
Discrimination for Continuous Explanatory Variables

Discriminant functions are the essence of the output from a discriminant analysis. They are the linear combinations of the standardised independent variables which yield the biggest mean differences between the groups. If the response is a dichotomy (only two classes to be predicted), there is one discriminant function; if the response variable has k levels (i.e. there are k classes to predict), up to k - 1 discriminant functions can be extracted, and we can test how many are worth extracting.
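
As a quick illustration (my own sketch, not from the lecture; it assumes scikit-learn and the iris data as stand-ins), fitting a linear discriminant analysis to a problem with k = 3 classes yields at most k - 1 = 2 discriminant functions:

    # Hypothetical example: the number of discriminant functions is at most k - 1.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    X, y = load_iris(return_X_y=True)                 # 4 continuous predictors, k = 3 classes
    lda = LinearDiscriminantAnalysis(n_components=2)  # at most k - 1 = 2 functions can be requested
    scores = lda.fit_transform(X, y)                  # n x 2 matrix of discriminant scores

    print(scores.shape)                      # (150, 2): two discriminant functions
    print(lda.explained_variance_ratio_)     # share of between-class variance captured by each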
Discriminant Functions

Successive discriminant functions are orthogonal to one another, like principal components, but they are not the same as the principal components you would obtain by doing a principal components analysis on the independent variables: they are constructed to maximise the differences between the classes of the response, i.e. the between-class variance rather than the total variance. The initial input data do not have to be centered or standardized before the analysis, as they do in principal components; the outcome of the discriminant analysis is not affected by the scaling.
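
To see the difference in practice, the small sketch below (my own illustration, assuming scikit-learn and the iris data; not code from the lecture) fits both methods on the same data and compares the leading directions:

    # Hypothetical example: the first discriminant direction is not the first
    # principal component, because LDA maximises between-class variance while
    # PCA maximises total variance.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    X, y = load_iris(return_X_y=True)

    pca_dir = PCA(n_components=1).fit(X).components_[0]               # direction of maximal total variance
    lda_dir = LinearDiscriminantAnalysis().fit(X, y).scalings_[:, 0]  # first discriminant direction

    # |cosine| between the two directions; a value below 1 means they differ.
    cos = abs(pca_dir @ lda_dir) / (np.linalg.norm(pca_dir) * np.linalg.norm(lda_dir))
    print(cos)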
Discriminant Functions

A discriminant function, also called a canonical root, is a latent variable created as a linear combination of the discriminating (independent) variables:

L = b_1 x_1 + b_2 x_2 + ... + b_p x_p + c,

where the b's are discriminant coefficients, the x's are discriminating variables, and c is a constant. This is similar to multiple regression, but the b's are discriminant coefficients which maximize the distance between the means of the criterion (dependent) variable. Note that the foregoing assumes the discriminant function is estimated by ordinary least squares, the traditional method; there is also a version based on maximum likelihood estimation.
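
A minimal sketch of this linear form (an assumption of mine, relying on scikit-learn's svd solver, whose scalings_ and xbar_ attributes give the coefficients and the centring point; this is not the lecture's own code) checks that the first discriminant score really is b_1 x_1 + ... + b_p x_p + c:

    # Hypothetical example: the discriminant score as an explicit linear combination.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    X, y = load_iris(return_X_y=True)
    lda = LinearDiscriminantAnalysis().fit(X, y)   # default solver='svd'

    b = lda.scalings_[:, 0]    # discriminant coefficients b_1, ..., b_p
    c = -lda.xbar_ @ b         # constant c, coming from centring at the overall mean
    L = X @ b + c              # scores of the first discriminant function

    # Should match the first column of lda.transform(X)
    print(np.allclose(L, lda.transform(X)[:, 0]))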
Least Squares Method of Estimation of Discriminant Functions

The variance-covariance matrix can be decomposed into two parts, the variance within each class and the variability between classes; equivalently, we can decompose the matrix of sums of squares and cross products (the same up to a constant factor):

T = B + W

T = X^t (I_n - P_{1_n}) X    (total)
B = X^t (P_G - P_{1_n}) X    (between-class)
W = X^t (I_n - P_G) X        (within-class)

Here I_n is the identity matrix, P_{1_n} is the orthogonal projection onto the space spanned by 1_n (i.e. P_{1_n} = 1_n 1_n^t / n), so that (I_n - P_{1_n}) X is the matrix of centered cases, and P_G is the projection onto the space spanned by the class-indicator variables.
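
The decomposition can be verified numerically. The sketch below (my own, using numpy and the iris data as stand-ins; not code from the lecture) builds the projectors P_{1_n} and P_G explicitly and checks that T = B + W:

    # Hypothetical example: T = B + W via the projection matrices.
    import numpy as np
    from sklearn.datasets import load_iris

    X, y = load_iris(return_X_y=True)
    n = X.shape[0]

    G = np.eye(3)[y]                        # n x k indicator matrix of class membership
    P1 = np.ones((n, n)) / n                # projection onto the span of 1_n
    PG = G @ np.linalg.inv(G.T @ G) @ G.T   # projection onto the span of the class indicators

    T = X.T @ (np.eye(n) - P1) @ X          # total SSCP
    B = X.T @ (PG - P1) @ X                 # between-class SSCP
    W = X.T @ (np.eye(n) - PG) @ X          # within-class SSCP

    print(np.allclose(T, B + W))            # True: the decomposition holds exactly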