Logistic Regression, Prediction and ROC

Different choices of cut-off probability classify very different numbers of training observations as positive (TRUE):

...
##
## FALSE
##  4500
table(predict(credit.glm1, type = "response") > 0.2)
##
## FALSE  TRUE
##  4370   130
table(predict(credit.glm1, type = "response") > 1e-04)
##
## TRUE
## 4500

In-sample and out-of-sample prediction

In-sample (performance on training set)

Suppose the cut-off probability is chosen as 0.2. The second statement below generates a logical vector (TRUE or FALSE) indicating whether each observation in the training set has a fitted probability greater than 0.2. The third statement transforms the logical vector to numeric (0 or 1).

prob.glm1.insample <- predict(credit.glm1, type = "response")
predicted.glm1.insample <- prob.glm1.insample > 0.2
predicted.glm1.insample <- as.numeric(predicted.glm1.insample)

Next we look at the confusion matrix; dnn is used to label the rows and columns:

table(credit.train$Y, predicted.glm1.insample, dnn = c("Truth", "Predicted"))
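The confusion matrix can be summarized by the in-sample misclassification rate. The one-liner below is a sketch that is not shown above; it assumes the response credit.train$Y is coded 0/1, matching the predicted classes.

# In-sample misclassification rate: fraction of training observations whose
# predicted class disagrees with the observed class (assumes Y is coded 0/1)
mean(credit.train$Y != predicted.glm1.insample)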
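Out-of-sample (performance on testing set): the preview ends before this part, so the lines below are only a sketch. They assume the hold-out set is named credit.test (that name is an assumption) and reuse the same 0.2 cut-off.

# Predicted probabilities and classes for the hold-out set (credit.test is assumed)
prob.glm1.outsample <- predict(credit.glm1, newdata = credit.test, type = "response")
predicted.glm1.outsample <- as.numeric(prob.glm1.outsample > 0.2)
# Out-of-sample confusion matrix, labeled the same way as the in-sample one
table(credit.test$Y, predicted.glm1.outsample, dnn = c("Truth", "Predicted"))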
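For the ROC part of the title, one common approach, assumed here rather than taken from the preview, is the ROCR package applied to the in-sample fitted probabilities:

library(ROCR)
# ROC curve from the in-sample fitted probabilities and the observed classes
pred <- prediction(prob.glm1.insample, credit.train$Y)
perf <- performance(pred, "tpr", "fpr")
plot(perf, colorize = TRUE)
# Area under the ROC curve
unlist(slot(performance(pred, measure = "auc"), "y.values"))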