gologit2: Generalized Logistic Regression / Partial Proportional Odds Models for Ordinal Dependent Variables
Richard Williams, Department of Sociology, University of Notre Dame
North American Stata Users Group (NASUG) meeting, July 2005
Key features of gologit2
- Backwards compatible with Vincent Fu's original gologit program, but offers many more features
- Can estimate models that are less restrictive than ologit (whose assumptions are often violated)
- Can estimate models that are more parsimonious than non-ordinal alternatives, such as mlogit
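A minimal setup sketch: gologit2 can be installed from the SSC archive, and the Brant test used later comes from Long & Freese's SPost ado files (the exact package names below are assumptions, not taken from the slides):

* Install gologit2 from SSC (package name assumed)
ssc install gologit2, replace

* Locate Long & Freese's SPost ado files, which provide the brant command
findit spost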
Specifically, gologit2 can estimate:
- Proportional odds models (same as ologit: all variables meet the proportional odds / parallel lines assumption)
- Generalized ordered logit models (same as the original gologit: no variables need to meet the parallel lines assumption)
- Partial proportional odds models (some but not all variables meet the parallel lines assumption)
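As a rough syntax sketch of these three cases (option names as I recall them from the gologit2 help file; the variable list is borrowed from the example that follows):

* Proportional odds: constrain every variable to meet the parallel lines assumption
gologit2 warm yr89 male white age ed prst, pl

* Generalized ordered logit: no constraints (same as the original gologit)
gologit2 warm yr89 male white age ed prst

* Partial proportional odds: constrain only the variables named in pl()
gologit2 warm yr89 male white age ed prst, pl(white ed prst)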
Example 1: Proportional Odds Assumption Violated
(Adapted from Long & Freese, 2003; data from the 1977 & 1989 General Social Survey)
Respondents are asked to evaluate the following statement: "A working mother can establish just as warm and secure a relationship with her child as a mother who does not work."
1 = Strongly Disagree (SD)
2 = Disagree (D)
3 = Agree (A)
4 = Strongly Agree (SA)
Explanatory variables are:
- yr89 (survey year; 0 = 1977, 1 = 1989)
- male (0 = female, 1 = male)
- white (0 = nonwhite, 1 = white)
- age (measured in years)
- ed (years of education)
- prst (occupational prestige scale)
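A minimal sketch of loading and inspecting the data, assuming the example uses Long & Freese's ordwarm2 dataset as distributed by Stata Press (the dataset name and URL are assumptions):

* Load the working-mother attitude data (dataset name/URL assumed)
use http://www.stata-press.com/data/lf2/ordwarm2, clear

* Check the coding of the outcome and the explanatory variables
tabulate warm
summarize yr89 male white age ed prst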
Ologit results

. ologit warm yr89 male white age ed prst

Ordered logit estimates                         Number of obs   =       2293
                                                LR chi2(6)      =     301.72
                                                Prob > chi2     =     0.0000
Log likelihood = -2844.9123                     Pseudo R2       =     0.0504

------------------------------------------------------------------------------
        warm |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
        yr89 |   .5239025   .0798988     6.56   0.000     .3673037    .6805013
        male |  -.7332997   .0784827    -9.34   0.000    -.8871229   -.5794766
       white |  -.3911595   .1183808    -3.30   0.001    -.6231815   -.1591374
         age |  -.0216655   .0024683    -8.78   0.000    -.0265032   -.0168278
          ed |   .0671728    .015975     4.20   0.000     .0358624    .0984831
        prst |   .0060727   .0032929     1.84   0.065    -.0003813    .0125267
-------------+----------------------------------------------------------------
       _cut1 |  -2.465362   .2389126          (Ancillary parameters)
       _cut2 |   -.630904   .2333155
       _cut3 |   1.261854   .2340179
------------------------------------------------------------------------------
Interpretation of ologit results
These results are relatively straightforward, intuitive, and easy to interpret. People tended to be more supportive of working mothers in 1989 than in 1977. Males, whites, and older people tended to be less supportive of working mothers, while better-educated people and people with higher occupational prestige were more supportive.
But while the results may be straightforward, intuitive, and easy to interpret, are they correct? Are the assumptions of the ologit model met? The following Brant test suggests they are not.
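To aid this kind of interpretation, the model can also be reported in odds-ratio form (exponentiated coefficients); a brief sketch using ologit's standard or option:

* Refit the model, reporting odds ratios instead of coefficients
ologit warm yr89 male white age ed prst, or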
Brant test shows assumptions violated

. brant

Brant Test of Parallel Regression Assumption

    Variable |      chi2   p>chi2    df
-------------+--------------------------
         All |     49.18    0.000    12
-------------+--------------------------
        yr89 |     13.01    0.001     2
        male |     22.24    0.000     2
       white |      1.27    0.531     2
         age |      7.38    0.025     2
          ed |      4.31    0.116     2
        prst |      4.33    0.115     2
----------------------------------------

A significant test statistic provides evidence that the parallel regression assumption has been violated.
How are the assumptions violated?

. brant, detail

Estimated coefficients from j-1 binary regressions

                 y>1           y>2           y>3
yr89        .9647422     .56540626     .31907316
male      -.30536425    -.69054232    -1.0837888
white     -.55265759    -.31427081    -.39299842
age        -.0164704    -.02533448    -.01859051
ed         .10479624     .05285265     .05755466
prst      -.00141118     .00953216     .00553043
_cons      1.8584045     .73032873    -1.0245168

This is a series of binary logistic regressions: first category 1 versus categories 2, 3, and 4; then categories 1 & 2 versus 3 & 4; then categories 1, 2, and 3 versus 4.
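Since only some variables appear to violate the assumption, a partial proportional odds model is the natural next step. A minimal sketch using gologit2's autofit option, which (per the gologit2 documentation) relaxes the parallel lines constraint only for variables where Wald tests reject it:

* Partial proportional odds model; autofit keeps the constraint only where it holds
gologit2 warm yr89 male white age ed prst, autofit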