Tuesday, October 23, 2012

Ordinal Regression Probability Ratio

One of the statistical techniques I've proposed to use in my dissertation proposal is ordinal regression (also known as the ordered logit, proportional odds, or cumulative odds model).  This regression model is a direct extension of the binary logistic model: instead of modeling the probability of a binary event (e.g. alive/dead), you model the probability of an inequality over three or more naturally ordered outcomes (e.g. mild/moderate/severe).  Consider the odds in the binary logit model:
$\frac{Pr(y = 1 \vert {\boldsymbol{X=x}})}{Pr(y = 0 \vert {\boldsymbol{X=x}})}$
Now consider the ordinal logit model:
$ \frac{P(Y \geq y_i \vert {\boldsymbol{X=x}})}{1 - P(Y \geq y_i \vert {\boldsymbol{X=x}})} = \frac{P(Y \geq y_i \vert {\boldsymbol{X=x}})}{P(Y < y_i \vert {\boldsymbol{X=x}})} $  
(If interested in some background concerning the logit model, you can find a mapping of the inverse logit to the logistic model here, the logit model likelihood function here, and the logit model maximum likelihood estimates here.)
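To make the binary case concrete, here is a quick numerical sketch (the coefficient values are hypothetical, chosen only for illustration) showing that the odds $Pr(y=1)/Pr(y=0)$ recover the exponentiated linear predictor:

```python
import math

def inv_logit(z):
    """Inverse logit (logistic) function: maps a linear predictor to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients and covariate value, for illustration only.
alpha, beta, x = -0.5, 1.2, 2.0
z = alpha + beta * x

p1 = inv_logit(z)          # Pr(y = 1 | x)
odds = p1 / (1.0 - p1)     # Pr(y = 1 | x) / Pr(y = 0 | x)

# The odds equal e^(alpha + x*beta), up to floating-point error.
print(abs(odds - math.exp(z)) < 1e-9)  # True
```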

By definition (and derivation) we have this,
$ Pr(Y \geq y_i \vert {\boldsymbol{X=x}}) = \frac{1}{1 + e^{-(\alpha_i + {\boldsymbol{x'_i \beta}})}} $
and this,
$ Pr(Y < y_i \vert {\boldsymbol{X=x}}) = \frac{1}{1 + e^{\alpha_i + {\boldsymbol{x'_i \beta}}}} $
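These two expressions are complementary probabilities, which a few lines of Python can confirm numerically (the cutpoint and linear-predictor values below are made up for illustration):

```python
import math

# Hypothetical cutpoint alpha_i and linear predictor x'_i * beta.
alpha_i = 0.8
xb = -1.3
z = alpha_i + xb

p_ge = 1.0 / (1.0 + math.exp(-z))  # Pr(Y >= y_i | x)
p_lt = 1.0 / (1.0 + math.exp(z))   # Pr(Y <  y_i | x)

# The two probabilities sum to 1, up to floating-point error.
print(abs(p_ge + p_lt - 1.0) < 1e-9)  # True
```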

Since the ordinal regression model is the ratio of these two probabilities, substituting them in and simplifying yields:
$ \frac{P(Y \geq y_i \vert {\boldsymbol{X=x}})}{P(Y < y_i \vert {\boldsymbol{X=x}})}  = \frac{1 + e^{\alpha_i + {\boldsymbol{x'_i \beta}}}}{1 + e^{-(\alpha_i + {\boldsymbol{x'_i \beta}})}} $
which then reduces to $ e^{\alpha_i + {\boldsymbol{x'_i \beta}}} $.  

When I was reviewing this a couple of weeks ago, though, I wasn't able to remember how the ratio of the two exponential expressions reduced to the single exponential expression.  Embarrassingly, I mentioned it to my wife (a wicked smart chick with an applied math background), and she thought about it for all of ten minutes, then scribbled the solution onto a piece of newspaper.  Although not obvious to me then, the solution seems so obvious to me now:

$ \frac{1 + e^{\alpha_i + {\boldsymbol{x'_i \beta}}}}{1 + e^{-(\alpha_i + {\boldsymbol{x'_i \beta}})}} \frac{e^{\alpha_i + {\boldsymbol{x'_i \beta}}}}{e^{\alpha_i + {\boldsymbol{x'_i \beta}}}} = \frac{e^{\alpha_i + {\boldsymbol{x'_i \beta}}}(1 + e^{\alpha_i + {\boldsymbol{x'_i \beta}}})}{e^{\alpha_i + {\boldsymbol{x'_i \beta}}}(1 + e^{-(\alpha_i + {\boldsymbol{x'_i \beta}})})} = \frac{e^{\alpha_i + {\boldsymbol{x'_i \beta}}}(1 + e^{\alpha_i + {\boldsymbol{x'_i \beta}}})}{(1 + e^{\alpha_i + {\boldsymbol{x'_i \beta}}})} = e^{\alpha_i + {\boldsymbol{x'_i \beta}}} $
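As a sanity check on the algebra, the unsimplified ratio and the single exponential can be compared numerically (the value of $\alpha_i + \boldsymbol{x'_i \beta}$ below is hypothetical):

```python
import math

# Hypothetical value of the linear predictor alpha_i + x'_i * beta.
z = 0.8 + (-1.3)

# Odds in the form before simplification: (1 + e^z) / (1 + e^-z).
ratio = (1.0 + math.exp(z)) / (1.0 + math.exp(-z))

# The ratio matches e^z, up to floating-point error.
print(abs(ratio - math.exp(z)) < 1e-9)  # True
```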
