In a previous post, I mapped out the relationship between the inverse logit and logistic function. In this post, I'll present the likelihood function followed by the log likelihood.
Since the likelihood is expressed in terms of frequencies (following the text I'm referencing, "Biostatistical Methods: The Assessment of Relative Risks" by John Lachin), consider the following 2x2 table, where Response is a binary dependent variable, Group is a binary independent variable (e.g., exposure), and the cell values denote the frequency in each cell. The marginal totals are $m_1$, $m_2$, $n_1$, $n_2$, and the grand total is $N$.
|              | Group 1       | Group 2       | Total |
|--------------|---------------|---------------|-------|
| Response (+) | a $(\pi_1)$   | b $(\pi_2)$   | $m_1$ |
| Response (−) | c $(1-\pi_1)$ | d $(1-\pi_2)$ | $m_2$ |
| Total        | $n_1$         | $n_2$         | $N$   |

The generic likelihood function, $L(\theta)$, is "the total probability of the sample under the assumed model" (p. 465), denoted thus:
$L(y_1, \cdots , y_N; \theta) = \prod_{i=1}^N f(y_i; \theta)$
If we express the generic likelihood function in terms of the frequencies from the table above (a, b, c, d) then the likelihood function becomes
$L(\pi_1, \pi_2) = \pi_1^a (1-\pi_1)^c \pi_2^b (1-\pi_2)^d$
since each cell probability, $\pi_i$ or $1 - \pi_i$, is raised to the power of the number of subjects in the corresponding cell (a, b, c, d).
The log likelihood is just the log of the above:
$\ell(\pi_1, \pi_2) = a\log(\pi_1) + c\log(1-\pi_1) + b\log(\pi_2) + d\log(1-\pi_2)$
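As a quick sanity check (the frequencies a, b, c, d below are made-up values for illustration, not from the text), the log likelihood can be computed directly and compared against the log of the likelihood:

```python
import math

# Hypothetical 2x2 cell frequencies (made up for illustration)
a, b, c, d = 20, 10, 30, 40

def likelihood(pi1, pi2):
    # L(pi1, pi2) = pi1^a (1 - pi1)^c pi2^b (1 - pi2)^d
    return pi1**a * (1 - pi1)**c * pi2**b * (1 - pi2)**d

def log_likelihood(pi1, pi2):
    # l(pi1, pi2) = a log(pi1) + c log(1 - pi1) + b log(pi2) + d log(1 - pi2)
    return (a * math.log(pi1) + c * math.log(1 - pi1)
            + b * math.log(pi2) + d * math.log(1 - pi2))

# The two agree: the log likelihood is the log of the likelihood
pi1, pi2 = 0.4, 0.2
assert math.isclose(log_likelihood(pi1, pi2), math.log(likelihood(pi1, pi2)))
```

In practice the log form is preferred because the raw likelihood underflows floating point quickly as the counts grow.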
Since we'll eventually want to derive the maximum likelihood estimates for $\alpha$ and $\beta$, the log likelihood should be expressed in terms of $\alpha$ and $\beta$ (the substitutions for $\pi_i$ follow from the inverse logit to logistic post):
$\ell(\theta) = a\log \Bigl[\frac{e^{\alpha + \beta}}{1 + e^{\alpha + \beta}}\Bigr] + c\log \Bigl[\frac{1}{1 + e^{\alpha + \beta}}\Bigr] + b\log \Bigl[\frac{e^{\alpha}}{1 + e^{\alpha}}\Bigr] + d\log \Bigl[\frac{1}{1 + e^{\alpha}}\Bigr]$
Expanding the above (per logarithmic properties), we get
$\ell(\theta) = a\log e^{\alpha + \beta} - a\log(1 + e^{\alpha + \beta}) - c\log(1 + e^{\alpha + \beta}) + b\log e^{\alpha} - b\log(1 + e^{\alpha}) - d\log(1 + e^{\alpha})$
Simplifying and combining terms we get
$\ell(\theta) = a(\alpha + \beta) + b\alpha - (a + c)\log(1 + e^{\alpha + \beta}) - (b + d)\log(1 + e^{\alpha})$
$\ell(\theta) = (a + b)\alpha + a\beta - n_1\log(1 + e^{\alpha + \beta}) - n_2\log(1 + e^{\alpha})$
After one more substitution, $a + b = m_1$, the log likelihood is expressed entirely in terms of the frequencies and marginal totals from the 2x2 table:
$\ell(\theta) = m_1\alpha + a\beta - n_1\log(1 + e^{\alpha + \beta}) - n_2\log(1 + e^{\alpha})$
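To sanity-check the algebra, here is a short numerical sketch (again with made-up frequencies) confirming that the simplified $(\alpha, \beta)$ form agrees with the original $\pi$ form when $\pi_1$ and $\pi_2$ are obtained from the inverse logit:

```python
import math

# Hypothetical 2x2 cell frequencies (made up for illustration)
a, b, c, d = 20, 10, 30, 40
m1, n1, n2 = a + b, a + c, b + d

alpha, beta = -0.5, 0.8
pi1 = math.exp(alpha + beta) / (1 + math.exp(alpha + beta))  # group 1
pi2 = math.exp(alpha) / (1 + math.exp(alpha))                # group 2

# Log likelihood in terms of pi1, pi2
ll_pi = (a * math.log(pi1) + c * math.log(1 - pi1)
         + b * math.log(pi2) + d * math.log(1 - pi2))

# Simplified log likelihood in terms of alpha, beta, and the table margins
ll_ab = (m1 * alpha + a * beta
         - n1 * math.log(1 + math.exp(alpha + beta))
         - n2 * math.log(1 + math.exp(alpha)))

assert math.isclose(ll_pi, ll_ab)
```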
With the log likelihood in this form, the score functions for $\alpha$ and $\beta$ can then be derived and the maximum likelihood estimates obtained (planned for a future blog post).