Hi,

I am puzzled by the relationship between the p-values associated with
the coefficients of a univariate logistic regression involving
categorical variables and the p-value I get from Fisher's exact test of
the associated 2 x 2 contingency table.

(1) The 2-sided p-value for the table is ~0.0015, whereas the p-value
for the independent variable is 0.101 and the p-value for the intercept
is ~0.56.

(2) I know that the coefficient for the independent variable is the
ln(OR) (assuming I recall correctly). Most accounts I've seen ignore the
intercept, say it's hard to interpret, or give an interpretation (with
respect to the dataset as a single group) that I can't square with the
interpretation of beta[1].

Thanks for help or redirection.

--
Roy Wilson
Learning Research Development Center
University of Pittsburgh
webpage: www.pitt.edu/~rwilson
email: rwilson at pitt.edu
Peter Dalgaard
2005-Feb-03 17:23 UTC
[R] Logistic regression coef. Was: If this is should be posted elsewhere, please advise
Well, in principle there are general stats lists and newsgroups, and
this is not specifically an R question. However, people around here have
been known to contribute information about statistical substance from
time to time. You're still not excused for not using a meaningful
subject line, though...

roy wilson <rwilson+ at pitt.edu> writes:

> Hi,
>
> I am puzzled by the relationship between the p-values associated with
> the coefficients of a univariate logistic regression involving
> categorical variables and the p-value I get from Fisher's exact test
> of the associated 2 x 2 contingency table.
>
> (1) The 2-sided p-value for the table is ~0.0015, whereas the p-value
> for the independent variable is 0.101 and the p-value for the
> intercept is ~0.56.

The immediate conjecture is that you have become yet another victim of
the Hauck-Donner effect (an exact reference was given a week or so ago,
so use the list archives). The Wald tests given by dividing coefficients
by their formal s.e. can be badly misleading if you are far from
satisfying the requirements for asymptotic theory. This is especially
bad in cases where the OR is 0 or infinite, so that the glm() fit is
divergent. You might try drop1(yourfit, test="Chisq") and see if the
likelihood ratio test for the independent variable is not at least
somewhat closer to the Fisher test.

> (2) I know that the coefficient for the independent variable is the
> ln(OR) (assuming I recall correctly). Most accounts I've seen ignore
> the intercept, say it's hard to interpret, or give an interpretation
> (with respect to the dataset as a single group) that I can't square
> with the interpretation of beta[1].

In R it is (normally) the log-odds of the baseline group. That is,
provided you are using treatment contrasts.

> Thanks for help or redirection.
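[A minimal sketch of the comparison being discussed, with made-up counts
rather than the poster's actual data: fit the same 2 x 2 table three
ways and contrast the Wald p-value from summary(), the likelihood-ratio
p-value from drop1(), and Fisher's exact test. The object names (tab, d,
fit) are illustrative.]

```r
## Hypothetical 2 x 2 table: rows = x (a/b), columns = y (fail/success)
tab <- matrix(c(12, 5, 2, 9), nrow = 2,
              dimnames = list(x = c("a", "b"),
                              y = c("fail", "success")))

fisher.test(tab)          # exact test on the table

## Expand the table to one row per observation for glm()
d <- data.frame(x = rep(rep(c("a", "b"), 2), c(12, 5, 2, 9)),
                y = rep(c(0, 1), c(17, 11)))

fit <- glm(y ~ x, family = binomial, data = d)

summary(fit)              # Wald tests: coef / s.e. (can mislead near separation)
drop1(fit, test = "Chisq")  # likelihood-ratio test for dropping x

## With treatment contrasts: exp(intercept) = odds in the baseline
## group "a"; exp(coef for x) = odds ratio of "b" vs "a"
exp(coef(fit))
```

With a zero or near-zero cell the glm() fit diverges, the Wald p-value
becomes useless, and the gap between it and the drop1() / fisher.test()
p-values grows, which is the Hauck-Donner pattern described above.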
--
   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907