Dear R-experts,

Here below is my R code, which works but with quite a few warnings. x11 and x12 are dichotomous variables (0=no and 1=yes). I subtract 1 to ignore the intercept. I would like not to ignore the intercept. How should I modify my R code, because if I just remove -1 it does not work?

y = c(32,45,65,34,23,43,65,76,87,98,7,867,56,45,65,76,88,34,55,66)
x11 = c(0,1,1,0,0,1,1,1,0,0,1,0,0,1,0,0,1,1,0,1)
x12 = c(0,1,0,1,0,1,1,0,1,1,0,0,1,1,1,0,0,1,0,0)

Dataset = data.frame(y, x11, x12)

a = lm(y ~ x11 + x12 - 1, Dataset)$coef
b = NULL
for (i in c(1:2)) {
  f = formula(paste('y~', names(Dataset)[i], -1))
  b = c(b, lm(f, Dataset)$coef)
}
coef = data.frame(rbind(a, b))
coef$Model = c('Multi', 'Single')
library(reshape2)
coef.long <- melt(coef, id.vars = "Model")

library(ggplot2)
ggplot(coef.long, aes(x = variable, y = value, fill = Model)) +
  geom_bar(stat = "identity", position = "dodge") +
  scale_fill_discrete(name = "Model",
                      labels = c("Multiple", "Simple")) +
  labs(title = 'The difference in coefficients between multiple and simple regression',
       x = "Models", y = "Coefficient") +
  coord_flip()
Sigh... In a linear model with qualitative predictor variables, models with and without intercepts are just different parameterizations of the *same* model -- they produce exactly the same predicted responses. So what do you mean?

Search on "contrasts in linear models R" and similar for an explanation.

Cheers,
Bert

On Tue, Feb 21, 2023, 13:34 varin sacha via R-help <r-help at r-project.org> wrote:

> Dear R-experts,
>
> Here below is my R code, which works but with quite a few warnings.
> x11 and x12 are dichotomous variables (0=no and 1=yes). I subtract 1 to
> ignore the intercept. I would like not to ignore the intercept. How should
> I modify my R code, because if I just remove -1 it does not work?
>
> [code snipped; quoted in full above]
>
> ______________________________________________
> R-help at r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
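Bert's point can be checked directly on a toy example (hypothetical data, not from the thread): when the predictor is entered as a factor, the with-intercept and without-intercept fits are reparameterizations of the same model, so their fitted values coincide even though the coefficients are labelled differently.

```r
# Toy data (made up for illustration)
set.seed(1)
y <- rnorm(20)
g <- factor(rep(c("no", "yes"), 10))

m1 <- lm(y ~ g)      # intercept + treatment contrast for "yes"
m2 <- lm(y ~ g - 1)  # no intercept: one cell mean per level

# Different coefficient labels, identical predictions:
coef(m1)  # (Intercept), gyes
coef(m2)  # gno, gyes
all.equal(unname(fitted(m1)), unname(fitted(m2)))  # TRUE
```

Note this equivalence is specific to factor coding; with a numeric 0/1 column, `y ~ x - 1` genuinely drops a parameter, which is what Peter's reply below the thread turns on.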
Not sure what you are trying to do here. The immediate issue is that you are getting 'y' on the RHS, because that is the 1st column in Dataset. So "for (i in 2:3)" might be closer to the intention.

However, a 0/1 regression with no intercept implies that the mean for the "0" group is zero, and with two regressors that the mean is zero for the (0,0) group. Looking at the data, this is quite clearly not the case.

I suppose you may have intended to fit the models _with_ the intercept and then _ignore_ the intercept for plotting purposes, i.e. lm(y~x11+x12, Dataset)$coef[-1], etc.?

(Also, I suspect that you don't actually have y=7 and y=867 in the dataset.)

-pd

> On 21 Feb 2023, at 22:33 , varin sacha via R-help <r-help at r-project.org> wrote:
>
> [question and code snipped; quoted in full above]
-- 
Peter Dalgaard, Professor,
Center for Statistics, Copenhagen Business School
Solbjerg Plads 3, 2000 Frederiksberg, Denmark
Phone: (+45)38153501
Office: A 4.23
Email: pd.mes at cbs.dk  Priv: PDalgd at gmail.com
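Putting Peter's suggestions together, here is one sketch of a corrected version of the posted script (my reading of the thread, not code from it): fit all models *with* an intercept, loop over the predictor columns (2:3, not 1:2, so y is not used as a regressor), and drop the intercept coefficient before plotting.

```r
y   <- c(32,45,65,34,23,43,65,76,87,98,7,867,56,45,65,76,88,34,55,66)
x11 <- c(0,1,1,0,0,1,1,1,0,0,1,0,0,1,0,0,1,1,0,1)
x12 <- c(0,1,0,1,0,1,1,0,1,1,0,0,1,1,1,0,0,1,0,0)
Dataset <- data.frame(y, x11, x12)

# Multiple regression with intercept; keep only the slope coefficients
a <- lm(y ~ x11 + x12, Dataset)$coef[-1]

# Simple regressions: loop over predictor columns 2:3, not 1:2
b <- NULL
for (nm in names(Dataset)[2:3]) {
  f <- reformulate(nm, response = "y")   # builds y ~ x11, then y ~ x12
  b <- c(b, lm(f, Dataset)$coef[-1])     # drop each intercept for plotting
}

coef <- data.frame(rbind(a, b))
coef$Model <- c("Multi", "Single")
coef  # two rows (Multi/Single), columns x11 and x12
```

The rest of the original script (melt + ggplot) should then work unchanged on this `coef` data frame, since it has the same shape as before but without the degenerate no-intercept fits.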