Dear List,
I would like to take another chance and see if anyone has anything to
say about my last post...
bump
servet
On 11/10/2010 01:11 PM, servet cizmeli wrote:
> Hello,
>
> I have a basic question. Sorry if it is very obvious...
>
> I have the following data file :
> http://ekumen.homelinux.net/mydata.txt
>
> I need to model Y ~ X - 1 (simple linear regression through the origin)
> with these data:
>
> ## load() restores the saved object(s) from the file; it is assumed here
> ## to contain a two-column object named k
> load(file="mydata.txt")
> X <- k[,1]
> Y <- k[,2]
>
> ## regression through the origin
> aa <- lm(Y ~ X - 1)
> dev.new()
> plot(X, Y, log="xy")
> abline(aa, untf=TRUE)                          # my fitted slope (black)
> abline(a=0, b=0.0235, col="red", untf=TRUE)    # literature slope 1
> abline(a=0, b=0.031, col="green", untf=TRUE)   # literature slope 2
>
> Other people have done the same kind of analysis with their data and found
> regression coefficients of 0.0235 (red line) and 0.031 (green line).
>
> Regression with my own data, though, yields a slope of 0.0458 (black
> line), which is too high. Clearly my regression is strongly influenced by
> the single point with high values (X > 100). I would not like to discard
> this point, though, because I know the measurement is correct. I would
> just like to give it less weight...
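>
> For example, a weighted fit of the sort I have in mind might look like the
> sketch below; the 1/X weights (down-weighting the large-X point) are only
> an illustration, not values anyone has recommended:
>
> ## hypothetical weighted fit: weight each point by 1/X so the single
> ## large-X observation pulls the through-origin slope less
> w <- 1/X
> aa_w <- lm(Y ~ X - 1, weights=w)
> coef(aa_w)
>
> A robust alternative such as MASS::rlm(Y ~ X - 1) would also down-weight
> that point automatically.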
>
> When I log-transform the X and Y data, I obtain:
>
> dev.new()
> plot(log10(X), log10(Y))
> abline(v=0, h=0, col="cyan")
> bb <- lm(log10(Y) ~ log10(X))   # ordinary regression in log-log space
> abline(bb, col="blue")
> bb
>
> I am happy with this regression. But now the slope is in the log-log
> domain. I have to convert it back so that I can obtain a number comparable
> with the literature values (0.0235 and 0.031). How can I do that? I can't
> force this second regression through the origin, as the log-transformed
> data no longer passes through the origin.
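>
> To spell out what the back-transformation involves, the log-log fit
> corresponds to a power law rather than to a straight line through the
> origin:
>
> ## lm(log10(Y) ~ log10(X)) fits  log10(Y) = a + b*log10(X),
> ## which back-transforms to      Y = 10^a * X^b
> a <- coef(bb)[1]   # intercept, in log10 units
> b <- coef(bb)[2]   # slope, i.e. the power-law exponent
> 10^a               # comparable to a Y/X slope only when b is close to 1
>
> so only when b is close to 1 does 10^a act as a single proportionality
> constant comparable to 0.0235 or 0.031.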
>
> At first it seemed like an easy problem, but I am at a loss :o((
> Thanks a lot for your kind help,
> servet
>