similar to: R-squared in lm

Displaying 20 results from an estimated 7000 matches similar to: "R-squared in lm"

2003 Jan 22
2
small bug in binom.test?
Hi all, I am wondering whether there is a small bug in the binom.test function of the ctest library (I'm using R 1.6.0 on Windows 2000, but Splus 2000 seems to have the same behaviour). Or perhaps I've misunderstood something. The commands binom.test(11,100,p=0.1) and binom.test(9,100,p=0.1) give different p-values (see below). As 9 and 11 are equidistant from 10, the mean of the
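
The two-sided exact test does not measure distance from the null mean; as far as I know, binom.test sums the probabilities of all outcomes that are no more likely than the observed one, which is an asymmetric criterion, so equidistant counts need not give equal p-values. A minimal sketch of that calculation (the 1 + 1e-7 factor is a small tolerance that I believe mirrors R's own code):

    n <- 100; p0 <- 0.1
    probs <- dbinom(0:n, n, p0)
    # p-value for x = 11: total probability of outcomes at most as likely as 11
    sum(probs[probs <= dbinom(11, n, p0) * (1 + 1e-7)])
    # p-value for x = 9: a different set of outcomes qualifies, hence a different p-value
    sum(probs[probs <= dbinom(9, n, p0) * (1 + 1e-7)])
    # compare with binom.test(11, n, p = p0)$p.value and binom.test(9, n, p = p0)$p.value
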
2006 Nov 05
1
diag()<- in Matrix?
Dear all, I am trying to use the Matrix package to do some calculations on rather large and sparse matrices. An example of such a matrix is given below. mig<-0.2 side<-10 np<-side^2 mig.mat<-matrix(0,np,np) diag(mig.mat[1:(np-side),(side+1):np])<-mig/4 diag(mig.mat[(side+1):np,1:(np-side)])<-mig/4 diag(mig.mat[-np,-1])<-mig/4
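
For a matrix this sparse it may be easier to build the Matrix object directly from (row, column, value) triplets than to assign into the diagonals of sub-blocks. A hedged sketch with the parameters above; the fourth off-diagonal is my assumption, since the snippet is cut off:

    library(Matrix)
    mig <- 0.2; side <- 10; np <- side^2
    # row and column indices of the entries that the code above sets to mig/4
    i <- c(1:(np - side), (side + 1):np, 1:(np - 1), 2:np)
    j <- c((side + 1):np, 1:(np - side), 2:np,       1:(np - 1))
    mig.mat <- sparseMatrix(i = i, j = j, x = mig/4, dims = c(np, np))
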
2002 May 17
1
split-plot design?
Dear R-gurus, We are planning an experiment to test whether plants produced by selfing are less fit than those produced by outcrossing. We have plants from three different alpine valleys, picked at random among all the possible valleys. In each valley, we have a number of individuals, also picked at random. Seeds from these individuals were brought back to the greenhouse and sown. When they
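
For a design like this (individuals nested within valleys, both sampled at random, with selfed vs. outcrossed as the treatment of interest), a mixed model with nested random effects is one common choice. A minimal sketch; the data frame and column names (plants, fitness, cross, valley, individual) are invented for illustration:

    library(lme4)   # nlme::lme would be an alternative
    # cross: selfed vs. outcrossed; valley and individual are random grouping factors
    fit <- lmer(fitness ~ cross + (1 | valley/individual), data = plants)
    summary(fit)
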
2009 Jul 15
2
Differing Variable Length Inconsistencies in Random Effects/Regression Models
Dear All, I am quite new to R and am having a problem trying to run a linear model with random effects (a regression); in particular, my variable lengths are different and the model refuses to compute any further. The code I have been using is as follows:
vc <- read.table("P:\\R\\Testvcomp10.txt", header=T)
attach(vc)
family <- factor(family)
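
The "variable lengths differ" error usually means the variables handed to the model do not all have the same number of rows, which is easy to trigger after attach() if an object is redefined or subset outside the data frame. A hedged sketch that keeps everything inside the data frame instead; the response column name is invented:

    vc <- read.table("P:\\R\\Testvcomp10.txt", header = TRUE)
    vc$family <- factor(vc$family)
    # pass the data frame via 'data=' so all variables are guaranteed the same length
    fit <- lm(response ~ family, data = vc)
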
2011 Apr 11
1
Meta-analysis of a correlation matrix
Sorry for the cross-posting, but I would like to know if anyone is aware of a package in R for this. ---------- Forwarded message ---------- From: John Antonakis Sent: Sunday, April 10, 2011 3:26 PM To: RMNET Subject: Meta-analysis of a correlation matrix (correct thread title) Hi: Does anyone know of a good program that can do a meta-analytic multiple regression (with multiple correlated
2008 Apr 29
2
Legend problem when exporting a plot to PDF
Hi list, When exporting a graph with a legend to PDF, the text in the final PDF goes beyond the legend box. > dev2bitmap("test.pdf", type="pdfwrite", h=6, w=6) The legend looks OK on the screen. I noticed that the size of the legend box depends on the size of the screen window, which is not the case for the other graphical parts (text of the legend, title, axis
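
Since legend() sizes its box from string widths on the device that is currently open, one workaround is to draw directly into a pdf() device of the final size rather than copying the screen device. A minimal sketch (plot contents invented):

    pdf("test.pdf", width = 6, height = 6)   # open the PDF device at the final size
    plot(1:10, (1:10)^2, type = "b")
    legend("topleft", legend = c("observed", "fitted"), lty = 1:2, pch = c(1, NA))
    dev.off()
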
2010 Mar 30
2
simple loop iteration
Hi R mailing list, probably a very basic problem here, I try to do the following:
> Q <- c(1,2,3)
> P <- c(4,5,6)
> A <- data.frame(Q,P)
> A
  Q P
1 1 4
2 2 5
3 3 6
This is my simplified data frame (matrix). Now I try to create the following loop to subtract elements within the data frame:
> for(i in length(A[,"P"]-1){ delta[i]<-
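
Assuming the goal is the difference between consecutive values of P: the loop needs for(i in 1:(length(A[,"P"]) - 1)) rather than the -1 inside the subscript, and delta must exist before it is indexed. A minimal sketch, plus the diff() one-liner that does the same job:

    A <- data.frame(Q = c(1, 2, 3), P = c(4, 5, 6))
    delta <- numeric(nrow(A) - 1)
    for (i in 1:(nrow(A) - 1)) {
      delta[i] <- A[i + 1, "P"] - A[i, "P"]
    }
    delta
    diff(A$P)   # equivalent result without the loop
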
2012 May 04
0
ur.df function
Dear R users, I am applying the augmented-Dickey-Fuller Unit Root Test (ur.df function of the urca package) to a time series of approximately 50 values. To be sure I understood what was going on with the ur.df function, I checked the critical values of the 3 test statistics (tau, phi2 and phi3 if a trend is included) or the 2 test statistics (tau and phi1 if only a drift is included) with the
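
A minimal sketch of the call, assuming the urca package is installed; summary() prints the observed statistics together with the 1/5/10 percent critical values, and I believe they are also stored in the S4 slots shown:

    library(urca)
    set.seed(1)
    y <- cumsum(rnorm(50))                     # stand-in series of about 50 values
    adf <- ur.df(y, type = "trend", lags = 1)  # tau, phi2 and phi3 when a trend is included
    summary(adf)
    adf@teststat                               # observed test statistics
    adf@cval                                   # table of critical values
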
2010 May 29
0
plotting density in same plot in loop iteration
Hi R-mailing list, I have the following set-up below with a simplified data frame. Through a loop which applies certain criteria to the densities, I would like to plot the different density distributions in the same plot. Of course I hope I don't make any mistakes with all the indexes of the data frame. All I would like to have is the different densities in the same plot with a general
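
One common pattern is to draw the first density with plot() and add the remaining ones with lines(), fixing xlim (and, if necessary, ylim) beforehand so nothing is clipped. A minimal sketch with an invented data frame and grouping criterion:

    set.seed(1)
    df <- data.frame(value = rnorm(300), group = rep(1:3, each = 100))
    groups <- unique(df$group)
    plot(density(df$value[df$group == groups[1]]),
         xlim = range(df$value), main = "Densities by group")
    for (g in groups[-1]) {
      lines(density(df$value[df$group == g]), col = g)
    }
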
2010 Sep 29
1
Fitting a half-ellipse curve
Dear mailing list, I have the following array:
           X2   Y2
 [1,]  422.7900  6.0
 [2,]  469.8007 10.5
 [3,]  483.9428 11.0
 [4,]  532.4917 25.5
 [5,]  596.1942 33.5
 [6,]  630.8496 40.5
 [7,]  733.2996 45.0
 [8,]  946.4779 32.0
 [9,]  996.8068 35.5
[10,] 1074.3310 23.0
Afterwards I do the following:
plot.new()
plot.window(xlim=c(min(X1)-50,max(X1)+50),
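
If the aim is to fit the upper half of an ellipse, one hedged option is nls() with the half-ellipse written explicitly; the parameterisation (centre x0, half-axes a and b) and the starting values are my own guesses and may need tuning:

    X2 <- c(422.7900, 469.8007, 483.9428, 532.4917, 596.1942,
            630.8496, 733.2996, 946.4779, 996.8068, 1074.3310)
    Y2 <- c(6.0, 10.5, 11.0, 25.5, 33.5, 40.5, 45.0, 32.0, 35.5, 23.0)
    fit <- nls(Y2 ~ b * sqrt(pmax(0, 1 - ((X2 - x0) / a)^2)),
               start = list(x0 = mean(range(X2)),
                            a  = diff(range(X2)) / 2 + 50,
                            b  = max(Y2)))
    xs <- seq(min(X2) - 50, max(X2) + 50, length.out = 200)
    plot(X2, Y2)
    lines(xs, predict(fit, newdata = data.frame(X2 = xs)))
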
2011 Jun 30
1
Error "singular gradient matrix at initial parameter estimates" in nls
Greetings, I am struggling a bit with a non-linear regression. The problem is described below with the known values r and D indicated. I tried to alter the start values but always get the following error message: Error in nlsModel(formula, mf, start, wts): singular gradient matrix at initial parameter estimates Calls: nls -> switch -> nlsModel I might be missing something with regard to the
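
That error usually means the columns of the gradient are numerically collinear at the start values, often because the model is over-parameterised or the start values are far off; plotting the curve at the start values or switching to a self-starting model frequently helps. A generic hedged sketch, not the poster's actual model:

    set.seed(1)
    x <- seq(0, 10, length.out = 50)
    y <- 3 * exp(-0.4 * x) + rnorm(50, sd = 0.05)
    # a self-starting model (here asymptotic regression) picks its own start values
    fit <- nls(y ~ SSasymp(x, Asym, R0, lrc))
    summary(fit)
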
2006 Aug 25
1
R.squared in Weighted Least Square using the Lm Function
Hello all, I am using the function lm to do my weighted least squares regression. model<-lm(Y~X1+X2, weight=w) What I am confused about is the r.squared. It does not seem that the r.squared for the weighted case is the ordinary 1-RSS/TSS. What is it precisely? Is the r.squared measure comparable to the one obtained by ordinary least squares? I also notice that model$res is the unweighted
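
As far as I can tell from the summary.lm code, the weighted R-squared uses weighted sums of squares throughout: RSS = sum(w*r^2) and TSS taken about the weighted mean of the fitted values, so it is not the ordinary unweighted 1 - RSS/TSS and is not directly comparable to an OLS fit. A sketch that reproduces the reported value (data invented):

    set.seed(1)
    X1 <- rnorm(30); X2 <- rnorm(30); w <- runif(30, 1, 3)
    Y  <- 1 + X1 - X2 + rnorm(30)
    model <- lm(Y ~ X1 + X2, weights = w)
    r <- residuals(model); f <- fitted(model)
    m   <- sum(w * f) / sum(w)          # weighted mean of the fitted values
    rss <- sum(w * r^2)
    mss <- sum(w * (f - m)^2)
    c(manual = mss / (mss + rss), reported = summary(model)$r.squared)
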
2004 Jul 22
1
Bug: wrong R-squared in lm formula w/o intercept (PR#7127)
Full_Name: Adriano Azevedo Filho Version: 1.9.1 OS: Windows, Linux Submission from: (NULL) (200.171.246.212) R-squared and Adjusted R-squared appear to be wrong when the formula in lm() is specified without intercept. Problem present in both Windows and Linux 1.9.1 version. Also in the 1.8.1 version for Windows (other versions not checked). Possible example which reproduces the problem:
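
For what it is worth, for a no-intercept model summary.lm documents that the total sum of squares is taken about zero rather than about mean(y), so R-squared = 1 - RSS/sum(y^2); that is why the value looks inflated compared with the intercept model. A small sketch contrasting the two definitions (data invented):

    set.seed(1)
    x <- rnorm(30)
    y <- 5 + x + rnorm(30)
    fit0 <- lm(y ~ x - 1)                               # formula without intercept
    rss  <- sum(residuals(fit0)^2)
    c(reported   = summary(fit0)$r.squared,
      about_zero = 1 - rss / sum(y^2),                  # definition used without an intercept
      about_mean = 1 - rss / sum((y - mean(y))^2))      # the usual centred definition
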
2011 Mar 04
1
linear model - lm (Adjusted R-squared)?
Hi, Sorry for the naive question, but what exactly does the 'Adjusted R-squared' coefficient in the summary of a linear model adjust for? Sample code:
> x <- rnorm(15)
> y <- rnorm(15)
> lmr <- lm(y~x)
> summary(lmr)
Call:
lm(formula = y ~ x)
Residuals:
    Min      1Q  Median      3Q     Max
-1.7828 -0.7379 -0.4485  0.7563  2.1570
Coefficients:
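
Adjusted R-squared penalises for the number of estimated coefficients: with n observations and p slope terms (plus an intercept) it is 1 - (1 - R^2) * (n - 1) / (n - p - 1). A quick check of that formula against summary() (a seed is set here, so the numbers differ from the output above):

    set.seed(1)
    x <- rnorm(15); y <- rnorm(15)
    lmr <- lm(y ~ x)
    s <- summary(lmr)
    n <- length(y); p <- 1
    c(manual   = 1 - (1 - s$r.squared) * (n - 1) / (n - p - 1),
      reported = s$adj.r.squared)
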
2005 Apr 18
1
R-squared in summary(lm...)
What is the difference between the two R-squareds returned for a linear regression by summary(lm...)? When might one report multiple vs. adjusted R-squared? Thank you, Ben Osborne -- Botany Department University of Vermont 109 Carrigan Drive Burlington, VT 05405 benjamin.osborne at uvm.edu phone: 802-656-0297 fax: 802-656-0440
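
Multiple R-squared can only increase when a predictor is added, even a useless one, whereas adjusted R-squared includes a penalty for the extra parameter; that is the usual argument for reporting the adjusted value when comparing models with different numbers of predictors. A small hedged illustration:

    set.seed(1)
    x <- rnorm(50); y <- 2 * x + rnorm(50)
    junk <- rnorm(50)                              # predictor unrelated to y
    s1 <- summary(lm(y ~ x))
    s2 <- summary(lm(y ~ x + junk))
    c(R2  = s1$r.squared,     R2_with_junk  = s2$r.squared)      # never decreases
    c(adj = s1$adj.r.squared, adj_with_junk = s2$adj.r.squared)  # penalised for the extra term
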
2010 Oct 05
2
R squared for lm prediction
Hi all, I have fitted a model and predicted a hold-out sample, and now I want to compute an R-squared value for the prediction. Any help is appreciated. Best regards
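
There is no single agreed definition of R-squared for out-of-sample predictions; two common choices are 1 minus the prediction sum of squares over the total sum of squares of the hold-out responses, and the squared correlation between observed and predicted values. A sketch with invented training and hold-out data:

    set.seed(1)
    train <- data.frame(x = rnorm(80)); train$y <- 2 * train$x + rnorm(80)
    test  <- data.frame(x = rnorm(20)); test$y  <- 2 * test$x  + rnorm(20)
    fit  <- lm(y ~ x, data = train)
    pred <- predict(fit, newdata = test)
    c(R2_holdout = 1 - sum((test$y - pred)^2) / sum((test$y - mean(test$y))^2),
      R2_cor     = cor(test$y, pred)^2)
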
2016 Apr 08
0
R.squared in summary.lm with weights
On 07/04/2016 5:21 PM, Murray Efford wrote: > Following some old advice on this list, I have been reading the code for summary.lm to understand the computation of R-squared from a weighted regression. Usually weights in lm are applied to squared residuals, but I see that the weighted mean of the observations is calculated as if the weights are on the original scale: > > [...] > f
2011 Sep 08
2
Extract r.squared using cbind in lm
Hello, I am using cbind in an lm model. For standard lm models the r.squared can easily be extracted with summary(model)$r.squared, but that does not work in the case with cbind. Here is an example to illustrate the problem: a <- c(1,3,5,2,5,3,1,6,7,2,3,2,6) b <- c(12,15,18,10,18,22,9,7,9,23,12,17,13) c <- c(22,26,32,33,32,28,29,37,34,29,30,32,29) data <- data.frame(a,b,c)
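
With a matrix response, lm returns an "mlm" fit and summary() gives a list with one summary.lm per column of the response, so the r.squared values can be collected with sapply. A sketch using the data above (the data frame is renamed from "data" for clarity):

    a <- c(1,3,5,2,5,3,1,6,7,2,3,2,6)
    b <- c(12,15,18,10,18,22,9,7,9,23,12,17,13)
    c <- c(22,26,32,33,32,28,29,37,34,29,30,32,29)
    dat <- data.frame(a, b, c)
    model <- lm(cbind(b, c) ~ a, data = dat)
    sapply(summary(model), function(s) s$r.squared)   # one R-squared per response
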
2010 Jan 22
4
Extract R-squared from summary of lm
Dear all, I cannot find how to explicitly get the R-squared or adjusted R-squared from summary(lm()). Thanks a lot!
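
Both values are components of the summary object. A minimal sketch with a built-in data set:

    fit <- lm(mpg ~ wt, data = mtcars)
    s <- summary(fit)
    s$r.squared       # multiple R-squared
    s$adj.r.squared   # adjusted R-squared
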
2016 Apr 07
0
R.squared in summary.lm with weights
Do you mean w <- z$residuals ? Type names(z) to see the list of item in your model. I ran your code on a lm and it work fine. You don't need the brackets around mss <- Michael Long On 04/07/2016 02:21 PM, Murray Efford wrote: > Following some old advice on this list, I have been reading the code for summary.lm to understand the computation of R-squared from a weighted