search for: mylm

Displaying 20 results from an estimated 37 matches for "mylm".

2004 Mar 08
2
getting the std errors in the lm function
Hello, I have a simple question for you: running mylm<-lm(y~x); summary(mylm) I get the following results: ****************************************************** Coefficients: Estimate Std. Error t value Pr(>|t|) (Intercept) 16.54087 0.19952 82.91 <2e-16 *** x[1:19] -2.32337 0.04251 -54.66 <2e-16 *** *******...
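
A minimal sketch of pulling those standard errors out of the fit programmatically rather than reading them off the printout; x and y below are simulated stand-ins, not the poster's data.

set.seed(1)
x <- 1:19
y <- 16.5 - 2.3 * x + rnorm(19)

mylm <- lm(y ~ x)

## the coefficient table is stored in the summary; the "Std. Error"
## column holds the standard errors shown in the quoted output
ctab <- coef(summary(mylm))
ctab[, "Std. Error"]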
2010 Sep 14
1
NA confusion (length question)
Hi folks, I am running a very simple regression using mylm <- lm(mass ~ tarsus, na.action=na.exclude) I would like to use the residuals from this analysis for further regressions, but I'm running into a snag when I try cbind(mylm$residuals, mydata) # where mydata is the original data set The error tells me that it cannot use cbind because...
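
A hedged sketch of why the cbind() fails and the usual fix: with na.action = na.exclude, the extractor residuals() pads its result with NA back to the original number of rows, while mylm$residuals contains the complete cases only. The mass/tarsus data below are simulated.

set.seed(2)
mydata <- data.frame(tarsus = rnorm(20), mass = rnorm(20))
mydata$mass[c(3, 7)] <- NA

mylm <- lm(mass ~ tarsus, data = mydata, na.action = na.exclude)

length(mylm$residuals)    ## 18: complete cases only
length(residuals(mylm))   ## 20: padded with NA by na.exclude

cbind(resid = residuals(mylm), mydata)   ## lengths now match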
2003 Oct 11
1
Subclassing lm
I'm trying to subclass the "lm" class to produce a "mylm" class whose instances behave like lm objects (are accepted by methods like summary.lm) but have additional data or slots of my own design. For starters: setClass("mylm", "lm") produces the somewhat cryptic: Warning message: Old-style (``S3'') class "mylm"...
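
For the plain S3 route, prepending a class is usually enough for the object to keep working with summary.lm and friends; the sketch below uses the built-in cars data and a made-up extra field. For the S4 route, the standard advice for that warning is to register the S3 class first with setOldClass("lm") before using it in contains=.

mylm_fit <- lm(dist ~ speed, data = cars)
mylm_fit$extra <- "additional data of my own design"   ## hypothetical extra field
class(mylm_fit) <- c("mylm", class(mylm_fit))

summary(mylm_fit)    ## still dispatches to summary.lm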
2012 Sep 06
0
lme( y ~ ns(x, df=splineDF)) error
...<-IDeffect[as.character(longdat$ID)] longdat$y<- (longdat$x + longdat$x^3 + (longdat$x-1)^4 / 5 + 1/(abs(longdat$x/50) + 0.02) + longdat$IDeffect + rnorm(1:nrow(longdat)) * 2 ) longdat<-longdat[order(longdat$x),] library(splines) # Calling ns within lm works fine: mylm<- lm( y ~ ns(x,df=splineDF), data=longdat) longdat$lmfit<-predict(mylm) library(ggplot2) print( ggplot(longdat, aes(x, y)) + geom_point(shape=1) + geom_line(aes(x=x, y=lmfit), color="red") ) cat("Enter to attempt lme.") readline() library(nlme) if(WhichApproach==1...
2012 Sep 26
0
lme(y ~ ns(x, df=splineDF)) error
...Deffect[as.character(longdat$ID)] longdat$y<- (longdat$x + longdat$x^3 + (longdat$x-1)^4 / 5 + 1/(abs(longdat$x/50) + 0.02) + longdat$IDeffect + rnorm(1:nrow(longdat)) * 2 ) longdat<-longdat[order(longdat$x),] library(splines) # Calling ns within lm works fine: mylm<- lm( y ~ ns(x,df=splineDF), data=longdat) longdat$lmfit<-predict(mylm) library(ggplot2) print( ggplot(longdat, aes(x, y)) + geom_point(shape=1) + geom_line(aes(x=x, y=lmfit), color="red") ) cat("Enter to attempt lme.") readline() library(nlme) if(WhichApproach...
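
A hedged sketch (simulated data, names borrowed from the two postings above) of the workaround most often suggested for this kind of error: keep ns() in the fixed formula but give df a literal value, or pre-build the basis columns, so lme never has to look up splineDF itself.

library(splines)
library(nlme)

set.seed(3)
longdat <- data.frame(ID = factor(rep(1:10, each = 20)),
                      x  = rep(seq(-3, 3, length.out = 20), 10))
longdat$y <- longdat$x + 0.5 * longdat$x^3 +
             rep(rnorm(10), each = 20) +     ## crude per-subject shift
             rnorm(nrow(longdat))

mylme <- lme(fixed  = y ~ ns(x, df = 4),     ## literal df instead of splineDF
             random = ~ 1 | ID,
             data   = longdat)
summary(mylme)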
2011 May 20
2
extraction of mean square value from ANOVA
...ating values and then using an ANOVA table to find the mean square value. I would like to form a loop that extracts the mean square value from ANOVA in each iteration. Below is an example of what I am doing. a<-rnorm(10) b<-factor(c(1,1,2,2,3,3,4,4,5,5)) c<-factor(c(1,2,1,2,1,2,1,2,1,2)) mylm<-lm(a~b+c) anova(mylm) Since I would like to use a loop to generate this several times it would be helpful to know how to extract the mean square value from ANOVA. Thanks
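
The mean squares sit in the "Mean Sq" column of the data frame that anova() returns, so they can be indexed by name inside a loop; a sketch using the poster's own toy setup:

set.seed(4)
a <- rnorm(10)
b <- factor(c(1, 1, 2, 2, 3, 3, 4, 4, 5, 5))
c <- factor(c(1, 2, 1, 2, 1, 2, 1, 2, 1, 2))

mylm   <- lm(a ~ b + c)
aovtab <- anova(mylm)

aovtab[["Mean Sq"]]       ## all mean squares (b, c, Residuals)
aovtab["b", "Mean Sq"]    ## just the one for factor b, usable in each iteration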
2009 Mar 31
1
using "substitute" inside a legend
Hello list, I have a linear regression: mylm = lm(y~x-1) I've been reading old mail postings as well as the plotmath demo and I came up with a way to print an equation resulting from a linear regression: model = substitute(list("y"==slope%*%"x", R^2==rsq), list(slope=round(mylm$coefficients[[1]],2),rsq=round(summary(...
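
One possible completion of the idea (toy data; this is a sketch, not the poster's final code): wrap the call built by substitute() in as.expression() before handing it to legend(), which then renders it with plotmath.

set.seed(5)
x <- 1:20
y <- 3 * x + rnorm(20)

mylm <- lm(y ~ x - 1)

eqn <- substitute(list("y" == slope %*% "x", R^2 == rsq),
                  list(slope = round(coef(mylm)[[1]], 2),
                       rsq   = round(summary(mylm)$r.squared, 2)))

plot(x, y)
abline(mylm)
legend("topleft", legend = as.expression(eqn), bty = "n")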
2009 May 14
1
automated polynomial regression
...look like follows: > true_y <- c(1:50)*.8 > # the real values > m_y <- c((1:21)*1.1, 21.1, 22.2, 23.3 ,c(25:50)*.9)/0.3-5.2 > # the measured data > x <- c(1:50) > # and the x-axes > > # Now I do the following: > > m_y_2 <- m_y^2 > m_y_3 <- m_y^3 > mylm <- lm(true_y ~ m_y + m_y_2 + m_y_3) ; mylm Call: lm(formula = true_y ~ m_y + m_y_2 + m_y_3) Coefficients: (Intercept) m_y m_y_2 m_y_3 1.646e+00 1.252e-01 2.340e-03 -9.638e-06 Now I can get the real result with > calibration <- 1.646e+00 + 1.252e-01*m_y +...
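
A sketch of a more automated version of the same calibration: poly(..., raw = TRUE) builds the power terms in one go, and predict() replaces the hand-typed back-calculation with the rounded coefficients.

true_y <- c(1:50) * 0.8
m_y    <- c((1:21) * 1.1, 21.1, 22.2, 23.3, c(25:50) * 0.9) / 0.3 - 5.2

mylm <- lm(true_y ~ poly(m_y, degree = 3, raw = TRUE))

calibration <- predict(mylm)    ## fitted calibration values
head(cbind(true_y, m_y, calibration))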
2009 Jan 24
2
how to prevent duplications of data within a loop
...3 <- rnorm(50) myData <- data.frame(response,var1,var2,var3) var.names <- names(myData)[2:4] numVars <- length(var.names) betas <- rep(-1,numVars) names(betas) <- var.names #run regression on var1 through var3. for (Var_num in 1:numVars) { col.name <- var.names[Var_num] mylm <- lm(response ~ get(col.name),data=myData,model=FALSE) betas[Var_num] <- coef(mylm)[2] }
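
A hedged alternative to get(): build each one-predictor formula with reformulate(), which keeps the call readable and, together with model = FALSE as in the post, avoids carrying extra copies of the data around.

set.seed(6)
response <- rnorm(50)
var1 <- rnorm(50); var2 <- rnorm(50); var3 <- rnorm(50)
myData <- data.frame(response, var1, var2, var3)

var.names <- names(myData)[2:4]
betas <- setNames(rep(NA_real_, length(var.names)), var.names)

for (col.name in var.names) {
  mylm <- lm(reformulate(col.name, response = "response"),
             data = myData, model = FALSE)
  betas[col.name] <- coef(mylm)[2]
}
betas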
2007 Apr 06
2
lm() intercept at the end, rather than at the beginning
Hi, I wonder if someone has already figured out a way of making summary(mylm) # where mylm is an object of the class lm() to print the "(Intercept)" at the last line, rather than the first line of the output. I don't know about, say, biostatistics, but in economics the intercept is usually the least interesting of the parameters of a regression model. That...
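
Short of changing summary.lm itself, one sketch is to reorder the coefficient table before printing, moving "(Intercept)" to the last row (built-in cars data used for illustration).

mylm <- lm(dist ~ speed, data = cars)

ctab <- coef(summary(mylm))
ord  <- c(setdiff(rownames(ctab), "(Intercept)"), "(Intercept)")
printCoefmat(ctab[ord, , drop = FALSE])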
2006 Apr 23
1
lme: null deviance, deviance due to the random effects, residual deviance
...and $deviance elements as in glm objects... I tried to find out an answer on R-help & Pinheiro & Bates (2000). Partial success only: - null deviance: Response: possibly yes: see http://tolstoy.newcastle.edu.au/R/help/05/12/17796.html (Spencer Graves). The (null?) deviance is -2*logLik(mylme), but a personal trial with some glm objects did not lead to the same numbers as the ones given by the print.glm method... - the deviance due to the random effect(s). I was supposing that the coefficients given by ranef(mylme) may be an entry... but beyond this, I guess those coefficient...
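
Only a sketch of the -2*logLik computation mentioned above, on the Orthodont data shipped with nlme; it illustrates the quantity, not how (or whether) the deviance can be split between random effects and residuals.

library(nlme)
mylme <- lme(distance ~ age, random = ~ 1 | Subject, data = Orthodont)

-2 * as.numeric(logLik(mylme))   ## deviance-like quantity (REML fit by default)
ranef(mylme)                     ## estimated random effects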
2006 Jan 16
4
Standardized beta-coefficients in regression
Hello list, I am used to giving a lot of attention to the standardized regression coefficients, which SPSS lists automatically. Is there an alternative to running the last two lines in the following example to get all the information? ctl <- c(4.17,5.58,5.18,6.11,4.50,4.61,5.17,4.53,5.33,5.14) trt <- c(4.81,4.17,4.41,3.59,5.87,3.83,6.03,4.89,4.32,4.69) summary( lm(ctl ~ trt) )
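
Two base-R sketches of the standardized slope, using the example data from the post: either rescale both variables before fitting, or rescale the raw slope afterwards; both give the same number.

ctl <- c(4.17, 5.58, 5.18, 6.11, 4.50, 4.61, 5.17, 4.53, 5.33, 5.14)
trt <- c(4.81, 4.17, 4.41, 3.59, 5.87, 3.83, 6.03, 4.89, 4.32, 4.69)

fit <- lm(ctl ~ trt)

coef(lm(scale(ctl) ~ scale(trt)))[2]   ## standardized beta directly
coef(fit)[2] * sd(trt) / sd(ctl)       ## same value from the raw slope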
2005 Jun 16
1
regressing each column of a matrix on all other columns
...<- matrix(round(runif(10*3,1,10),0),10) A for (i in 1:length(A[1,])) B[,i] <- as.matrix(predict(lm( A[,i] ~ A[,-i] ))) B It works fine, but I need it to be faster. I've looked at *apply but just can't seem to figure it out. Maybe the solution could look somewhat like this: mylm <- function(y,ci) { x <- A[,-ci] b <- lm(y~x) } B <- apply(A,2,mylm,ci=current_column_index(A)) Is there a way to pass the index of the current column in apply to my function? Am I on the right path at all? Thanks for your help. Regards, Stefan
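
One way to get at the column index is simply to iterate over 1:ncol(A) rather than over the columns themselves; a sketch along those lines, with the poster's loop body kept. Note this answers the indexing question, but the speed gain over the for loop is likely modest since lm() dominates the cost.

set.seed(7)
A <- matrix(round(runif(10 * 3, 1, 10), 0), 10)

B <- sapply(seq_len(ncol(A)), function(i) {
  ## regress column i on all remaining columns and return the fitted values
  fitted(lm(A[, i] ~ A[, -i]))
})
B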
2007 Oct 30
1
Some matrix and sandwich questions
...gression, matrices, and sandwich package. 1) In the sandwich package, I would like to better understand the meat() function. From the bread() documentation, for a simple OLS regression, bread() returns (1/n * X'X)^(-1) That is, for a simple regression (per the documentation on bread()): MyLM <- lm(y ~ x) bread(MyLM) solve(crossprod(cbind(1, x))) * length(y) (The last two terms above produce the same output, the matrix described above.) In terms of the basic data matrix and coefficients, what does meat() return for a simple OLS regression? (I don't know the term "empirical...
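
A hedged sketch of the relationship asked about: for a plain lm fit, meat() equals the cross-product of the estimating-function contributions (residual-weighted rows of the design matrix) divided by n, and bread and meat recombine into the usual sandwich estimator.

library(sandwich)

set.seed(8)
x <- rnorm(50)
y <- 1 + 2 * x + rnorm(50)
MyLM <- lm(y ~ x)

meat(MyLM)
crossprod(estfun(MyLM)) / length(y)   ## same matrix

bread(MyLM) %*% meat(MyLM) %*% bread(MyLM) / length(y)
sandwich(MyLM)                        ## agrees with the line above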
2006 Apr 25
5
Heteroskedasticity in Tobit models
Hello, I've had no luck finding an R package that has the ability to estimate a Tobit model allowing for heteroskedasticity (multiplicative, for example). Am I missing something in survReg? Is there another package that I'm unaware of? Is there an add-on package that will test for heteroskedasticity? Thanks for your help. Cheers, Alan Spearot -- Alan Spearot Department of Economics
2012 May 11
0
contrasts with an imbalance in a factor
...0L)) ## this is what I really want to be able to do, but without contrast ## complaining about the imbalance in the number of rows cat("### With scanner\n") fixedFormula=as.formula("fmri ~ group * task + scanner") randomFormula = as.formula("random = ~ 1 | subject") mylme = lme(fixed=fixedFormula, random=randomFormula, data=my.model) ## now look at the risky choices (40outcome and 80outcome) versus the ## safe choices (20outcome) con=contrast( mylme, a=list(group="Group1", task=c("40outcome", "80outcome"), scanner=levels(my.model$...
2007 Nov 20
1
plotting confidence intervals of regression line
Hello, I am trying to generate a confidence interval (90 or 95%) of a regression line. This is primarily just for illustration on a scatter plot (i.e. I am trying to make this http://www.ast.cam.ac.uk/~rgm/scratch/statsbook/graphics/anima4.gif). I have been trying to use the predict.lm function, with interval set as "confidence", but this still seems to be giving me a prediction
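
A sketch of the usual predict()-based band (toy data): interval = "confidence" gives the band for the fitted mean line, while interval = "prediction" gives the wider band for new observations, which may be the source of the confusion.

set.seed(9)
x <- runif(50, 0, 10)
y <- 2 + 0.5 * x + rnorm(50)

fit  <- lm(y ~ x)
newx <- data.frame(x = seq(min(x), max(x), length.out = 100))
band <- predict(fit, newdata = newx, interval = "confidence", level = 0.95)

plot(x, y)
abline(fit)
lines(newx$x, band[, "lwr"], lty = 2)
lines(newx$x, band[, "upr"], lty = 2)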
2004 Mar 02
2
Some timings for 64 bit Opteron (ATLAS, GOTO, std)
...20, rpois(n, lam= 9))), > f3 = factor(pmin(32, rpois(n, lam= 12)))) > with(ldat, > ldat$y <<- 10 + 4*x1 + 2*x2 + rnorm(n) + > ## no rounding here: > + 10 * rnorm(nlevels(f1))[f1] + > + 100* rnorm(nlevels(f2))[f2]) > str(ldat) > > mylm <- lm(y ~ .^2, data = ldat) > proc.time() ## (~= 100 sec on P4 1.6 GHz "lynne") > str(mm <- model.matrix(mylm)) > smlm <- summary(mylm) > > p1 <- predict(mylm) > p2 <- predict(mylm, type = "terms") > > str(myim <- influence.measures(m...
2012 Apr 18
0
Error in eval when using contrast and nlme
...will be " , inNumberOfOutputBriks, " stats briks\n") outStats <- vector(mode="numeric", length=inNumberOfOutputBriks) ##cat (inData, "\n") if ( ! all(inData == 0 ) ) { ##try( ##if( inherits( print(inModelFormula) print(inRandomFormula) mylme <- lme(fixed=inModelFormula, random=inRandomFormula, data = inModel) print(mylme) ##, ## silent=FALSE), ##"try-error") ) { ##temp <- 0 ## cat (paste("Error on slice", inZ, "\n")) ##} else { temp <- as.vector(unlis...
2009 Sep 20
3
plotting least-squares regression against x-axis
Hi, I want to plot the residuals of a least-squares regression. plot(lm(y~x), which=1) does this, but it plots the y-axis of my data on the x-axis of the residuals plot. That is, it plots the residual for each y-value in the data. Can I instead use the x-axis of my data as the x-axis of the residuals plot, showing the residual for a given x? Thanks! Jason Priem University of North
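
A minimal sketch (simulated data): compute the residuals once and plot them directly against the predictor instead of relying on plot.lm's default panels.

set.seed(10)
x <- runif(40)
y <- 1 + 3 * x + rnorm(40, sd = 0.3)

fit <- lm(y ~ x)
plot(x, resid(fit), xlab = "x", ylab = "residual")
abline(h = 0, lty = 2)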