Displaying 20 results from an estimated 2000 matches similar to: "Find the prediction or the fitted values for an lm model"
2007 Apr 16
2
Plotting data with a fitted curve
Suppose you have a vector of data in x and response values in y.  How
do you plot both the points (x, y) and the curve that results from the
fitted model when the model is not y ~ x but a higher-order
polynomial, e.g. y ~ poly(x, 2)?  (In other words, abline() doesn't
work for this case.)
Thanks,
     --Paul
-- 
Paul Lynch
Aquilent, Inc.
National Library of Medicine (Contractor)
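A common pattern (sketched here with made-up data, not the poster's) is to
predict from the fitted model over a fine, ordered grid of x values and draw
that with lines():
  set.seed(1)
  x <- runif(50, 0, 10)
  y <- 2 + 3*x - 0.5*x^2 + rnorm(50)
  fit <- lm(y ~ poly(x, 2))
  xg <- seq(min(x), max(x), length.out = 200)   # fine, ordered grid
  plot(x, y)
  lines(xg, predict(fit, newdata = data.frame(x = xg)))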
2003 Feb 28
2
lattice and fitted function error
Platform:  WIN2000
Version of R:  1.6.2
I'm interested in plotting fitted values in a trellis xyplot.  I believe the
following should work; however, I only get the points (not the fitted
lines).
library(lattice)
trellis.device(bg="white")
 xyplot(MULTDV~TIME|SUBNUM,data=TEMP,
 panel=function(x,y){
 panel.xyplot(x,y)
 lines(x,fitted(lm(y~poly(x,1),na.action=na.omit)))
 })
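One likely culprit (a sketch, not a verified fix for this data set, since it
assumes the poster's TEMP/MULTDV/TIME/SUBNUM objects): base graphics calls such
as lines() do not draw on a lattice panel; inside a panel function the lattice
equivalents panel.xyplot()/panel.lines() are needed, and sorting x keeps the
line from zig-zagging:
  library(lattice)
  xyplot(MULTDV ~ TIME | SUBNUM, data = TEMP,
         panel = function(x, y) {
           panel.xyplot(x, y)
           ok <- is.finite(x) & is.finite(y)    # drop NAs before poly()
           xs <- x[ok]; ys <- y[ok]
           fit <- lm(ys ~ poly(xs, 1))
           panel.lines(xs[order(xs)], fitted(fit)[order(xs)])
         })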
2010 Jul 21
3
Help me with prediction in linear model
Hi R-community,
I have the following code; I fitted the model as follows:
lbeer<-log(beer_monthly)
t<-seq(1956,1995.2,length=length(beer_monthly)) #beer_monthly contains 400+
entries
t2=t^2
beer_fit_parabola=lm(lbeer~t+t2)
The prediction below is not working for me.
Please help me prepare the new data set for that prediction.
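Since the model was fit with two separate predictors, t and t2, predict()
needs a newdata frame whose columns have exactly those names.  A sketch with
a simulated stand-in for log(beer_monthly) and made-up future time points:
  set.seed(1)
  lbeer <- 4 + 0.001 * (1:470) + rnorm(470, sd = 0.05)   # stand-in series
  t  <- seq(1956, 1995.2, length = length(lbeer))
  t2 <- t^2
  beer_fit_parabola <- lm(lbeer ~ t + t2)
  ## newdata must be a data frame with columns named exactly t and t2:
  t.new <- seq(1995.3, 1998, by = 0.1)                   # hypothetical times
  predict(beer_fit_parabola, newdata = data.frame(t = t.new, t2 = t.new^2))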
2006 Sep 28
2
safe prediction from lm
I am fitting a regression model with a bs term and then making predictions
based on the model. According to some info on the internet at
http://www.stat.auckland.ac.nz/~yee/smartpred/DummiesGuide.txt
there are some problems with using predict.lm when you have a model with
terms such as bs, ns, or poly.  However, when I used one of the examples they
said would illustrate the problems, I get virtually
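For what it's worth, with a plain lm() fit the base predict() method is "safe"
for these terms, because makepredictcall() stores the spline knots (or poly
coefficients) in the model terms at fit time.  A quick check with simulated
data (names here are made up):
  library(splines)
  set.seed(1)
  x <- runif(100, 0, 10)
  y <- sin(x) + rnorm(100, sd = 0.2)
  fit <- lm(y ~ bs(x, df = 5))
  predict(fit, newdata = data.frame(x = c(2.5, 7.5)))  # uses the stored knots
  ## the historical pitfall is rebuilding the basis yourself on the new data,
  ## e.g. bs(c(2.5, 7.5), df = 5), which places different knots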
2011 Oct 04
1
Rug plot curve reversal
Dear R-help
Can anyone tell me why my curve appears the wrong way round on a rug plot?
I am using the same code as on p. 596 of Crawley's The R Book.
mod<-glm(mort~logBd,binomial)
par(mfrow=c(2,2))
xv<-seq(0,8,0.01)
yv<-predict(mod,list(logBd=xv),type="response")
plot(logBd,mort)
lines(xv,yv)
I've tried swapping xv and yv around but no luck.
Thanks,
Pete
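Without the data it is hard to say, but the usual pattern is sketched below
with simulated stand-ins for logBd and mort: the prediction grid should span
the observed range of the predictor in increasing order, and if the curve
still runs opposite to the points it is worth checking that the response is
coded with 1 = event.
  set.seed(1)
  logBd <- runif(100, 0, 8)                    # hypothetical predictor
  mort  <- rbinom(100, 1, plogis(logBd - 4))   # hypothetical 0/1 response
  mod <- glm(mort ~ logBd, binomial)
  xv  <- seq(min(logBd), max(logBd), 0.01)
  yv  <- predict(mod, list(logBd = xv), type = "response")
  plot(logBd, mort)
  lines(xv, yv)
  rug(logBd)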
2006 Jan 26
2
Prediction when using orthogonal polynomials in regression
Folks,
I'm doing fine with using orthogonal polynomials in a regression context:
  # We will deal with noisy data from the d.g.p. y = sin(x) + e
  x <- seq(0, 3.141592654, length.out=20)
  y <- sin(x) + 0.1*rnorm(20)
  d <- lm(y ~ poly(x, 4))
  plot(x, y, type="l"); lines(x, d$fitted.values, col="blue") # Fits great!
  all.equal(as.numeric(d$coefficients[1] + m
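For predicting at new x values there is no need to rebuild the orthogonal
basis by hand; predict() reuses the basis stored in the fitted terms.  A
sketch continuing the snippet above (so it assumes x, y and d as defined
there):
  xnew <- seq(0, pi, length.out = 5)
  predict(d, newdata = data.frame(x = xnew))    # reuses the stored basis
  ## this agrees with evaluating the orthogonal basis at xnew by hand:
  all.equal(unname(predict(d, newdata = data.frame(x = xnew))),
            drop(cbind(1, predict(poly(x, 4), xnew)) %*% coef(d)))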
2007 Dec 28
1
logistic mixed effects models with lmer
I have a question about some strange results I get when using lmer to
build a logistic mixed effects model.  I have a data set of about 30k
points, and I'm trying to do backwards selection to reduce the number
of fixed effects in my model.  I've got 3 crossed random effects and
about 20 or so fixed effects.  At a certain point, I get a model (m17)
where the fixed effects are like this
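For reference, a minimal sketch of this kind of model in current lme4 (the
variable names and simulated data are hypothetical; logistic mixed models are
now fit with glmer() and family = binomial):
  library(lme4)
  set.seed(1)
  dat <- data.frame(subj = factor(sample(30, 2000, replace = TRUE)),
                    item = factor(sample(40, 2000, replace = TRUE)),
                    x1 = rnorm(2000), x2 = rnorm(2000))
  re <- rnorm(30, sd = 0.8)                     # a genuine subject effect
  dat$y <- rbinom(2000, 1, plogis(0.5 * dat$x1 - 0.3 * dat$x2 + re[dat$subj]))
  m <- glmer(y ~ x1 + x2 + (1 | subj) + (1 | item),
             data = dat, family = binomial)
  summary(m)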
2005 Feb 15
3
using poly in a linear regression in the presence of NA fails (despite subsetting them out)
This smells like a bug to me.  The error is triggered by the line:
   variables <- eval(predvars, data, env)
inside model.frame.default().  At that point, na.action has not been
applied, so poly() ended up being called on data that still contains missing
values.  The qr() that issued the error is for generating the orthogonal
basis when evaluating poly(), not for fitting the linear model itself.
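That diagnosis can be checked directly (a minimal sketch with made-up data):
the error already appears while the model frame is being built, i.e. before
any na.action or subset is applied.
  x <- c(1:9, NA); y <- rnorm(10)
  try(model.frame(y ~ poly(x, 2), na.action = na.omit))
  ## Error in poly(x, 2) : missing values are not allowed in 'poly'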
2008 Mar 07
5
Puzzling coefficients for a linear fit to a polynomial
Hi,
I cannot make sense of the coefficients from a linear fit of a polynomial.  For
example, given the following data (representing y = x^2):
> x <- 1:3
> y <- c(1, 4, 9)
performing a linear fit
> f <- lm(y ~ poly(x, 2))
gives weird coefficients:
> coefficients(f)
(Intercept) poly(x, 2)1 poly(x, 2)2 
  4.6666667   5.6568542   0.8164966 
However the fitted() result makes sense:
>
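The coefficients only look odd because poly() uses an orthogonal basis rather
than raw powers; asking for the raw basis recovers the familiar 0, 0, 1 of
y = x^2, while the fitted values are identical either way.  A short check:
  x <- 1:3
  y <- c(1, 4, 9)
  coef(lm(y ~ poly(x, 2, raw = TRUE)))   # ~ (0, 0, 1): intercept, x, x^2
  all.equal(fitted(lm(y ~ poly(x, 2))),
            fitted(lm(y ~ poly(x, 2, raw = TRUE))))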
2004 Feb 03
5
lm coefficients
Dear R experts,
Excuse me if my question is stupid...
I'd like to fit the data with an x^2 polynomial:
d <- read.table(file = "Oleg.dat", head = TRUE)
d
  X         T
  3720.00   4.113
  3715.00   4.123
  3710.00   4.132
  ...
out <- lm(T ~ poly(X, 4), data = d)
out
  Call:
  lm(formula = T ~ poly(X, 2), data = d)
  
  Coefficients:
  (Intercept)  poly(X, 2)1  poly(X, 2)2  
  
2007 Jan 25
1
poly(x) workaround when x has missing values
In practice a predictor often has missing values, and then poly() fails.
For instance:
> x<-1:10
> y<- x -  3 * x^2 + rnorm(10)/3
> x[3]<-NA
> lm( y ~ poly(x,2) )
Error in poly(x, 2) : missing values are not allowed in 'poly'
>
> lm( y ~ poly(x,2) , subset=!is.na(x)) # This does not help?!?
Error in poly(x, 2) : missing values are not allowed in
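Two workarounds are commonly used (a sketch rebuilding the example above):
drop the incomplete rows from the data before the formula is evaluated, or
use raw power terms, which tolerate NAs and are then handled by na.action.
  x <- 1:10; y <- x - 3 * x^2 + rnorm(10)/3; x[3] <- NA
  d <- data.frame(x, y)
  lm(y ~ poly(x, 2), data = na.omit(d))    # clean the data before fitting
  lm(y ~ x + I(x^2), data = d)             # raw terms: NA row dropped by na.omit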
2008 Apr 25
1
R-devel Digest, Vol 62, Issue 24
The columns of the model matrix are all orthogonal.  So the problem  
lies with poly(), not with lm().
> x <- rep(1:5, 3)
> y <- rnorm(15)
> z <- model.matrix(lm(y ~ poly(x, 12)))
> round(crossprod(z), 15)
             (Intercept) poly(x, 12)1 poly(x, 12)2 poly(x, 12)3 poly(x, 12)4
(Intercept)
2017 Jul 16
2
How to formulate quadratic function with interaction terms for the PLS fitting model?
> On Jul 13, 2017, at 7:43 AM, Bert Gunter <bgunter.4567 at gmail.com> wrote:
> Below.
> -- Bert
> Bert Gunter
> On Thu, Jul 13, 2017 at 3:07 AM, Luigi Biagini <luigi.biagini at gmail.com> wrote:
>> I have two ideas about it.
>> 1-
>> i) Entering variables in quadratic form is done with the command I()
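For the record, a quadratic-plus-interaction right-hand side is often written
out with I() for the squared terms plus an explicit cross term.  A sketch with
hypothetical predictors x1 and x2, and plain lm() standing in for the PLS fit
just to show the formula:
  set.seed(1)
  d <- data.frame(x1 = rnorm(30), x2 = rnorm(30))
  d$y <- 1 + d$x1 - 0.5 * d$x2 + 0.3 * d$x1 * d$x2 + rnorm(30, sd = 0.1)
  fit <- lm(y ~ x1 + x2 + I(x1^2) + I(x2^2) + x1:x2, data = d)
  coef(fit)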
2017 Jun 17
0
Prediction with two fixed-effects - large number of IDs
I have no direct experience with such horrific models, but your formula is a mess and Google suggests the biglm package with ffdf. 
Specifically, you should convert your discrete variables to factors before you build the model, particularly since you want to use predict after the fact, for which you will need a new data set with the exact same levels in the factors. 
Also, your use of I() is
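A small illustration of the level-matching advice above (hypothetical data,
and plain lm() used only to show the mechanics): the ID variables are
converted to factors before fitting, and newdata must carry factors with the
same level sets.
  set.seed(1)
  d <- data.frame(y = rnorm(100), x = rnorm(100),
                  id1 = sample(letters[1:5], 100, TRUE),
                  id2 = sample(LETTERS[1:4], 100, TRUE))
  d$id1 <- factor(d$id1); d$id2 <- factor(d$id2)   # convert before fitting
  fit <- lm(y ~ x + id1 + id2, data = d)
  nd <- data.frame(x = 0.5,
                   id1 = factor("a", levels = levels(d$id1)),
                   id2 = factor("B", levels = levels(d$id2)))
  predict(fit, newdata = nd)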
2013 Apr 01
2
example to demonstrate benefits of poly in regression?
Here's my little discussion example for a quadratic regression:
http://pj.freefaculty.org/R/WorkingExamples/regression-quadratic-1.R
Students press me to know the benefits of poly() over the more obvious
regression formulas.
I think I understand the theory on why poly() should be more numerically
stable, but I'm having trouble writing down an example that proves the
benefit of this.
I
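One way to build such an example (a sketch, certainly not the only possible
demonstration): use a predictor that is large relative to its spread, so the
raw power columns are nearly collinear, and compare the conditioning of the
two design matrices.
  set.seed(1)
  x <- seq(2000, 2010, length.out = 50)        # large, narrow-range predictor
  y <- (x - 2005)^2 + rnorm(50)
  raw  <- lm(y ~ x + I(x^2) + I(x^3) + I(x^4))
  orth <- lm(y ~ poly(x, 4))
  kappa(model.matrix(raw))    # enormous condition number
  kappa(model.matrix(orth))   # small: the basis columns are orthogonal
  coef(raw)                   # may contain NAs or wildly scaled values
  coef(orth)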
2007 Nov 07
3
Can I replace NA by 0 (if yes, how) ?
Hello,
I'm trying to fit some points with a degree-8 polynomial (the result of lm is
stored in pfit).
In most cases it is OK, but for some others, some coefficients are NA.
I don't really understand the meaning of these NAs.
And the problem is that I can't perform a derivation
(pderiv<-as.function((deriv(polynomial(pfit$coefficients))))) on pfit due to
the
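The NA coefficients usually mean the design matrix was rank deficient, so
lm() dropped the aliased columns; for building the polynomial and its
derivative those dropped terms can be treated as zero.  A sketch with made-up
data, assuming the polynom package as in the code quoted above:
  library(polynom)
  set.seed(1)
  x <- rep(1:6, 2)                          # only 6 distinct x values
  y <- rnorm(12)
  pfit <- lm(y ~ poly(x, 8, raw = TRUE))    # degree 8: some coefficients are NA
  cf <- coef(pfit)
  cf[is.na(cf)] <- 0                        # dropped (aliased) terms -> zero
  pderiv <- as.function(deriv(polynomial(cf)))
  pderiv(2.5)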
2005 Jun 14
2
ordinary polynomial coefficients from orthogonal polynomials?
How can ordinary polynomial coefficients be calculated
from an orthogonal polynomial fit?
I'm trying to do something like find a,b,c,d from
  lm(billions ~ a+b*decade+c*decade^2+d*decade^3)
but that gives:  "Error in eval(expr, envir, enclos) :
Object "a" not found"
 > decade <- c(1950, 1960, 1970, 1980, 1990)
 > billions <- c(3.5, 5, 7.5, 13, 40)
 > #
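One way to get ordinary coefficients is to fit on raw power terms; centering
the predictor first keeps that numerically well behaved (the coefficients
below are in z = decade - 1970, a choice of centering made here for the
sketch, and can be expanded back to the decade scale if needed):
  decade   <- c(1950, 1960, 1970, 1980, 1990)
  billions <- c(3.5, 5, 7.5, 13, 40)
  z   <- decade - 1970                   # center to keep raw powers stable
  fit <- lm(billions ~ z + I(z^2) + I(z^3))
  coef(fit)                              # a, b, c, d in a + b*z + c*z^2 + d*z^3
  ## same fitted curve as the orthogonal-polynomial version:
  all.equal(fitted(fit), fitted(lm(billions ~ poly(decade, 3))))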
2008 Apr 22
1
Bug in poly() (PR#11243)
Full_Name: Russell Lenth
Version: 2.6.2
OS: Windows XP Pro
Submission from: (NULL) (128.255.132.36)
The poly() function allows a higher-degree polynomial than it should, when
raw=FALSE.
For example, consider 5 distinct 'x' values, each repeated twice.  We can fit a
polynomial of degree 8:
=====
R> x = rep(1:5, 2)
R> y = rnorm(10)
R> lm(y ~ poly(x, 8))
Call:
lm(formula = y ~
2017 Jul 16
0
How to formulate quadratic function with interaction terms for the PLS fitting model?
??
If I haven't misunderstood, they are completely different!
1) NIR must be a matrix, or poly(NIR,...) will fail.
2) Due to the previously identified bug in poly, degree must be
explicitly given as poly(NIR, degree = 2, raw = TRUE).
Now consider the following example:
> df <-matrix(runif(60),ncol=3)
> y <- runif(20)
> mdl1 <-lm(y~df*I(df^2))
> mdl2
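For completeness, poly() can also build the full quadratic-with-interaction
basis from several predictors at once.  A sketch with hypothetical vectors
(the degree argument must be given by name when several predictors are
passed; raw = TRUE gives plain powers and cross products):
  set.seed(1)
  x1 <- runif(20); x2 <- runif(20); y <- runif(20)
  fit <- lm(y ~ poly(x1, x2, degree = 2, raw = TRUE))
  coef(fit)   # x1, x1^2, x2, x1:x2, x2^2 (columns labelled "1.0", "2.0", ...)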