similar to: Issue with prediction from lm object with poly

Displaying 20 results from an estimated 3000 matches similar to: "Issue with prediction from lm object with poly"

2009 Jul 13
0
problem predict/poly
Dear R experts, I am observing undesired behavior of predict(fit, newdata) in the case when the fit object is produced by lm() involving poly(). Here is how to reproduce: x <- c(1:10) y <- sin(c(1:10)) fit <- lm(formula=y~poly(x, 5, raw=TRUE)) predict(fit, newdata=data.frame(x=c(1:10))) ## this works predict(fit, newdata=data.frame(x=c(1:1))) ## this is broken, error below Error in poly(x,
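A minimal sketch of the behaviour described above, using the same toy values as the post, together with one workaround: fitting with the default orthogonal basis (raw = FALSE) stores the basis coefficients with the fit, so predict() can rebuild the basis even for a single new point. The failing raw call is kept commented out so the sketch runs regardless of R version.

  x <- 1:10
  y <- sin(1:10)
  fit_orth <- lm(y ~ poly(x, 5))                    # orthogonal basis (default)
  predict(fit_orth, newdata = data.frame(x = 1))    # works for a single new point
  fit_raw <- lm(y ~ poly(x, 5, raw = TRUE))
  ## predict(fit_raw, newdata = data.frame(x = 1))  # the call reported to fail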
2008 Apr 23
0
poly() can exceed degree k - 1 for k distinct points (PR#11251)
The poly() function can create more variables than can be fitted when there are replicated values. In the example below, 'x' has only 5 distinct values, but I can apparently fit a 12th-degree polynomial with no error messages and even nonzero coefficients: R> x = rep(1:5,3) R> y = rnorm(15) R> lm(y ~ poly(x, 12)) Call: lm(formula = y ~ poly(x, 12)) Coefficients:
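A small sketch of the sanity check implied by the report: with only k distinct x values, at most a degree k - 1 polynomial is identifiable, so the requested degree can be compared with length(unique(x)) before fitting. The data follow the example above.

  set.seed(1)
  x <- rep(1:5, 3)
  y <- rnorm(15)
  k <- length(unique(x))          # 5 distinct values
  degree <- 12
  degree < k                      # FALSE: degree 12 is not identifiable from 5 distinct points
  fit <- lm(y ~ poly(x, k - 1))   # at most degree k - 1 = 4 is meaningful here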
2008 Apr 22
1
Bug in poly() (PR#11243)
Full_Name: Russell Lenth Version: 2.6.2 OS: Windows XP Pro Submission from: (NULL) (128.255.132.36) The poly() function allows a higher-degree polynomial than it should, when raw=FALSE. For example, consider 5 distinct 'x' values, each repeated twice. We can fit a polynomial of degree 8: ===== R> x = rep(1:5, 2) R> y = rnorm(10) R> lm(y ~ poly(x, 8)) Call: lm(formula = y ~
2012 Mar 14
0
using predict() with poly(x, raw=TRUE)
Dear r-devel list members, I've recently encountered the following problem using predict() with a model that has raw-polynomial terms. (Actually, I encountered the problem using model.frame(), but the source of the error is the same.) The problem is technical and concerns the design of poly(), which is why I'm sending this message to r-devel rather than r-help. To illustrate:
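One way to sidestep the poly()/model.frame() interaction described above, sketched here on made-up data: spell the powers out with I(), so the model frame rebuilds each term directly from newdata rather than from a stored poly() object.

  x <- 1:10
  y <- x - 3 * x^2 + rnorm(10)
  fit <- lm(y ~ x + I(x^2) + I(x^3))
  predict(fit, newdata = data.frame(x = 2.5))   # no poly() object to reconstruct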
2002 Jul 03
0
poly.transform in R
Dear all, I am trying to transform polynomial coefficients from orthogonal form to the standard power basis. There's poly.transform in S-PLUS. Does anybody know how to do that in R? I've found a question about that in the archives of R-help but no real answer. Example: I'm doing polynomial regression of the percentage of one insect in a community on altitude, precipitation,
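Base R has no poly.transform, but here is a sketch of one workaround (on made-up data, not the insect data in the post): the fitted values of an orthogonal-polynomial model determine the polynomial uniquely, so regressing them on the raw power basis recovers power-basis coefficients of the fitted curve.

  set.seed(2)
  x <- runif(50, 0, 10)
  y <- 1 + 2 * x - 0.3 * x^2 + rnorm(50)
  fit_orth <- lm(y ~ poly(x, 2))                             # orthogonal basis
  fit_pow  <- lm(fitted(fit_orth) ~ poly(x, 2, raw = TRUE))
  coef(fit_pow)              # intercept and power-basis coefficients of the fitted curve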
2005 Feb 14
0
using poly in a linear regression in the presence of NA fails (despite subsetting them out)
I ran into a (to me) surprising result when running lm with an orthogonal polynomial among the predictors. The lm command resulted in Error in qr(X) : NA/NaN/Inf in foreign function call (arg 1) Error during wrapup: despite my using a "subset" in the call to get rid of NA's. poly is apparently evaluated before any NA's are subsetted out of the data. Example code (attached to
2007 Jan 25
1
poly(x) workaround when x has missing values
Often in practical situations a predictor has missing values, so that poly crashes. For instance: > x<-1:10 > y<- x - 3 * x^2 + rnorm(10)/3 > x[3]<-NA > lm( y ~ poly(x,2) ) Error in poly(x, 2) : missing values are not allowed in 'poly' > > lm( y ~ poly(x,2) , subset=!is.na(x)) # This does not help?!? Error in poly(x, 2) : missing values are not allowed in
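A sketch of a workaround for the missing-value problem raised in the last two posts: drop the incomplete rows from a data frame before lm() ever evaluates poly(), e.g. with na.omit(), instead of relying on the subset argument.

  x <- 1:10
  y <- x - 3 * x^2 + rnorm(10)/3
  x[3] <- NA
  d <- na.omit(data.frame(x = x, y = y))    # remove incomplete rows first
  fit <- lm(y ~ poly(x, 2), data = d)       # poly() now sees no NAs
  predict(fit, newdata = data.frame(x = 3))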
2009 Nov 28
1
R function that duplicates Octave's poly function?
By any chance is anyone aware of an R function that duplicates Octave's poly function? Here is a description of Octave's poly function: Function File: poly (A) If A is a square N-by-N matrix, `poly (A)' is the row vector of the coefficients of `det (z * eye (N) - a)', the characteristic polynomial of A. As an example we can use this to find the eigenvalues
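I am not aware of a base-R equivalent, but here is a sketch of the same computation under the assumption that going through the eigenvalues is acceptable: the characteristic polynomial's coefficients can be built by multiplying out (z - lambda_i) over the eigenvalues of A.

  char_poly <- function(A) {
    stopifnot(is.matrix(A), nrow(A) == ncol(A))
    coefs <- 1                                    # coefficients in descending powers of z
    for (lam in eigen(A)$values)
      coefs <- c(coefs, 0) - lam * c(0, coefs)    # multiply the running polynomial by (z - lam)
    Re(coefs)
  }
  char_poly(diag(c(1, 2, 3)))   # 1 -6 11 -6, i.e. z^3 - 6 z^2 + 11 z - 6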
2008 Feb 13
1
use of poly()
Hi, I am curious about how to interpret the results of a polynomial regression-- using poly(raw=TRUE) vs. poly(raw=FALSE). set.seed(123456) x <- rnorm(100) y <- jitter(1*x + 2*x^2 + 3*x^3 , 250) plot(y ~ x) l.poly <- lm(y ~ poly(x, 3)) l.poly.raw <- lm(y ~ poly(x, 3, raw=TRUE)) s <- seq(-3, 3, by=0.1) lines(s, predict(l.poly, data.frame(x=s)), col=1) lines(s,
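A sketch continuing the example above that may help with interpretation: the two parameterizations describe the same fitted curve, so the fitted values coincide, while only the raw = TRUE coefficients refer to the powers of x directly.

  set.seed(123456)
  x <- rnorm(100)
  y <- jitter(1*x + 2*x^2 + 3*x^3, 250)
  l.poly     <- lm(y ~ poly(x, 3))
  l.poly.raw <- lm(y ~ poly(x, 3, raw = TRUE))
  all.equal(fitted(l.poly), fitted(l.poly.raw))   # TRUE: same curve
  coef(l.poly.raw)            # power-basis coefficients, comparable to the generating 1, 2 and 3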
2005 Jun 29
1
poly() in lm() leads to wrong coefficients (but correct residuals)
Dear all, I am using poly() in lm() in the following form. 1> DelsDPWOS.lm3 <- lm(DelsPDWOS[,1] ~ poly(DelsPDWOS[,4],3)) 2> DelsDPWOS.I.lm3 <- lm(DelsPDWOS[,1] ~ poly(I(DelsPDWOS[,4]),3)) 3> DelsDPWOS.2.lm3 <- lm(DelsPDWOS[,1]~DelsPDWOS[,4]+I(DelsPDWOS[,4]^2)+I(DelsPDWOS[,4]^3)) 1 and 2 lead to identical but wrong results. 3 is correct. Surprisingly (to me) the residuals
2009 Dec 17
1
poly() with unnormalized values
How can I get the result of, e.g., poly(1:3, degree=2) to give me the unnormalized integer coefficients usually used to explain orthogonal polynomial contrasts, e.g.,
 -1  1
  0 -2
  1  1
As I understand things, the columns of x^{1:degree} are first centered and then are normalized by 1/sqrt(col sum of squares), but I can't see how to relate this to what is returned by poly(). >
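A sketch of the relationship asked about: each column returned by contr.poly() (and by poly() on equally spaced points) is the familiar integer contrast rescaled to unit length, so dividing the integer contrasts by their column lengths reproduces it.

  cp   <- contr.poly(3)
  ints <- cbind(L = c(-1, 0, 1), Q = c(1, -2, 1))       # textbook integer contrasts
  norm <- sweep(ints, 2, sqrt(colSums(ints^2)), "/")    # rescale each column to unit length
  all.equal(unname(cp), unname(norm))                   # TRUE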
2006 Jun 13
1
poly(*,*) in lm() (PR#8972)
Full_Name: Jens Keienburg Version: 2.3.0 OS: Windows XP Submission from: (NULL) (193.174.53.122) I used the function lm() to calculate the coefficients of a polynomial. If I use the function poly(t,2) to denote a polynomial of the form 1 + x + x^2, the coefficients are wrong. I appended an excerpt below: > t=1:100 > p=-20 - 10 * t + 2 * t^2 > p [1] -28 -32 -32 -28 -20 -8 8
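A sketch of the point usually made in reply to this report: poly(t, 2) fits an orthogonal basis, not 1, t, t^2, so its coefficients live on a different scale; asking for the raw basis reproduces the generating coefficients.

  t <- 1:100
  p <- -20 - 10 * t + 2 * t^2
  coef(lm(p ~ poly(t, 2, raw = TRUE)))   # -20, -10, 2 as expected
  coef(lm(p ~ poly(t, 2)))               # same curve, orthogonal-basis coefficients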
2009 Jun 04
0
Dropping terms from regression w/ poly()
Hello r-help, I'm fitting a model with lm() and using the orthogonal polynomials from poly() as my basis: dat <- read.csv("ConsolidatedData.csv", header=TRUE) attach(dat) nrows <- 1925 Rad <- poly(Radius, 2) ntheta <- 14 Theta <- poly(T.Angle..deg., ntheta) nbeta <- 4 Beta <- poly(B.Beta..deg., nbeta) model.1 <- lm( Measurement ~ Block + Rad + Theta + Beta
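A hedged sketch (with a made-up data frame, not the ConsolidatedData.csv file above) of one way to drop individual terms while keeping an orthogonal basis: pre-compute the basis and index the columns to retain, then compare the nested fits.

  set.seed(3)
  dat <- data.frame(x = runif(60), y = rnorm(60))
  B <- poly(dat$x, 4)                               # orthogonal basis, degrees 1..4
  fit_full    <- lm(y ~ B, data = dat)
  fit_reduced <- lm(y ~ B[, c(1, 3)], data = dat)   # keep only the linear and cubic columns
  anova(fit_reduced, fit_full)                      # test the dropped terms jointly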
2009 Jun 10
1
gpc.poly datatype
I have a list of polygons generated by the contourLines() command (each object of the list is a list in itself with two objects: a vector of x values, and a vector of y values for each vertex). I wish to convert that list into a gpc.poly object of multiple contours. How do I do this? gpclib apparently has no method of coercing lists into the gpc.poly object type. As well, when I have a
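A heavily hedged sketch of one possible conversion, assuming the "gpc.poly" class in gpclib stores its contours in a "pts" slot as a list of lists with x, y and hole components (check getClass("gpc.poly") before relying on this):

  library(gpclib)
  cl <- contourLines(volcano)                 # list of contours, each with $x and $y
  gp <- new("gpc.poly",
            pts = lapply(cl, function(p) list(x = p$x, y = p$y, hole = FALSE)))
  length(get.pts(gp))                         # number of contours carried over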
2002 Jan 12
2
Bug in predict(newdata=x) with poly() (PR#1258)
Bug in predict.lm & poly The predict function doesn't work when used with poly and newdata. For example, I'd expect the following code to work, and plot a fitted cubic to the nearly straight line: x <- 1:10 y <- x + rnorm(10)/100 plot(x,y) fit <- lm(y ~ poly(x,3)) newx <- seq(1,10,len=100) lines(newx,predict(fit,newdata=data.frame(x=newx))) However, the plotted
2009 Dec 22
2
use of lm() and poly()
Hi all, I want to fit data called "metal" with a polynomial function as dP ~ a.0 + a.1 * U0 + a.2 * U0^2 + a.3 * U0^3 + a.4 * U0^4 The data set includes the independent variable U0 and the dependent variable dP. I've seen that the combination of lm() and poly() can do that instead of using the nls() function. But I don't get how to interpret the results from the linear
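A sketch of the fit described above; the real "metal" data are not shown, so a simulated stand-in data frame with the same column names is used. With raw = TRUE the reported coefficients are exactly the a.0 ... a.4 of the stated model, at the price of more correlated terms.

  set.seed(4)
  metal <- data.frame(U0 = runif(40, 1, 5))                        # hypothetical stand-in data
  metal$dP <- 1 + 2*metal$U0 - 0.5*metal$U0^2 + 0.1*metal$U0^3 + rnorm(40, sd = 0.2)
  fit <- lm(dP ~ poly(U0, 4, raw = TRUE), data = metal)
  coef(fit)     # a.0 (intercept), a.1, ..., a.4 on the power basis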
2002 Nov 25
1
Contr.poly for n > 100 (PR#2326)
Full_Name: David Clifford Version: Version 1.5.1 (2002-06-17) OS: Red Hat 7.3 Submission from: (NULL) (128.135.149.55) For n values above 100 there appears to be a bug in contr.poly(n). The contrast matrix should have rank n-1. Running the code below gives output (i.e. errors) at n=98, 100 and every value greater than 102. for(n in 2:150) { K <- contr.poly(n) rnk <-
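The excerpt cuts off mid-loop; here is a sketch completing the check it describes (variable names kept from the report), printing the values of n where the contrast matrix falls short of the expected rank n - 1.

  for (n in 2:150) {
    K <- contr.poly(n)
    rnk <- qr(K)$rank
    if (rnk != n - 1)
      cat("n =", n, "rank =", rnk, "(expected", n - 1, ")\n")
  }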
2001 Jul 09
1
polynomial regression and poly
When doing polynomial regression I believe it is a good idea to use the poly function to generate orthogonal polynomials. When doing this in S-PLUS there is a handy function (transform.poly, I think) to convert the coefficients produced by regression with the poly function back to the original scale. Has somebody written something similar for R? Robert
2011 Feb 03
3
interpret significance from the contr.poly() function
Hello R-help, I don't know how to interpret significance from the contr.poly() function. From the example below, how can I tell if the data has a significant linear/quadratic/cubic trend?
> contr.poly(4, c(1,2,4,8))
              .L         .Q          .C
[1,] -0.51287764  0.5296271 -0.45436947
[2,] -0.32637668 -0.1059254  0.79514657
[3,]  0.04662524 -0.7679594 -0.39757328
[4,]  0.79262909
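Significance cannot be read off the contrast matrix itself; a sketch of one common approach, using made-up response values: attach the polynomial contrasts to an ordered factor and read the per-contrast t-tests (the .L, .Q and .C rows) from the lm summary.

  dose <- ordered(rep(c(1, 2, 4, 8), each = 5))
  contrasts(dose) <- contr.poly(4, scores = c(1, 2, 4, 8))
  set.seed(5)
  y <- as.numeric(as.character(dose)) + rnorm(20)   # hypothetical data with a linear trend
  summary(lm(y ~ dose))   # rows dose.L, dose.Q, dose.C test the linear, quadratic, cubic trends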
2015 Jul 17
1
Improvements (?) in stats::poly and stats::polym.
Dear Keith, >>>>> <Keith.Jewell at campdenbri.co.uk> >>>>> on Thu, 16 Jul 2015 08:58:11 +0000 writes: > Dear R Core Team, > Last week I made a post to the R-help mailing list > "predict.poly for multivariate data" > <https://stat.ethz.ch/pipermail/r-help/2015-July/430311.html> > but it has had no responses so I'm