Displaying 20 results from an estimated 8000 matches similar to: "polynomial regression and poly"
2008 Feb 13
1
use of poly()
Hi,
I am curious about how to interpret the results of a polynomial regression--
using poly(raw=TRUE) vs. poly(raw=FALSE).
set.seed(123456)
x <- rnorm(100)
y <- jitter(1*x + 2*x^2 + 3*x^3 , 250)
plot(y ~ x)
l.poly <- lm(y ~ poly(x, 3))
l.poly.raw <- lm(y ~ poly(x, 3, raw=TRUE))
s <- seq(-3, 3, by=0.1)
lines(s, predict(l.poly, data.frame(x=s)), col=1)
lines(s, predict(l.poly.raw, data.frame(x=s)), col=2)
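A minimal sketch of the difference the poster asks about, continuing the example above (it assumes the poster's objects x, l.poly and l.poly.raw have just been created): the two calls fit the same curve but parameterize it differently.

# The two parameterizations describe the same fitted polynomial:
all.equal(fitted(l.poly), fitted(l.poly.raw))   # TRUE (up to rounding)

# raw = TRUE coefficients refer to 1, x, x^2, x^3 and are directly
# interpretable; here they roughly recover the generating values 0, 1, 2, 3:
coef(l.poly.raw)

# raw = FALSE coefficients refer to orthonormal basis columns instead:
round(crossprod(poly(x, 3)), 10)                # identity matrix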
2008 Mar 07
5
Puzzling coefficients for linear fitting to a polynomial
Hi,
I cannot make sense of the linear fitting results for polynomials. For
example, given the following data (representing y = x^2):
> x <- 1:3
> y <- c(1, 4, 9)
performing a linear fit
> f <- lm(y ~ poly(x, 2))
gives weird coefficients:
> coefficients(f)
(Intercept) poly(x, 2)1 poly(x, 2)2
4.6666667 5.6568542 0.8164966
However the fitted() result makes sense:
>
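The coefficients are not wrong; they refer to the orthogonalized basis columns rather than to x and x^2. A minimal sketch using the same three data points:

x <- 1:3
y <- c(1, 4, 9)

coef(lm(y ~ poly(x, 2)))              # coefficients for the orthogonal columns
coef(lm(y ~ poly(x, 2, raw = TRUE)))  # approximately 0, 0, 1, i.e. y = x^2
fitted(lm(y ~ poly(x, 2)))            # both fits reproduce 1, 4, 9 exactly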
2007 Feb 12
1
How to get the polynomials out of poly()
Hi Folks!
I'm using the function poly to generate orthogonal polynomials, but I'd like
to see the actual polynomials so that I can convert them to a polynomial
in my original variable. Is that possible, and if so, how do I do it?
/E
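One way to do the conversion, sketched with a made-up x (the degree, the variable and the helper matrix B are illustrative): each orthogonal column is an exact linear combination of 1, x, x^2, ..., so regressing it on the raw powers recovers the polynomial in the original variable.

x <- 1:10
P <- poly(x, 3)

# Column j of B holds the coefficients of 1, x, x^2, x^3 for the j-th
# orthogonal polynomial returned by poly():
B <- sapply(1:3, function(j) coef(lm(P[, j] ~ x + I(x^2) + I(x^3))))
round(B, 6)

# The recursion constants poly() itself stores (and predict() reuses):
attr(P, "coefs")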
2005 Jun 29
1
poly() in lm() leads to wrong coefficients (but correct residuals)
Dear all,
I am using poly() in lm() in the following form.
1> DelsDPWOS.lm3 <- lm(DelsPDWOS[,1] ~ poly(DelsPDWOS[,4],3))
2> DelsDPWOS.I.lm3 <- lm(DelsPDWOS[,1] ~ poly(I(DelsPDWOS[,4]),3))
3> DelsDPWOS.2.lm3 <-
lm(DelsPDWOS[,1]~DelsPDWOS[,4]+I(DelsPDWOS[,4]^2)+I(DelsPDWOS[,4]^3))
1 and 2 lead to identical but wrong results. 3 is correct. Surprisingly
(to me) the residuals
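A sketch of why calls 1 and 2 agree with call 3 on residuals and fitted values even though the coefficients differ (made-up data stands in for DelsPDWOS; only the structure of the calls is kept): the orthogonal and raw bases span the same model, so only the parameterization of the coefficients changes.

set.seed(1)
x <- runif(50)
y <- 1 + 2*x - 3*x^2 + 0.5*x^3 + rnorm(50, sd = 0.1)

fit.orth <- lm(y ~ poly(x, 3))               # like calls 1 and 2
fit.raw  <- lm(y ~ x + I(x^2) + I(x^3))      # like call 3

all.equal(residuals(fit.orth), residuals(fit.raw))   # TRUE
all.equal(fitted(fit.orth), fitted(fit.raw))         # TRUE
coef(fit.orth); coef(fit.raw)                        # different parameterizations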
2008 Jul 01
1
Orthogonal polynomials and poly
Dear All,
I found the following sentence in the help for poly:
The orthogonal polynomial is summarized by the coefficients, which can be
used to evaluate it via the three-term recursion given in Kennedy & Gentle
(1980, pp. 343–4), and used in the predict part of the code.
My question: which type of orthogonal polynomials does this function use?
Hermite, Legendre, ...?
TIA
Giovanni
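For what it's worth, poly() does not use a classical family such as Hermite or Legendre: it constructs polynomials that are orthonormal with respect to the observed x values themselves. A quick sketch (the x values are arbitrary):

x <- c(0.3, 1.1, 2.7, 3.5, 4.2, 6.0)
P <- poly(x, 3)

round(crossprod(P), 10)   # identity: columns are orthonormal over these x
round(colSums(P), 10)     # zero: each column is orthogonal to the constant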
2004 May 06
5
Orthogonal Polynomial Regression Parameter Estimation
Dear all,
Can anyone tell me how I can perform orthogonal
polynomial regression parameter estimation in R?
--------------------------------------------
Here is an "Orthogonal Polynomial" Regression problem
collected from Draper, Smith(1981), page 269. Note
that only value of alpha0 (intercept term) and signs
of each estimate match with the result obtained from
coef(orth.fit). What
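A sketch of the likely scaling issue (the data below are made up; Draper & Smith's table is not reproduced): R's poly() columns are normalized to unit length, while textbook tables of orthogonal polynomials are usually integer-valued, so the estimates differ by a per-column scale factor even though the intercept and the signs agree.

x <- 1:6
y <- c(2.1, 3.9, 6.2, 7.8, 10.1, 11.9)

orth.fit <- lm(y ~ poly(x, 2))
coef(orth.fit)

# Rescaling a textbook linear contrast for n = 6 to unit length reproduces
# the first column of poly(x, 2):
tab1 <- c(-5, -3, -1, 1, 3, 5)
all.equal(tab1 / sqrt(sum(tab1^2)), unname(poly(x, 2)[, 1]))   # TRUE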
2003 Jun 23
1
precision matrix for polynomial growth curves
What does the warning message
"1: Singular precision matrix in level -1, block 1" mean?
I get this warning 50+ times when I try to fit the following
model
lme(response ~ covariateA + poly(covariateB, 3), random = ~ poly(covariateB, 3) | group)
It's not a small dataset - a set of up to 20 blood pressure
readings on just over 2000 people, and I don't get the error
message when I try to fit
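In my experience this warning often signals an overparameterized random-effects structure: a full cubic random effect per person is a lot to estimate from up to 20 readings each. A minimal sketch with made-up data of a more modest specification; the data, effect sizes and the choice of a random intercept and slope are all assumptions, not the poster's model.

library(nlme)
set.seed(2)
d <- data.frame(group = factor(rep(1:100, each = 10)),
                covariateB = rep(1:10, times = 100))
b0 <- rnorm(100, sd = 4)          # per-group intercept shifts
b1 <- rnorm(100, sd = 0.5)        # per-group slope shifts
d$response <- 110 + rep(b0, each = 10) +
  (0.8 + rep(b1, each = 10)) * d$covariateB + rnorm(nrow(d), sd = 3)

# Cubic fixed trend, but only a random intercept and slope per group:
fit <- lme(response ~ poly(covariateB, 3), data = d,
           random = ~ covariateB | group)
fixef(fit)
VarCorr(fit)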
2003 Jan 16
2
polynomial contrasts in R
In S-Plus, I can obtain polynomial contrasts for an ordered factor with
contr.poly(). The function also exists in R; however, it is limited to factors
whose levels are equally spaced. In S-Plus, one can obtain the contrasts
for a set of numeric values representing unequally spaced ordered factors.
Has anyone implemented this in R? I see that the S-Plus function calls
another function (poly.raw())
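For reference, contr.poly() in current R does take a scores argument for unequally spaced ordered levels. A small sketch (the spacing 1, 2, 4, 8, the factor f and the response y are made up):

# Polynomial contrasts for 4 ordered levels at unequally spaced scores:
contr.poly(4, scores = c(1, 2, 4, 8))

# Supplying the contrast matrix to a model fit:
set.seed(4)
f <- ordered(rep(c("a", "b", "c", "d"), each = 5))
y <- rnorm(20)
lm(y ~ f, contrasts = list(f = contr.poly(4, scores = c(1, 2, 4, 8))))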
2005 Jun 14
2
ordinary polynomial coefficients from orthogonal polynomials?
How can ordinary polynomial coefficients be calculated
from an orthogonal polynomial fit?
I'm trying to do something like find a,b,c,d from
lm(billions ~ a+b*decade+c*decade^2+d*decade^3)
but that gives: "Error in eval(expr, envir, enclos) :
Object "a" not found"
> decade <- c(1950, 1960, 1970, 1980, 1990)
> billions <- c(3.5, 5, 7.5, 13, 40)
> #
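A sketch of one workable route, using the decade/billions values from the excerpt (the centring step and the object names are my own, not necessarily what the poster ended up with): fit the raw basis on a centred predictor to read off ordinary coefficients, and note that the orthogonal fit describes the same curve.

decade <- c(1950, 1960, 1970, 1980, 1990)
billions <- c(3.5, 5, 7.5, 13, 40)

# Centring keeps the raw powers well conditioned; a, b, c, d are then the
# ordinary coefficients in terms of dec = decade - 1970:
dec <- decade - 1970
fit.raw <- lm(billions ~ dec + I(dec^2) + I(dec^3))
coef(fit.raw)

# The orthogonal-basis fit gives the same curve, just other coefficients:
fit.orth <- lm(billions ~ poly(decade, 3))
all.equal(fitted(fit.orth), fitted(fit.raw))   # TRUE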
2013 Apr 01
2
example to demonstrate benefits of poly in regression?
Here's my little discussion example for a quadratic regression:
http://pj.freefaculty.org/R/WorkingExamples/regression-quadratic-1.R
Students press me to know the benefits of poly() over the more obvious
regression formulas.
I think I understand the theory on why poly() should be more numerically
stable, but I'm having trouble writing down an example that proves the
benefit of this.
I
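One way to make the numerical point visible is to compare the conditioning of the two design matrices when x sits far from zero; a sketch (the choice of year-like x values is mine):

set.seed(42)
x <- 2000 + runif(100)                 # e.g. year-like values far from 0
X.raw  <- cbind(1, x, x^2, x^3)        # raw power basis
X.orth <- cbind(1, poly(x, 3))         # orthogonal basis

kappa(X.raw,  exact = TRUE)            # huge condition number
kappa(X.orth, exact = TRUE)            # small (about the ratio of column norms)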
2006 May 27
1
Recommended package nlme: bug in predict.lme when an independent variable is a polynomial (PR#8905)
Full_Name: Renaud Lancelot
Version: Version 2.3.0 (2006-04-24)
OS: MS Windows XP Pro SP2
Submission from: (NULL) (82.239.219.108)
I think there is a bug in predict.lme, when a polynomial generated by poly() is
used as an explanatory variable, and a new data.frame is used for predictions. I
guess this is related to * not * using, for predictions, the coefs used in
constructing the orthogonal
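A sketch of the mechanism the report points at, shown with lm(), where predictions are handled correctly: predictions for new data must reuse the orthogonalization constants stored with the fitted model, not recompute poly() on the new x values. The data here are made up.

set.seed(3)
x <- 1:20
y <- 3 + 2*x - 0.1*x^2 + rnorm(20, sd = 0.5)
fit <- lm(y ~ poly(x, 2))
nd <- data.frame(x = c(2.5, 7.5, 15.5))

predict(fit, nd)                        # reuses the stored "coefs": correct

# Recomputing poly() on the new x alone builds a different basis, so applying
# the fitted coefficients to it gives wrong predictions:
drop(cbind(1, poly(nd$x, 2)) %*% coef(fit))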
2008 Aug 19
1
Polynomial regression help
I have a simple X, Y data frame that I am trying to run regression analysis
on. The linear regression looks great, but when I use lm(formula = y ~
poly(x, degree = 5)) I get the same coefficients. So for example, if I use
degree = 3 my formula would look like y = 4.2x^3 + 3.2x^2 + 2.1x + 1.0, and
my degree 5 would look like y = 6.5x^5 + 5.4x^4 + 4.2x^3 + 3.2x^2 + 2.1x +
1.0, which doesn't make
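A sketch of what is probably going on (simulated data; the true curve here is quadratic): with the default orthogonal basis, the lower-order estimates do not change when higher-degree terms are added, and they are not the a_i of y = a0 + a1*x + a2*x^2 + ... in the first place.

set.seed(7)
x <- runif(60)
y <- 1 + 2*x + 0.5*x^2 + rnorm(60, sd = 0.1)

coef(lm(y ~ poly(x, 3)))[1:3]               # lower-order terms...
coef(lm(y ~ poly(x, 5)))[1:3]               # ...unchanged when the degree grows
coef(lm(y ~ poly(x, 3, raw = TRUE)))        # these are the interpretable a_i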
2009 Dec 22
2
use of lm() and poly()
Hi all,
I want to fit data called "metal" with a polynomial function of the form dP ~ a.0 +
a.1 * U0 + a.2 * U0^2 + a.3 * U0^3 + a.4 * U0^4.
The data set includes the independent variable U0 and the dependent
variable dP.
I've seen that the combination of lm() and poly() can do that instead of
using the nls() function.
But I don't get how to interpret the results from the linear
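A sketch with made-up values standing in for the metal data (only the column names U0 and dP are taken from the post): raw = TRUE returns a.0 ... a.4 exactly as the model is written, while the default orthogonal basis fits the same curve with a different parameterization.

set.seed(5)
metal <- data.frame(U0 = seq(1, 10, length.out = 40))
metal$dP <- 2 + 0.5*metal$U0 - 0.1*metal$U0^2 + 0.01*metal$U0^3 +
  0.001*metal$U0^4 + rnorm(40, sd = 0.05)

fit.raw <- lm(dP ~ poly(U0, 4, raw = TRUE), data = metal)
coef(fit.raw)                                  # a.0, a.1, a.2, a.3, a.4

fit.orth <- lm(dP ~ poly(U0, 4), data = metal)
all.equal(fitted(fit.raw), fitted(fit.orth))   # same fitted curve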
2009 Nov 28
1
R function that duplicates Octave's poly function?
By any chance is anyone aware of an R function that duplicates Octave's poly function?
Here is a description of Octave's poly function:
Function File: poly (A)
If A is a square N-by-N matrix, `poly (A)' is the row vector of
the coefficients of `det (z * eye (N) - a)', the characteristic
polynomial of A. As an example we can use this to find the
eigenvalues
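Base R has no drop-in equivalent that I know of, but the matrix case can be reproduced from the eigenvalues; a sketch (the helper name charpoly_coefs is made up):

# Coefficients of det(z*I - A), highest power first, built by multiplying
# out (z - lambda_1)...(z - lambda_N):
charpoly_coefs <- function(A) {
  ev <- eigen(A, only.values = TRUE)$values
  coefs <- 1                                     # the constant polynomial 1
  for (lambda in ev)
    coefs <- c(coefs, 0) - lambda * c(0, coefs)  # multiply by (z - lambda)
  Re(coefs)                                      # imaginary parts cancel for real A
}

A <- matrix(c(2, 1, 0, 3), 2, 2)
charpoly_coefs(A)    # 1 -5 6, i.e. z^2 - 5z + 6 (eigenvalues 2 and 3)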
2011 Jul 07
1
Polynomial fitting
Hello,
I'm fairly familiar with R and use it every now and then for math-related
tasks.
I have a simple non-polynomial function that I would like to approximate
with a polynomial. I already looked into poly, but was unable to understand
what to do with it. So my problem is this: I can generate virtually any
number of data points and would like to find the coefficients a1, a2, ... up to a
given
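A sketch of one way to do this by least squares (sin stands in for the poster's function; the degree and the grid are arbitrary): generate points, fit with lm() and a raw polynomial basis, and read the coefficients off the fit.

f <- sin                                   # placeholder for the real function
x <- seq(0, pi, length.out = 200)
y <- f(x)

fit <- lm(y ~ poly(x, 5, raw = TRUE))
coef(fit)                                  # a0, a1, ..., a5 of the approximation
max(abs(fitted(fit) - y))                  # worst error on the generated points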
2007 Jan 25
1
poly(x) workaround when x has missing values
Often in practical situations a predictor has missing values, so that poly
crashes. For instance:
> x<-1:10
> y<- x - 3 * x^2 + rnorm(10)/3
> x[3]<-NA
> lm( y ~ poly(x,2) )
Error in poly(x, 2) : missing values are not allowed in 'poly'
>
> lm( y ~ poly(x,2) , subset=!is.na(x)) # This does not help?!?
Error in poly(x, 2) : missing values are not allowed in
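The subset (and na.action) machinery cannot help here because poly() is evaluated on the full vector before any rows are dropped. A sketch of two workarounds, reusing the example above: remove the incomplete cases first.

x <- 1:10
y <- x - 3 * x^2 + rnorm(10) / 3
x[3] <- NA

# (a) fit on complete cases directly:
ok <- !is.na(x) & !is.na(y)
coef(lm(y[ok] ~ poly(x[ok], 2)))

# (b) or put the variables in a data frame and drop incomplete rows up front:
d <- na.omit(data.frame(x = x, y = y))
coef(lm(y ~ poly(x, 2), data = d))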
2003 Apr 29
1
polynomial fitting
I'm trying to find a way to fit a polynomial of degree n in x and y to
a set of x, y, and z data that I have and obtain the coefficients for
the terms of the fitted polynomial. However, when I try to use the
surf.ls function I'm getting odd results.
> x <- seq(0, 10, length=50)
> y <- x
> f <- function (x, y) {x^2 + y}
> library(spatial)
> test <-
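Independently of surf.ls(), note that with y <- x every point lies on the line y = x, so separate effects of x and y cannot be identified from those data; that alone produces odd-looking results. A sketch of a bivariate polynomial fit on a proper grid, using lm() with polym() as an alternative route (the grid g is my own construction):

x <- seq(0, 10, length = 50)
y <- x
f <- function(x, y) x^2 + y

g <- expand.grid(x = x, y = y)   # a full grid, not just the diagonal y = x
g$z <- f(g$x, g$y)

fit <- lm(z ~ polym(x, y, degree = 2, raw = TRUE), data = g)
round(coef(fit), 6)              # picks out the x^2 and y terms; others ~ 0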
2008 Apr 22
1
Bug in poly() (PR#11243)
Full_Name: Russell Lenth
Version: 2.6.2
OS: Windows XP Pro
Submission from: (NULL) (128.255.132.36)
The poly() function allows a higher-degree polynomial than it should, when
raw=FALSE.
For example, consider 5 distinct 'x' values, each repeated twice. We can fit a
polynomial of degree 8:
=====
R> x = rep(1:5, 2)
R> y = rnorm(10)
R> lm(y ~ poly(x, 8))
Call:
lm(formula = y ~
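For the record, current versions of R reject this case: with only 5 distinct x values, at most a degree-4 polynomial is estimable. A sketch of the boundary:

set.seed(6)
x <- rep(1:5, 2)
y <- rnorm(10)

## lm(y ~ poly(x, 8))   # now an error: the degree must be smaller than the
##                      # number of unique points
lm(y ~ poly(x, 4))      # the highest degree these x values can support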
2013 Apr 27
2
Polynomial Regression and NA coefficients in R
Hey all,
I'm performing polynomial regression. I'm simulating x values using runif() and y values using a deterministic function of x and rnorm().
When I perform polynomial regression like this:
fit_poly <- lm(y ~ poly(x,11,raw = TRUE))
I get some NA coefficients. I think this is due to the high correlation between, say, x and x^2 when x is distributed uniformly on the unit interval
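A sketch of the diagnosis and the usual cure (simulated data; the true curve here is quadratic): raw powers of a uniform(0,1) variable become numerically almost collinear at high degree, which is when lm() starts dropping columns and reporting NA, while the orthogonal basis keeps every term estimable.

set.seed(99)
x <- runif(200)
y <- 1 + 2*x - x^2 + rnorm(200, sd = 0.1)

cor(x, x^2)                                    # already about 0.97 on (0, 1)
kappa(poly(x, 11, raw = TRUE), exact = TRUE)   # raw basis: very ill-conditioned
kappa(poly(x, 11), exact = TRUE)               # orthogonal basis: 1

# Depending on the data, lm() may drop raw columns (NA coefficients);
# the orthogonal basis never needs to:
sum(is.na(coef(lm(y ~ poly(x, 11)))))          # 0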
2006 Jun 13
1
poly(*,*) in lm() (PR#8972)
Full_Name: Jens Keienburg
Version: 2.3.0
OS: Windows XP
Submission from: (NULL) (193.174.53.122)
I used the function lm() to calculate the coefficients of a polynomial. If I use
the function poly(t,2) to denote a polynomial of the form 1 + x + x^2, the
coefficients are wrong. I appended an excerpt below:
> t=1:100
> p=-20 - 10 * t + 2 * t^2
> p
[1] -28 -32 -32 -28 -20 -8 8
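A sketch of why this is expected behaviour rather than a bug, reusing the excerpt's t and p: the default poly() coefficients refer to an orthogonal basis, while raw = TRUE recovers -20, -10, 2 directly; both describe the same fitted curve.

t <- 1:100
p <- -20 - 10 * t + 2 * t^2

coef(lm(p ~ poly(t, 2)))                  # orthogonal basis: different numbers
coef(lm(p ~ poly(t, 2, raw = TRUE)))      # -20, -10, 2 (up to rounding)
max(abs(fitted(lm(p ~ poly(t, 2))) - p))  # same curve either way: ~0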