Displaying 20 results from an estimated 5000 matches similar to: "Dropping terms from regression w/ poly()"
2009 May 07 | 1 reply | Using lme() for split plot
Hi,
I'm trying to figure out how to use lme() to analyze a split-plot
experiment. I've been looking at the examples from the 'R Book', but those
are nested designs with only one factor at the whole-plot level; my
experiment is a 2^2 factorial at the whole-plot level, with a single
many-level factor at the sub-plot level. My question is about properly
specifying the random-effects part of the model,
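A minimal sketch of the kind of split-plot specification being asked about. The poster's data are not shown, so the variables below are simulated and the names are made up: a 2x2 whole-plot factorial randomized to main plots within blocks, and a many-level sub-plot factor within main plots; the random-effects formula nests main plots within blocks.
library(nlme)
set.seed(1)
d <- expand.grid(block = factor(1:4),
                 A     = factor(c("a1", "a2")),      # whole-plot factor 1 (assumed name)
                 B     = factor(c("b1", "b2")),      # whole-plot factor 2 (assumed name)
                 sub   = factor(paste0("s", 1:5)))   # many-level sub-plot factor
d$mainplot <- interaction(d$block, d$A, d$B)          # one main plot per block x A x B
d$y <- as.numeric(d$A) +                              # some whole-plot signal
       rnorm(nlevels(d$block))[d$block] +             # block-to-block noise
       rnorm(nlevels(d$mainplot))[d$mainplot] +       # whole-plot (main-plot) noise
       rnorm(nrow(d))                                 # sub-plot noise
# Whole-plot error stratum = main plots within blocks; sub-plot error = residual.
fit <- lme(y ~ A * B * sub, random = ~ 1 | block/mainplot, data = d)
anova(fit)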
2012 Jan 05 | 2 replies | Bayesian estimate of prevalence with an imperfect test
Hi all!
I'm new to this forum so please excuse me if I don't conform perfectly to
the protocols on this board!
I'm trying to estimate the true prevalence based on results from an
imperfect test. I have various estimates of sensitivity and specificity
(se/sp) that could inform my priors (at least upper and lower limits, even if
only with a uniform distribution), and I found the following code on this website...
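The code the poster found is not shown above. For orientation only, a rough sketch of one simple approach under assumed numbers: draw the apparent prevalence from the data, draw sensitivity and specificity from uniform priors between assumed limits, and apply the Rogan-Gladen correction to each draw.
set.seed(1)
n_pos <- 35; n_tested <- 200                        # assumed test results
ap <- rbeta(1e5, n_pos + 1, n_tested - n_pos + 1)   # apparent prevalence, flat prior
se <- runif(1e5, 0.80, 0.95)                        # assumed prior limits for sensitivity
sp <- runif(1e5, 0.90, 0.99)                        # assumed prior limits for specificity
tp <- (ap + sp - 1) / (se + sp - 1)                 # Rogan-Gladen correction
tp <- pmin(pmax(tp, 0), 1)                          # truncate to [0, 1]
quantile(tp, c(0.025, 0.5, 0.975))                  # point estimate and interval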
2013 Mar 05 | 2 replies | Function completely locks up my computer if the input is too big
Dear r-help,
Somewhere in my innocuous function to rotate an object in Cartesian space
I've created a monster that completely locks up my computer (it requires a
hard reset every time). I don't know whether this is a useful description for
anyone: the mouse still responds, but the keyboard and Windows Explorer do not.
The script only does this when the input matrix is large, and so my initial
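The function itself is cut off above, so the cause can only be guessed at; a common culprit with large inputs is growing an object row by row inside a loop. For comparison, a vectorized rotation of an n x 3 point matrix (illustration only, not the poster's function) keeps memory use proportional to the input.
# Rotate an n x 3 matrix of points about the z axis with a single matrix product.
rotate_z <- function(xyz, theta) {
  R <- matrix(c(cos(theta), -sin(theta), 0,
                sin(theta),  cos(theta), 0,
                0,           0,          1), nrow = 3, byrow = TRUE)
  xyz %*% t(R)
}
pts <- matrix(rnorm(3e6), ncol = 3)   # one million points
out <- rotate_z(pts, pi / 4)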
2011 Jul 20 | 2 replies | Bootstrap
Hi all,
I am having difficulty working out how to use bootstrap sampling; below is an
example of my function. It reads in data, applies some functions, and iterates
until the solution is found (i.e., convergence is reached). I want to use a
bootstrap approach to repeat this whole process several times (200 or 300
times) and look at the distribution of the parameter of interest.
Below is a small example that resembles my
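The poster's estimation step is cut off above, so estimate_fun() below is only a stand-in; the overall pattern, though, is the one described: resample the rows with replacement, rerun the whole estimation, and keep the parameter of interest each time.
set.seed(1)
dat <- data.frame(x = rnorm(50), y = rnorm(50))             # placeholder data
estimate_fun <- function(d) coef(lm(y ~ x, data = d))["x"]  # stand-in for the real procedure
B <- 200                                                    # number of bootstrap replicates
boot_est <- replicate(B, {
  idx <- sample(nrow(dat), replace = TRUE)                  # resample rows with replacement
  estimate_fun(dat[idx, , drop = FALSE])
})
quantile(boot_est, c(0.025, 0.975))                         # bootstrap percentile interval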
2005 Mar 16 | 1 reply | Code to replace nested for loops
Dear list members,
How can I replace the nested for loops at the end of the script
below with more efficient code?
# Begin script__________________________________________________
# Dichotomous scores for 100 respondents on 3 items with
# probabilities of a correct response = .6, .4, and .7,
# respectively
x1 <- rbinom(100,1,.6)
x2 <- rbinom(100,1,.4)
x3 <- rbinom(100,1,.7)
#
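The nested loops themselves are cut off above, so the sketch below is only a guess at the intended computation; it illustrates the usual cure: vectorized helpers such as rowSums(), colMeans() and tapply() in place of an explicit double loop over respondents and items.
set.seed(1)
x1 <- rbinom(100, 1, .6); x2 <- rbinom(100, 1, .4); x3 <- rbinom(100, 1, .7)
items <- cbind(x1, x2, x3)
total <- rowSums(items)        # per-respondent total score, no loop
colMeans(items)                # per-item proportion correct, no loop
# Mean item score within each total-score group (replaces a double loop):
apply(items, 2, function(it) tapply(it, total, mean))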
2009 Sep 23 | 2 replies | scaled Schoenfeld residuals
Hi,
Sorry if this has been discussed before, but I'm wondering why the scaled
Schoenfeld residuals do not follow the defining formula for obtaining them
from the ordinary Schoenfeld residuals, but are instead offset by the
estimated parameter values.
e.g.
library(survival)
attach(ovarian)
sv<-Surv(futime,fustat)
f1<-coxph(sv~age+ecog.ps)
f1
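The offset the poster notices is by design: Grambsch and Therneau (1994) define the scaled residual as beta-hat + d * V(beta-hat) %*% r, where r is the ordinary Schoenfeld residual at an event time, d the number of events and V the covariance of beta-hat, so that a smooth of the scaled residuals over event time estimates the time-varying coefficient beta(t) directly. A sketch of that computation for the model above (exact agreement with cox.zph() may depend on the survival version):
library(survival)
f1  <- coxph(Surv(futime, fustat) ~ age + ecog.ps, data = ovarian)
sch <- residuals(f1, type = "schoenfeld")                 # ordinary residuals, one row per event
d   <- nrow(sch)                                          # number of events
scaled <- sweep(d * sch %*% vcov(f1), 2, coef(f1), "+")   # beta-hat + d * V %*% r
head(scaled)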
2008 Apr 23 | 0 replies | poly() can exceed degree k - 1 for k distinct points (PR#11251)
The poly() function can create more variables than can be fitted when
there are replicated values. In the example below, 'x' has only 5
distinct values, yet I can apparently fit a 12th-degree polynomial with
no error messages and even nonzero estimates for every coefficient:
R> x = rep(1:5,3)
R> y = rnorm(15)
R> lm(y ~ poly(x, 12))
Call:
lm(formula = y ~ poly(x, 12))
Coefficients:
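The underlying constraint is that k distinct x values identify a polynomial of degree at most k - 1 (here 4); more recent versions of poly() stop with an error in this situation when raw = FALSE. A simple guard that computes the highest identifiable degree before fitting (an illustration, not part of base R):
x <- rep(1:5, 3)
y <- rnorm(15)
deg_max <- length(unique(x)) - 1     # highest identifiable degree, here 4
fit <- lm(y ~ poly(x, deg_max))
coef(fit)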
2008 Apr 22 | 1 reply | Bug in poly() (PR#11243)
Full_Name: Russell Lenth
Version: 2.6.2
OS: Windows XP Pro
Submission from: (NULL) (128.255.132.36)
The poly() function allows a higher-degree polynomial than it should, when
raw=FALSE.
For example, consider 5 distinct 'x' values, each repeated twice. We can fit a
polynomial of degree 8:
=====
R> x = rep(1:5, 2)
R> y = rnorm(10)
R> lm(y ~ poly(x, 8))
Call:
lm(formula = y ~
2012 Mar 14 | 0 replies | using predict() with poly(x, raw=TRUE)
Dear r-devel list members,
I've recently encountered the following problem using predict() with a model
that has raw-polynomial terms. (Actually, I encountered the problem using
model.frame(), but the source of the error is the same.) The problem is
technical and concerns the design of poly(), which is why I'm sending this
message to r-devel rather than r-help.
To illustrate:
2002 Jul 03 | 0 replies | poly.transform in R
Dear all,
I am trying to transform polynomial coefficients from the orthogonal form to
the standard power basis. There is poly.transform in S-PLUS. Does anybody
know how to do that in R? I've found a question about this in the
R-help archives but no real answer.
Example: I'm doing a polynomial regression of the percentage of one insect in
a community on altitude, precipitation,
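Base R has no poly.transform(); the simplest route to power-basis coefficients is usually to refit the same model with raw = TRUE, since the fitted curve is identical and only the parameterisation changes. A small self-contained sketch with made-up data (not the poster's insect data):
set.seed(1)
x <- runif(50, 0, 10)
y <- 2 - x + 0.5 * x^2 + rnorm(50)
fit_orth <- lm(y ~ poly(x, 2))                # orthogonal basis
fit_raw  <- lm(y ~ poly(x, 2, raw = TRUE))    # standard power basis
all.equal(fitted(fit_orth), fitted(fit_raw))  # same fitted values
coef(fit_raw)                                 # coefficients of 1, x, x^2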
2005 Feb 14 | 0 replies | using poly in a linear regression in the presence of NA fails (despite subsetting them out)
I ran into a (to me) surprising result when running lm() with an orthogonal
polynomial among the predictors.
The lm call resulted in
Error in qr(X) : NA/NaN/Inf in foreign function call (arg 1)
Error during wrapup:
despite my using a "subset" argument in the call to get rid of the NAs.
poly() is apparently evaluated before any NAs are subsetted out
of the data.
Example code (attached to
2007 Jan 25 | 1 reply | poly(x) workaround when x has missing values
Often in practical situations a predictor has missing values, so that poly()
stops with an error. For instance:
> x<-1:10
> y<- x - 3 * x^2 + rnorm(10)/3
> x[3]<-NA
> lm( y ~ poly(x,2) )
Error in poly(x, 2) : missing values are not allowed in 'poly'
>
> lm( y ~ poly(x,2) , subset=!is.na(x)) # This does not help?!?
Error in poly(x, 2) : missing values are not allowed in
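The subset= argument does not help because poly() is evaluated on the full vector, NAs included, before any rows are dropped. Removing the incomplete cases before the fit works; a sketch using the poster's example:
x <- 1:10
y <- x - 3 * x^2 + rnorm(10) / 3
x[3] <- NA
d <- na.omit(data.frame(x, y))   # drop the incomplete row first
lm(y ~ poly(x, 2), data = d)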
2010 Aug 03 | 0 replies | Issue with prediction from lm object with poly
Dear developeRs,
about a year ago, Alex Stolpovsky posted an issue with predict.lm() on a
fit generated using poly() with the raw=TRUE option and too few new data
points (slightly modified reproducible example below). Alex did not get any
reply. I have just stumbled on the same problem, and I think that this
is a bug in poly(), which arises from the check of whether the
polynomial degree is
2009 Nov 28 | 1 reply | R function that duplicates Octave's poly function?
By any chance is anyone aware of an R function that duplicates Octave's poly function?
Here is a description of Octave's poly function:
Function File: poly (A)
If A is a square N-by-N matrix, `poly (A)' is the row vector of
the coefficients of `det (z * eye (N) - a)', the characteristic
polynomial of A. As an example we can use this to find the
eigenvalues
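There is no such function in base R. One way to mimic Octave's poly(A) for a square matrix is to take the eigenvalues and expand prod(z - lambda_i) into coefficients in decreasing powers of z; the helper below is a sketch, not an existing function.
char_poly <- function(A) {
  ev <- eigen(A, only.values = TRUE)$values
  coefs <- 1                                  # start from the polynomial "1"
  for (lam in ev)                             # multiply by (z - lam) for each eigenvalue
    coefs <- c(coefs, 0) - c(0, lam * coefs)
  Re(coefs)                                   # drop tiny imaginary round-off
}
A <- matrix(c(2, 1, 0, 3), 2, 2)
char_poly(A)    # 1 -5 6, i.e. z^2 - 5 z + 6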
2008 Feb 13 | 1 reply | use of poly()
Hi,
I am curious about how to interpret the results of a polynomial regression
using poly(raw=TRUE) vs. poly(raw=FALSE).
set.seed(123456)
x <- rnorm(100)
y <- jitter(1*x + 2*x^2 + 3*x^3 , 250)
plot(y ~ x)
l.poly <- lm(y ~ poly(x, 3))
l.poly.raw <- lm(y ~ poly(x, 3, raw=TRUE))
s <- seq(-3, 3, by=0.1)
lines(s, predict(l.poly, data.frame(x=s)), col=1)
lines(s,
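The short answer to the interpretation question: both calls fit the same curve, but only the raw = TRUE coefficients sit on the familiar power basis. A compact self-contained check (simulated data, not the poster's exact example):
set.seed(1)
x <- rnorm(100)
y <- 1 * x + 2 * x^2 + 3 * x^3 + rnorm(100)
f_orth <- lm(y ~ poly(x, 3))                # orthogonal basis
f_raw  <- lm(y ~ poly(x, 3, raw = TRUE))    # power basis
all.equal(fitted(f_orth), fitted(f_raw))    # TRUE: identical fitted curve
coef(f_raw)                                 # close to the generating 0, 1, 2, 3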
2005 Jun 29 | 1 reply | poly() in lm() leads to wrong coefficients (but correct residuals)
Dear all,
I am using poly() in lm() in the following form.
1> DelsDPWOS.lm3 <- lm(DelsPDWOS[,1] ~ poly(DelsPDWOS[,4],3))
2> DelsDPWOS.I.lm3 <- lm(DelsPDWOS[,1] ~ poly(I(DelsPDWOS[,4]),3))
3> DelsDPWOS.2.lm3 <-
lm(DelsPDWOS[,1]~DelsPDWOS[,4]+I(DelsPDWOS[,4]^2)+I(DelsPDWOS[,4]^3))
1 and 2 lead to identical but wrong results. 3 is correct. Surprisingly
(to me) the residuals
2009 Jul 13 | 0 replies | problem predict/poly
Dear R experts,
I am observing undesired behavior of predict(fit, newdata) when the fit object is produced by lm() with a poly() term. Here is how to reproduce it:
x <- c(1:10)
y <- sin(c(1:10))
fit <- lm(formula=y~poly(x, 5, raw=TRUE))
predict(fit, newdata=data.frame(x=c(1:10))) ## this works
predict(fit, newdata=data.frame(x=c(1:1))) ## this is broken, error below
Error in poly(x,
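The failure arises because, with raw = TRUE, the polynomial columns are recomputed from newdata, and a newdata set with fewer distinct x values than the degree could trip poly()'s degree check in the R versions of that time (behaviour has changed since). Writing the powers out with I() avoids poly() entirely and predicts fine for a single new point:
x <- 1:10
y <- sin(1:10)
fit_I <- lm(y ~ x + I(x^2) + I(x^3) + I(x^4) + I(x^5))   # same degree-5 model, no poly()
predict(fit_I, newdata = data.frame(x = 1))              # single new point: works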
2006 Jun 13 | 1 reply | poly(*,*) in lm() (PR#8972)
Full_Name: Jens Keienburg
Version: 2.3.0
OS: Windows XP
Submission from: (NULL) (193.174.53.122)
I used the function lm() to calculate the coefficients of a polynomial. If I
use the function poly(t, 2) to denote a polynomial of the form 1 + x + x^2, the
coefficients are wrong. I have appended an excerpt below:
> t=1:100
> p=-20 - 10 * t + 2 * t^2
> p
[1] -28 -32 -32 -28 -20 -8 8
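The coefficients are not wrong: poly(t, 2) uses an orthogonal basis, so its coefficients are not the -20, -10, 2 of the power basis, even though the fitted values are identical. Asking for the raw basis recovers the familiar numbers:
t <- 1:100
p <- -20 - 10 * t + 2 * t^2
coef(lm(p ~ poly(t, 2, raw = TRUE)))   # recovers -20, -10, 2
coef(lm(p ~ poly(t, 2)))               # different numbers, same fitted polynomial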
2009 Dec 17 | 1 reply | poly() with unnormalized values
How can I get the result of, e.g., poly(1:3, degree=2) to give me the
unnormalized integer coefficients usually used to explain orthogonal
polynomial contrasts, e.g.,
-1  1
 0 -2
 1  1
As I understand things, the columns of x^{1:degree} are first centered
and then normalized by 1/sqrt(column sum of squares), but I can't
see how to relate this to what is returned by poly().
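The columns returned by poly(1:3, degree = 2) are exactly the integer contrasts above divided by their Euclidean norms (sqrt(2) for the linear column, sqrt(6) for the quadratic); rescaling recovers the integers:
P <- poly(1:3, degree = 2)
P                                                  # unit-length columns
round(sweep(P, 2, c(sqrt(2), sqrt(6)), "*"), 10)   # -1 0 1 and 1 -2 1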
2006 Sep 19 | 0 replies | How to interpret these results from a simple gamma-frailty model
Dear R users,
I'm trying to fit a gamma-frailty model to a simulated dataset with 6 covariates, and I'm running into some results I do not understand. I constructed an example from my simulation code in which I fit a coxph model without frailty (M1) and with frailty (M2) to a number of data samples with varying degrees of heterogeneity (I'm running R 2.3.1; the run takes ~1 min).
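The poster's simulation code is not shown. For orientation, a minimal sketch of the two fits described, with made-up data and names (one covariate rather than six):
library(survival)
set.seed(1)
d <- data.frame(time   = rexp(200),                      # made-up example data
                status = rbinom(200, 1, 0.8),
                x1     = rnorm(200),
                grp    = factor(rep(1:20, each = 10)))   # shared-frailty groups
M1 <- coxph(Surv(time, status) ~ x1, data = d)                 # no frailty
M2 <- coxph(Surv(time, status) ~ x1 + frailty(grp), data = d)  # gamma frailty (default)
summary(M2)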