Displaying 20 results from an estimated 200 matches similar to: "lme(y ~ ns(x, df=splineDF)) error"
2012 Sep 26 (0 replies): lme(y ~ ns(x, df=splineDF)) error
I would like to fit regression models of the form
y ~ ns(x, df=splineDF)
where splineDF is passed as an argument to a wrapper function.
This works fine if the regression function is lm(). But with lme(),
I get two different errors, depending on how I handle splineDF
inside the wrapper function.
A workaround is to turn the lme() command, along with the appropriate
value of splineDF, into a text
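The post breaks off here, but a minimal sketch of that text-string workaround might look as follows; the data frame dat, response y, covariate x, and grouping factor g are all hypothetical stand-ins, since the original code is not shown:
library(nlme)
library(splines)
fitTextWorkaround <- function(dat, splineDF) {
  cmd <- paste0("lme(y ~ ns(x, df = ", splineDF, "), ",
                "random = ~ 1 | g, data = dat)")
  eval(parse(text = cmd))   # splineDF's value is already pasted into the command
}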
2012 Dec 06 (1 reply): scope, lme, ns, nlme, splines
I want to fit a series of lme() regression models that differ only in the
degrees of freedom of a ns() spline. I want to use a wrapper function to do
this. The models will be of the form
y ~ ns(x, df=splineDF)
where splineDF is passed as an argument to a wrapper function.
This works fine if the regression function is lm(). But with lme(),
I get an error. fitfunction() below demonstrates this.
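fitfunction() itself is not shown, but one common fix, sketched under the same hypothetical names (data frame dat with response y, covariate x, grouping factor g), is to build the formula with the numeric value of splineDF already inlined, so lme() never has to look the name up:
library(nlme)
library(splines)
fitfunction <- function(dat, splineDF) {
  fml <- as.formula(paste0("y ~ ns(x, df = ", splineDF, ")"))
  lme(fixed = fml, random = ~ 1 | g, data = dat)
}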
2017 Jun 21 (1 reply): fitting cosine curve
Using a more stable nonlinear modeling tool will also help, but the key is
to get the periodicity right.
y=c(16.82, 16.72, 16.63, 16.47, 16.84, 16.25, 16.15, 16.83, 17.41, 17.67,
17.62, 17.81, 17.91, 17.85, 17.70, 17.67, 17.45, 17.58, 16.99, 17.10)
t=c(7, 37, 58, 79, 96, 110, 114, 127, 146, 156, 161, 169, 176, 182,
190, 197, 209, 218, 232, 240)
lidata <- data.frame(y=y, t=t)
#I use the
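The post is truncated, but in that spirit a direct nls() fit with an explicit period parameter might look like the sketch below. The starting values, including the guess of a roughly annual period (P = 365), are eyeball estimates, and convergence is not guaranteed:
fit <- nls(y ~ C + A * cos(2*pi*t/P + phi), data = lidata,
           start = list(C = 17, A = 0.7, P = 365, phi = pi))
summary(fit)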
2006 Apr 23 (1 reply): lme: null deviance, deviance due to the random effects, residual deviance
A perhaps trivial and stupid question:
In the case of an lm or glm fit, it is quite informative (to me) to have
a look at the null deviance and the residual deviance of a model. This
is generally provided by the print method or the summary, e.g.:
Null Deviance: 658.8
Residual Deviance: 507.3
and (a bit simple-mindedly) I like to think that the proportion of
deviance 'explained' by the
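Mixed models have no unique null/residual deviance decomposition, but a common substitute is to compare -2*logLik() of the fitted model against an intercept-only model with the same random effects. A sketch with hypothetical data dat and grouping factor g:
library(nlme)
m1 <- lme(y ~ x, random = ~ 1 | g, data = dat, method = "ML")
m0 <- lme(y ~ 1, random = ~ 1 | g, data = dat, method = "ML")  # "null" model
-2 * logLik(m0)   # analogue of the null deviance
-2 * logLik(m1)   # analogue of the residual deviance
anova(m0, m1)     # likelihood-ratio comparison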
2017 Jun 20 (0 replies): fitting cosine curve
Hi lily,
You can get fairly good starting values just by eyeballing the curves:
plot(y)                                    # the raw data
lines(supsmu(1:20, y))                     # a smooth through the points
lines(0.6*cos((1:20)/3 + 0.6*pi) + 17.2)   # eyeballed cosine on top
Jim
On Wed, Jun 21, 2017 at 9:17 AM, lily li <chocold12 at gmail.com> wrote:
> Hi R users,
>
> I have a question about fitting a cosine curve. I don't know how to set the
> approximate starting values. Besides, does the method
2012 May 11 (0 replies): contrasts with an imbalance in a factor
Hi everybody,
I have an experiment examining risky choice behavior where two groups of subjects were unevenly divided across two different MRI scanners while they performed a task. Each subject's data was recorded once and only once on a particular scanner. The table describing the distribution of subjects across the scanner (3TE and 3TW) and groups is below.
         3TE  3TW
Group1    10
2017 Jun 20 (5 replies): fitting cosine curve
Hi R users,
I have a question about fitting a cosine curve. I don't know how to set
the approximate starting values. Besides, does the method work for a sine
curve as well? Thanks.
Part of the dataset is in the following:
y=c(16.82, 16.72, 16.63, 16.47, 16.84, 16.25, 16.15, 16.83, 17.41, 17.67,
17.62, 17.81, 17.91, 17.85, 17.70, 17.67, 17.45, 17.58, 16.99, 17.10)
t=c(7, 37, 58, 79, 96,
2012 Apr 18 (0 replies): Error in eval when using contrast and nlme
Hi everybody,
I've written a function to run an LME model on data derived from functional
magnetic resonance images. When I run the function with contrasts included,
I get the following error:
Error in eval(expr, envir, enclos) : object 'inModelFormula' not found
I think it has something to do with the way contrast evaluates its arguments,
but I've got no idea how to fix it. The code
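The code is not shown, but a frequent fix for this class of error is to build the call with do.call(), so that the formula object itself, rather than the name inModelFormula, is stored in the fitted model; the random term and data frame name below are hypothetical:
library(nlme)
fit <- do.call(lme, list(fixed  = inModelFormula,
                         random = ~ 1 | subject,
                         data   = fmridata))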
2011 May 01 (1 reply): Different results of coefficients by packages penalized and glmnet
Dear R users:
Recently, I have been learning to use penalized logistic regression. Two
packages (penalized and glmnet) both implement the lasso.
So I wrote the code below, but I got different coefficients. Can someone
kindly explain?
# lasso using penalized
library(penalized)
pena.fit2 <- penalized(HRLNM, penalized = ~ CN + NoSus, lambda1 = 1,
                       model = "logistic", standardize = TRUE)
pena.fit2
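One likely cause: the two packages scale the penalty differently. glmnet minimizes the average negative log-likelihood plus lambda*sum|beta|, while penalized adds lambda1*sum|beta| to the unscaled log-likelihood, so lambda1 = 1 corresponds roughly to lambda = 1/n in glmnet. A hedged sketch, with mydata standing in for the unshown data frame; small differences can remain because the standardization conventions also differ:
library(glmnet)
X <- model.matrix(~ CN + NoSus, data = mydata)[, -1]  # drop the intercept column
glmn.fit2 <- glmnet(X, mydata$HRLNM, family = "binomial",
                    alpha = 1, lambda = 1/nrow(X))    # rough rescaling of lambda1 = 1
coef(glmn.fit2)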
2017 Jun 21 (2 replies): fitting cosine curve
What I did was to plot your initial values, then plot the smoothed
values and guess the constants. That is, I got an "eyeball" fit to the
smoothed values. As I have described this as "gross cheating" in the
past, you should either split your data, estimate on one subset and
then test on another, or estimate on your data and test on a
replication. If you get pretty much the same
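A minimal sketch of that split-and-check idea, starting nls() from the eyeballed constants and assuming the lidata frame assembled earlier in the thread (convergence on so few points is not guaranteed):
lidata$i <- seq_len(nrow(lidata))    # row index, as in the eyeball fit
set.seed(1)
train <- sample(nrow(lidata), 10)
fit   <- nls(y ~ C + A * cos(i/3 + phi), data = lidata[train, ],
             start = list(C = 17.2, A = 0.6, phi = 0.6*pi))
pred  <- predict(fit, newdata = lidata[-train, ])
sqrt(mean((lidata$y[-train] - pred)^2))   # hold-out RMSE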
2001 May 24 (0 replies): nlme help please
I am trying to learn how to use nlme by working on a simple example. I
attach the data from a toy example I made up which is similar to my real
problem. (My grasp of fixed/random effects is still a bit tenuous)
It is a longitudinal study of the effect of two treatments: A and B. The
data were created by:
A: y <- 12/(1 + exp((2 - time)/0.5)), y <- 8/(1 + exp((2 - time)/0.5))
B:
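The attachment and the B curves are not shown, but since the A data were generated from exactly the logistic form Asym/(1 + exp((xmid - time)/scal)), the self-starting SSlogis() model fits without hand-picked starting values. A sketch assuming a hypothetical long-format data frame toy with columns y, time, and subject:
library(nlme)
fit <- nlme(y ~ SSlogis(time, Asym, xmid, scal),
            fixed  = Asym + xmid + scal ~ 1,
            random = Asym ~ 1 | subject,
            data   = toy,
            start  = c(Asym = 10, xmid = 2, scal = 0.5))
summary(fit)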
2011 Feb 25 (0 replies): lme in loop help
Dear R users
I am a new R user; excuse me for bothering you, but I have worked hard to
find a solution:
# data
ID <- c(1:100)
set.seed(21)
y <- rnorm(100, 10,2)
x1 <- rnorm(100, 10,2)
x2 <- rnorm(100, 10,2)
x3 <- rnorm(100, 10,2)
x4 <- rnorm(100, 10,2)
x5 <- rnorm(100, 10,2)
x6 <- rnorm(100, 10,2)
mydf <- data.frame(ID,y, x1,x2, x3, x4, x5, x6)
# just separate analyses
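The rest of the post is truncated, but the loop itself can be written with reformulate() building each one-predictor formula. lme() also needs a grouping factor, which this toy data lacks, so one is invented below purely for illustration:
library(nlme)
mydf$grp <- factor(rep(1:10, each = 10))   # hypothetical grouping
fits <- lapply(paste0("x", 1:6), function(v) {
  lme(fixed = reformulate(v, response = "y"),
      random = ~ 1 | grp, data = mydf)
})
lapply(fits, function(f) summary(f)$tTable)   # fixed-effect tables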
2004 Mar 08 (2 replies): getting the std errors in the lm function
Hello,
I have a simple question for you:
running:
mylm <- lm(y ~ x)
summary(mylm)
I get the following results:
******************************************************
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 16.54087 0.19952 82.91 <2e-16 ***
x[1:19] -2.32337 0.04251 -54.66 <2e-16 ***
******************************************************
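The standard errors live in the coefficient matrix that summary() builds, so they can be extracted directly:
cf <- coef(summary(mylm))   # matrix: Estimate, Std. Error, t value, Pr(>|t|)
cf[, "Std. Error"]          # all standard errors, as a named vector
cf[2, 2]                    # just the slope's standard error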
2003 Oct 11 (1 reply): Subclassing lm
I'm trying to subclass the "lm" class to produce a "mylm" class whose
instances behave like lm objects (are accepted by methods like summary.lm)
but have additional data or slots of my own design.
For starters:
setClass("mylm", "lm")
produces the somewhat cryptic:
Warning message:
Old-style (``S3'') class "mylm" supplied as a
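The warning arises because "lm" is an S3 class; registering it with setOldClass() lets an S4 class contain it. A minimal sketch, using the built-in cars data and a hypothetical extra slot:
setOldClass("lm")
setClass("mylm", contains = "lm", slots = c(extra = "list"))
fit   <- lm(dist ~ speed, data = cars)
myfit <- new("mylm", fit, extra = list(note = "subclassed"))
summary(as(myfit, "lm"))   # fall back to the plain lm part when needed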
2010 Sep 14 (1 reply): NA confusion (length question)
Hi folks,
I am running a very simple regression using
mylm <- lm(mass ~ tarsus, na.action=na.exclude)
I would like to use the residuals from this analysis for further
regressions, but I'm running into a snag when I try
cbind(mylm$residuals, mydata) # where mydata is the original data set
The error tells me that it cannot use cbind because the length of
mylm$residuals is
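The usual explanation: with na.action = na.exclude, the extractor residuals() pads the result with NAs back to the original length, but the raw mylm$residuals component does not. A sketch using the names from the post (the data= argument is added here):
mylm <- lm(mass ~ tarsus, data = mydata, na.action = na.exclude)
length(mylm$residuals)           # complete cases only
length(residuals(mylm))          # padded with NA to nrow(mydata)
cbind(residuals(mylm), mydata)   # lengths now match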
2006 Apr 25 (5 replies): Heteroskedasticity in Tobit models
Hello,
I've had no luck finding an R package that has the ability to estimate a
Tobit model allowing for heteroskedasticity (multiplicative, for example).
Am I missing something in survReg? Is there another package that I'm
unaware of? Is there an add-on package that will test for
heteroskedasticity?
Thanks for your help.
Cheers,
Alan Spearot
2017 Jun 21 (1 reply): fitting cosine curve
If you know the period and want to fit phase and amplitude, this is
equivalent to fitting a * sin + b * cos
> I don't know how to set the approximate starting values.
I'm not sure what you meant by that, but I suspect it's related to
phase and amplitude.
> Besides, does the method work for a sine curve as well?
sin is the same as cos with
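Concretely, with the period fixed, A*cos(w*t + phi) expands to a*cos(w*t) + b*sin(w*t), so plain lm() recovers amplitude and phase. A sketch assuming the lidata frame from earlier in the thread; the annual period P = 365 is only a guess for this data:
P   <- 365                 # assumed period
w   <- 2*pi/P
fit <- lm(y ~ cos(w*t) + sin(w*t), data = lidata)
a   <- coef(fit)[2]; b <- coef(fit)[3]
A   <- sqrt(a^2 + b^2)     # amplitude
phi <- atan2(-b, a)        # phase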
2017 Jun 21 (0 replies): fitting cosine curve
I'm trying different parameters, but I don't know what this error means:
Error in nlsModel(formula, mf, start, wts) :
singular gradient matrix at initial parameter estimates
Thanks for any suggestions.
On Tue, Jun 20, 2017 at 7:37 PM, Don Cohen <don-r-help at isis.cs3-inc.com>
wrote:
>
> If you know the period and want to fit phase and amplitude, this is
> equivalent to
2009 Mar 31 (1 reply): using "substitute" inside a legend
Hello list,
I have a linear regression:
mylm = lm(y ~ x - 1)
I've been reading old mail postings as well as the plotmath demo and I came
up with a way to print an equation resulting from a linear regression:
model = substitute(list("y" == slope %*% "x", R^2 == rsq),
                   list(slope = round(mylm$coefficients[[1]], 2),
                        rsq = round(summary(mylm)$adj.r.squared, 2)))
I have four models and I
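The post is cut off, but the usual last step is to wrap the unevaluated call in as.expression() when handing it to legend(); a minimal sketch continuing the example above:
plot(x, y)
abline(mylm)
legend("topleft", legend = as.expression(model), bty = "n")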
2009 May 14 (1 reply): automated polynomial regression
Dear all -
We perform some measurements with a machine that needs to be
recalibrated. We get the best calibration with polynomial regression.
The data might look like follows:
> true_y <- c(1:50)*.8
> # the real values
> m_y <- c((1:21)*1.1, 21.1, 22.2, 23.3, c(25:50)*.9)/0.3 - 5.2
> # the measured data
> x <- c(1:50)
> # and the x-axis
>
> # Now I do the following:
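The post is truncated before the author's own approach; one hedged sketch of an automated calibration is to fit polynomials of increasing degree and pick the degree by AIC:
fits <- lapply(1:6, function(d) lm(true_y ~ poly(m_y, d)))
sapply(fits, AIC)                             # compare degrees 1..6
best <- fits[[which.min(sapply(fits, AIC))]]
plot(m_y, true_y)
lines(sort(m_y), fitted(best)[order(m_y)])    # calibration curve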