
Displaying 20 results from an estimated 300 matches similar to: "Perhaps Off-topic lme question"

2008 May 29
1
plotting zoo using datetime as xlim
Is there a way to use the actual index value when plotting zoo objects? This is the way that the index is set up, and a sample range of what I would like to plot: 01/01/06 00:00:00 - 01/01/06 23:45:00. library(zoo); library(chron); fmt.chron <- function(x) { chron(sub(" .*", "", x), gsub(".* (.*)", "\\1:00", x)) }; x <- structure(c(15.57, 15.5,
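For what it's worth, a minimal sketch of passing chron date-times as xlim when plotting a zoo series (the index, frequency and plotting window below are illustrative, not the poster's data):

library(zoo)
library(chron)

# hypothetical quarter-hourly index covering 01/01/06
idx <- chron("01/01/06", "00:00:00") + seq(0, by = 1/96, length.out = 96)
z   <- zoo(rnorm(96), order.by = idx)

# xlim is given on the same (chron) scale as the index
plot(z, xlim = chron(c("01/01/06", "01/01/06"),
                     c("06:00:00", "18:00:00")))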
2017 Jun 18
0
R_using non linear regression with constraints
> On Jun 18, 2017, at 6:24 AM, Manoranjan Muthusamy <ranjanmano167 at gmail.com> wrote:
>
> I am using nlsLM {minpack.lm} to find the values of parameters a and b of function myfun which give the best fit for the data set, mydata.
>
> mydata = data.frame(x = c(0, 5, 9, 13, 17, 20), y = c(0, 11, 20, 29, 38, 45))
>
> myfun = function(a, b, r, t) {
>   prd = a * b * (1 - exp(-b * r * t))
>
2017 Jun 18
2
R_using non linear regression with constraints
I am using nlsLM {minpack.lm} to find the values of parameters a and b of function myfun which give the best fit for the data set, mydata.

mydata = data.frame(x = c(0, 5, 9, 13, 17, 20), y = c(0, 11, 20, 29, 38, 45))

myfun = function(a, b, r, t) {
  prd = a * b * (1 - exp(-b * r * t))
  return(prd)
}

and, using nlsLM:

myfit = nlsLM(y ~ myfun(a, b, r = 2, t = x), data = mydata,
              start = list(a = 2000, b = 0.05), lower = c(1000, 0),
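A minimal runnable sketch of the call as described in the post, assuming only the pieces quoted above (the summary() line is added for illustration):

library(minpack.lm)

mydata <- data.frame(x = c(0, 5, 9, 13, 17, 20),
                     y = c(0, 11, 20, 29, 38, 45))
myfun <- function(a, b, r, t) a * b * (1 - exp(-b * r * t))

# box constraints: a >= 1000, b >= 0
myfit <- nlsLM(y ~ myfun(a, b, r = 2, t = x), data = mydata,
               start = list(a = 2000, b = 0.05), lower = c(1000, 0))
summary(myfit)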
2017 Jun 18
3
R_using non linear regression with constraints
https://cran.r-project.org/web/views/Optimization.html (CRAN's Optimization task view -- as always, you should search before posting.) In general, nonlinear optimization with nonlinear constraints is hard, and the strategy used here (multiplying by a*b < 1000) may not work -- it introduces a discontinuity into the objective function, so gradient-based methods in particular may be
2017 Jun 18
0
R_using non linear regression with constraints
I ran the following script. I satisfied the constraint by making a*b a single parameter, which isn't always possible. I also ran nlxb() from the nlsr package, which reports the singular values of the Jacobian. In the unconstrained case the singular values are pretty awful, and I wouldn't trust the results as a model, though the minimum is probably OK. The constrained result has a much larger sum of squares.
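A hedged sketch of that substitution (parameter name cc and the starting values are illustrative, chosen to avoid clashing with R's c()): treating cc = a*b as a single parameter turns the product constraint into a simple box constraint, and a can be recovered afterwards as cc/b.

library(minpack.lm)

mydata <- data.frame(x = c(0, 5, 9, 13, 17, 20),
                     y = c(0, 11, 20, 29, 38, 45))

# model rewritten in terms of cc = a*b: y ~ cc * (1 - exp(-b * r * x)), r fixed at 2
fit_cc <- nlsLM(y ~ cc * (1 - exp(-b * 2 * x)), data = mydata,
                start = list(cc = 100, b = 0.05), lower = c(0, 0))
coef(fit_cc)
coef(fit_cc)["cc"] / coef(fit_cc)["b"]   # implied value of a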
2020 Oct 17
2
??? is to nls() as abline() is to lm() ?
I'm drawing a fitted normal distribution over a histogram. The use case is trivial (fitting normal distributions on densities) but I want to extend it to other fitting scenarios. What has stumped me so far is how to take the list that is returned by nls() and use it for curve(). I realize that I can easily do all of this with a few intermediate steps for any specific case. But I had expected
2017 Jun 18
3
R_using non linear regression with constraints
I am not as expert as John, but I thought it worth pointing out that the variable substitution technique gives up one set of constraints for another (b=0 in this case). I also find that plots help me see what is going on, so here is my reproducible example (note inclusion of library calls for completeness). Note that NONE of the optimizers mentioned so far appear to be finding the true best
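In that spirit, a self-contained plotting sketch (the data come from the thread; the unconstrained fit and the plotting grid are illustrative):

library(minpack.lm)

mydata <- data.frame(x = c(0, 5, 9, 13, 17, 20),
                     y = c(0, 11, 20, 29, 38, 45))

fit <- nlsLM(y ~ a * b * (1 - exp(-b * 2 * x)), data = mydata,
             start = list(a = 2000, b = 0.05))   # unconstrained, for comparison

grid <- seq(0, 20, by = 0.1)
plot(y ~ x, data = mydata, pch = 16, xlab = "t", ylab = "y")
lines(grid, predict(fit, newdata = data.frame(x = grid)), col = "red")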
2012 Aug 25
2
Standard deviation from MANOVA??
Hi, I have a problem getting the standard deviation from the manova output. I have used the manova function: myfit <- manova(cbind(y1, y2) ~ x1 + x2 + x3, data = mydata). I tried to get the predicted values and their standard deviations by using predict(myfit, type = "response", se.fit = TRUE), but the problem is that I don't get the standard deviation values, I only
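For what it is worth, predict() for multi-response (mlm) fits does not implement se.fit, so one hedged workaround is to fit each response separately with lm(), which gives the same coefficients and does return standard errors (variable names follow the post; mydata is assumed to exist):

fit1 <- lm(y1 ~ x1 + x2 + x3, data = mydata)
fit2 <- lm(y2 ~ x1 + x2 + x3, data = mydata)

p1 <- predict(fit1, se.fit = TRUE)   # p1$fit and p1$se.fit for y1
p2 <- predict(fit2, se.fit = TRUE)   # p2$fit and p2$se.fit for y2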
2017 Jun 18
0
R_using non linear regression with constraints
I've seen a number of problems like this over the years. The fact that the singular values of the Jacobian have a ratio larger than the usual convergence tolerances can mean the codes stop well before the best fit. That is the "numerical analyst" view. David and Jeff have given geometric and statistical arguments. All views are useful, but it takes some time to sort them all out and
2020 Oct 17
0
??? is to nls() as abline() is to lm() ?
I haven't followed your example closely, but can't you use the predict() method for this? To draw a curve, the function that will be used in curve() sets up a newdata data frame and passes it to predict(fit, newdata = ...) to get predictions at those locations.

Duncan Murdoch

On 17/10/2020 5:27 a.m., Boris Steipe wrote:
> I'm drawing a fitted normal distribution over a
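A minimal sketch of that suggestion, assuming a normal density fitted to histogram midpoints with nls() (the data, breaks and starting values here are all illustrative):

set.seed(1)
x <- rnorm(500, mean = 10, sd = 2)
h <- hist(x, breaks = 30, freq = FALSE, col = "grey90")
d <- data.frame(mids = h$mids, dens = h$density)

fit <- nls(dens ~ dnorm(mids, mean = m, sd = s), data = d,
           start = list(m = mean(x), s = sd(x)))

# curve() supplies its own x grid; the expression just forwards it to predict()
curve(predict(fit, newdata = data.frame(mids = x)),
      add = TRUE, col = "red", lwd = 2)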
2008 Feb 24
1
what missed ----- CART
Hi all, can anyone who is familiar with CART tell me what I missed in my tree code?

library(MASS)
myfit <- tree(y ~ x1 + x2 + x3 + x4)
# tree.screens()               # useless
plot(myfit); text(myfit, all = TRUE, cex = 0.5, pretty = 0)
# tile.tree(myfit, fgl$type)   # useless
# close.screen(all = TRUE)     # useless

My current tree plot resulting from the above code shows as:
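As an aside, a runnable sketch along these lines using the fgl data that the commented-out tile.tree() call hints at (note that tree() comes from the tree package rather than MASS, and usually needs a data argument; the formula below is illustrative):

library(MASS)   # fgl data set
library(tree)   # tree(), plot.tree(), text.tree()

myfit <- tree(type ~ ., data = fgl)
plot(myfit)
text(myfit, all = TRUE, cex = 0.5, pretty = 0)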
2011 Jan 25
1
Predictions with 'missing' variables
Dear List, I think I'm going crazy here... can anyone explain why I get the same predictions in the train and test data sets below, when the second has a missing input?

y <- rnorm(1000)
x1 <- rnorm(1000)
x2 <- rnorm(1000)
train <- data.frame(y, x1, x2)
test <- data.frame(x1)
myfit <- glm(y ~ x1 + x2, data = train)
summary(myfit)
all(predict(myfit, test) == predict(myfit, train))
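The usual explanation, sketched below: when a variable is absent from newdata, model.frame() falls back to the environment of the formula (here the global workspace), where x2 still exists, so the original x2 values are silently reused. Supplying a genuinely different x2 (the test2 object is illustrative) breaks the equality:

test2 <- data.frame(x1 = x1, x2 = rnorm(1000))
all(predict(myfit, test2) == predict(myfit, train))   # FALSE (almost surely)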
2003 Oct 23
3
List of lm objects
Hi R-Helpers: I'm trying to fit the same linear model to a bunch of variables in a data frame, so I was trying to adapt the code John Fox, Spencer Graves and Peter Dalgaard proposed and discussed yesterday on this e-mail list:

for (y in df[, 3:5]) {
  mod = lm(y ~ Trt * Dose, data = x,
           contrasts = list(Trt = contr.sum, Dose = contr.sum))
  Anova(mod, type = "III")
}  ## by John Fox

or for
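A hedged sketch of the lapply() idiom for keeping the fitted lm objects in a list (the column positions, data frame names and the Type III tests via car::Anova() follow the post; nothing else is assumed):

fits <- lapply(df[, 3:5], function(y)
  lm(y ~ Trt * Dose, data = x,
     contrasts = list(Trt = contr.sum, Dose = contr.sum)))

library(car)
lapply(fits, Anova, type = "III")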
2008 Mar 30
2
convert weekly time series data to monthly
I have weekly time series data with year, month, day, and price variables. The input data set for the weekly series takes the following form:

Year  month  day  price
1990  8      20   119.1
1990  8      27   124.5
1990  9      3    124.2
1990  9      10   125.2
1990  9      17   126.6
1990  9      24   127.2
1990  10     1    132.1
1990  10     8    133.3
1990  10     15   133.9
1990  10     22   134.5
1990  10     29   133.9
...   ...    ...  ...
2008  3      3    313.7
2008
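One common approach, sketched under the assumption that the columns are named as shown (the choice of monthly means is illustrative): build a zoo series indexed by date and aggregate it with as.yearmon().

library(zoo)

dat <- data.frame(Year  = c(1990, 1990, 1990, 1990),
                  month = c(8, 8, 9, 9),
                  day   = c(20, 27, 3, 10),
                  price = c(119.1, 124.5, 124.2, 125.2))

z <- zoo(dat$price,
         as.Date(paste(dat$Year, dat$month, dat$day, sep = "-")))
monthly <- aggregate(z, as.yearmon, mean)   # monthly means
monthly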
2013 Aug 26
2
Partial correlation test
Dear all, I'm writing a manuscript for publication after analysing my final data with ANOVA, ANCOVA and MANCOVA. In a section of my results I ran correlations on my data (2 categorical factors with 2 levels each: Quantity & Quality; 2 dependent variables: Irid.area & Casa.PC1; and 1 covariate: SL). But as some traits (here Irid.area) are significantly influenced by the covariate (standard length, SL), I
2006 Mar 05
1
duration analysis
Hi, I am trying to estimate the effects of covariates on the hazard function, rather than on the survival. I know this is actually the same thing. For example, using the survival package and doing

> myfit <- survreg(Surv(time, event) ~ mymodel)

all I have to do to get the quantities of interest is

> -myfit$coefficients / myfit$scale

The standard errors are easily worked out, as
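A self-contained sketch of that rescaling, using the built-in lung data as a stand-in for the poster's model (the Weibull default of survreg() is assumed, which is what makes the AFT-to-proportional-hazards conversion meaningful):

library(survival)

myfit <- survreg(Surv(time, status) ~ age + sex, data = lung)
-myfit$coefficients / myfit$scale   # coefficients on the (Weibull) hazard scale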
2007 Jan 21
1
for loop problem
Hello R users, a beginner's question which I could not find the answer to in earlier posts. My thought process (here "z" is a 119 x 15 data matrix):

Step 1: start at column one, bind every column with column 1
Step 2: use the new matrix, "test", in the fitCopula package
Step 3: store each result in myfit, bind each result to "answer"
Step 4: return "answer"
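A hedged sketch of those four steps (the copula family, the pobs() rescaling and the starting parameter are illustrative choices, not from the post):

library(copula)

fit_against_first <- function(z) {
  answer <- NULL
  for (j in 2:ncol(z)) {
    test   <- pobs(z[, c(1, j)])                          # Step 1 (rescaled to (0,1))
    myfit  <- fitCopula(normalCopula(0.3, dim = 2), test) # Step 2
    answer <- rbind(answer, coef(myfit))                  # Step 3
  }
  answer                                                  # Step 4
}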
2010 Sep 19
2
working with eval and environments
I'm trying to get the following section of code to work. I think the problem is being caused by the data argument passed to lm() not evaluating to "train" in the parent environment, but I can't seem to figure out how to fix this.

fitmodel <- function(trial, data) {
  wrap.lm <- function(formula, data, ...) {
    cat("in wrap lm", NROW(data), "\n")
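One common fix for this kind of scoping problem, sketched below with an illustrative body (only the function and argument names above come from the post): build the lm() call with do.call(), so that the data argument in the stored call is the actual data frame rather than a symbol that gets looked up later in the wrong environment.

fitmodel <- function(trial, data) {
  wrap.lm <- function(formula, data, ...) {
    cat("in wrap lm", NROW(data), "\n")
    # do.call() embeds the evaluated data frame in the call, so later methods
    # that re-evaluate the call do not need to find "data" by name
    do.call("lm", list(formula = formula, data = data, ...))
  }
  wrap.lm(y ~ x, data = data)
}

train <- data.frame(x = 1:20, y = 1:20 + rnorm(20))
coef(fitmodel("trial 1", train))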
2005 Jun 09
1
getting more than the coefficients
Hi there, I am trying to export a regression output to LaTeX. I am using the xtable function in the xtable library. Doing

myfit <- lm(myformula, mydata)
print.xtable(xtable(myfit), file = "myfile")

only returns the estimated coefficients and the corresponding standard errors, t-statistics and p-values. But I wish to get a bit more, say, the number of observations used in the
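A hedged workaround sketch: xtable() on an lm fit only tabulates the coefficient matrix, so extra summaries such as n and R^2 can be pulled from summary() and appended to the same file (the mtcars model and the file name are illustrative stand-ins):

library(xtable)

myfit <- lm(mpg ~ wt + hp, data = mtcars)   # stand-in for myformula / mydata
s <- summary(myfit)

print(xtable(myfit), file = "myfile.tex")
cat(sprintf("%% n = %d, R-squared = %.3f\n", nobs(myfit), s$r.squared),
    file = "myfile.tex", append = TRUE)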
2010 Jan 27
1
control of scat1d tick color in plot.Predict?
Hi All, I have a quick question about using plot.Predict now that the rms package uses lattice. I'd like to add tick marks along the regression line, which is given by data=llist(variablename) in the plot call. The ticks show up fine, but I'd like to alter the color. I know the ticks are produced by scat1d, but after spending a fair bit of time going through documentation, it still