similar to: lm#contrasts#one level in factor: bug or feature

Displaying 20 results from an estimated 5000 matches similar to: "lm#contrasts#one level in factor: bug or feature"

2003 Oct 23
3
List of lm objects
Hi R-Helpers: I'm trying to fit the same linear model to a bunch of variables in a data frame, so I was trying to adapt the code John Fox, Spencer Graves and Peter Dalgaard proposed and discussed yesterday on this e-mail list: for (y in df[, 3:5]) { mod = lm(y ~ Trt*Dose, data = x, contrasts = list(Trt = contr.sum, Dose = contr.sum)) Anova(mod, type = "III") } ## by John Fox or for
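A minimal runnable sketch of that looping idea, assuming a data frame df whose response variables sit in columns 3:5 and whose factors are Trt and Dose (Anova() here is car's):

library(car)

responses <- names(df)[3:5]
fits <- lapply(responses, function(v) {
  lm(reformulate("Trt * Dose", response = v), data = df,
     contrasts = list(Trt = contr.sum, Dose = contr.sum))
})
names(fits) <- responses
lapply(fits, Anova, type = "III")   # one type III table per response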
2004 Sep 09
2
Handling the windows clipboard/32KB limit
(R 1.9.1; Windows 2000;) I'm just comparing ease of use, speed, etc. for methods of transferring data frames in the Excel, MySQL, R triangle. It turns out that going from Excel to R (when doing this carefully) using the clipboard is actually quite fast and efficient (2 seconds for transferring 120 000 cells on a common desktop computer, as compared to much longer for going the RODBC route,
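For reference, a sketch of the clipboard route on Windows, assuming a tab-separated selection with a header row copied from Excel with Ctrl-C:

dat <- read.table("clipboard", header = TRUE, sep = "\t")

## writing back: the default clipboard connection is limited to 32 Kb, but a
## larger buffer can be requested by name (2048 Kb in this example)
write.table(dat, file = "clipboard-2048", sep = "\t", row.names = FALSE)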
2020 Oct 17
2
??? is to nls() as abline() is to lm() ?
I'm drawing a fitted normal distribution over a histogram. The use case is trivial (fitting normal distributions on densities) but I want to extend it to other fitting scenarios. What has stumped me so far is how to take the list that is returned by nls() and use it for curve(). I realize that I can easily do all of this with a few intermediate steps for any specific case. But I had expected
2020 Oct 17
0
??? is to nls() as abline() is to lm() ?
I haven't followed your example closely, but can't you use the predict() method for this? To draw a curve, the function that will be used in curve() sets up a newdata dataframe and passes it to predict(fit, newdata= ...) to get predictions at those locations. Duncan Murdoch On 17/10/2020 5:27 a.m., Boris Steipe wrote: > I'm drawing a fitted normal distribution over a
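A self-contained sketch of that predict()-inside-curve() idea; the data, and the choice to fit a Gaussian to the histogram midpoints, are mine:

set.seed(1)
x <- rnorm(500, mean = 10, sd = 2)
h <- hist(x, freq = FALSE, col = "grey90")

d   <- data.frame(mid = h$mids, dens = h$density)
fit <- nls(dens ~ dnorm(mid, mean = m, sd = s), data = d,
           start = list(m = mean(x), s = sd(x)))

## curve() evaluates its first argument over a grid of x values, which we pass
## to predict() as newdata
curve(predict(fit, newdata = data.frame(mid = x)), add = TRUE, lwd = 2)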
2008 Dec 11
2
call lattice function in a function passing "groups" argument
I'm trying to use a lattice function within a function and have problems passing the "groups" argument properly. Let's say I have a data frame d <- data.frame(x = rnorm(100), y = c("a", "b")) and want to plot variable x in a densityplot, grouped by the variable y, then I would do something like densityplot(~ x, d, groups = y) If however I wanted to
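One workaround that is often suggested, sketched here (the wrapper name and its arguments are mine): pass the grouping variable by name and hand densityplot() an ordinary vector, side-stepping the non-standard evaluation of groups.

library(lattice)

d <- data.frame(x = rnorm(100), y = c("a", "b"))

densByGroup <- function(data, var, group) {
  densityplot(~ data[[var]], groups = data[[group]],
              xlab = var, auto.key = TRUE)
}

densByGroup(d, "x", "y")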
2004 Mar 19
2
using "unstack" inside my function: that old scope problem again
I've been reading the R mail archives and I've found a lot of messages with this same kind of problem, but I can't understand the answers. Can one of you try to explain this to me? Here's my example. Given a regression model and a variable, I want to use unstack() on the vector of residuals and make some magic with the result. But unstack hates me. PCSE <- function
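One way around the scoping problem (a sketch; the helper body is mine): build a small data frame holding exactly the columns unstack() needs, so the formula is resolved against that data frame rather than the caller's environment.

PCSE <- function(fit, groupvar) {
  d <- data.frame(res = residuals(fit), grp = groupvar)
  unstack(d, res ~ grp)    # one column of residuals per group level
}

## usage, assuming a model fitted to data containing a grouping factor `unit`
## m <- lm(y ~ x, data = mydata)
## wide <- PCSE(m, mydata$unit)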
2017 Jun 18
0
R_using non linear regression with constraints
> On Jun 18, 2017, at 6:24 AM, Manoranjan Muthusamy <ranjanmano167 at gmail.com> wrote: > > I am using nlsLM {minpack.lm} to find the values of parameters a and b of > function myfun which give the best fit for the data set, mydata. > > mydata=data.frame(x=c(0,5,9,13,17,20),y = c(0,11,20,29,38,45)) > > myfun=function(a,b,r,t){ > prd=a*b*(1-exp(-b*r*t)) >
2017 Jun 18
0
R_using non linear regression with constraints
I ran the following script. I satisfied the constraint by making a*b a single parameter, which isn't always possible. I also ran nlxb() from nlsr package, and this gives singular values of the Jacobian. In the unconstrained case, the svs are pretty awful, and I wouldn't trust the results as a model, though the minimum is probably OK. The constrained result has a much larger sum of squares.
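A sketch of that substitution with the data from the original post. I also fold b*r into a single rate k, since only their product is identifiable in the excerpted model; nlsLM() is from minpack.lm, and its lower/upper arguments supply the box constraint on the new parameter.

library(minpack.lm)

mydata <- data.frame(x = c(0, 5, 9, 13, 17, 20),
                     y = c(0, 11, 20, 29, 38, 45))

## original form: prd = a*b*(1 - exp(-b*r*t)); substituting cc = a*b (and
## treating t as the x column) turns the product constraint a*b <= 1000 into a
## simple bound on cc
fit <- nlsLM(y ~ cc * (1 - exp(-k * x)), data = mydata,
             start = list(cc = 500, k = 0.01),
             lower = c(cc = 0, k = 0),
             upper = c(cc = 1000, k = Inf))
summary(fit)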
2017 Jun 18
3
R_using non linear regression with constraints
https://cran.r-project.org/web/views/Optimization.html (CRAN's optimization task view -- as always, you should search before posting) In general, nonlinear optimization with nonlinear constraints is hard, and the strategy used here (multiplying by a*b < 1000) may not work -- it introduces a discontinuity into the objective function, so gradient-based methods may in particular be
2017 Jun 18
0
R_using non linear regression with constraints
I've seen a number of problems like this over the years. The fact that the singular values of the Jacobian have a ratio larger than the usual convergence tolerances can mean the codes stop well before the best fit. That is the "numerical analyst" view. David and Jeff have given geometric and statistical arguments. All views are useful, but it takes some time to sort them all out and
2017 Jun 18
3
R_using non linear regression with constraints
I am not as expert as John, but I thought it worth pointing out that the variable substitution technique gives up one set of constraints for another (b=0 in this case). I also find that plots help me see what is going on, so here is my reproducible example (note inclusion of library calls for completeness). Note that NONE of the optimizers mentioned so far appear to be finding the true best
2008 May 23
3
Percentages for categorical data by group
I can think of several ways to brute-force hard-code what I want, but I imagine there is a command or two that can be easily combined to do this: I have a data frame with about 23000 observations. The first variable is the group to which the observation belongs (about 500 different groups). The second variable is a response for each observation that is 1, 2, 3, 4 or 5. I want to be able to
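A compact way to get what is described, sketched with made-up data: table() counts responses within groups and prop.table() turns each row into percentages.

set.seed(1)
df <- data.frame(group    = sample(paste0("g", 1:5), 200, replace = TRUE),
                 response = sample(1:5, 200, replace = TRUE))

tab <- table(df$group, df$response)          # groups in rows, responses 1-5 in columns
round(100 * prop.table(tab, margin = 1), 1)  # row percentages within each group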
2012 Jan 01
1
How to pass in a list of variables as an argument to a function?
Hello, I have some code that currently works fine and I am endeavoring to convert the major pieces of it into functions. This involves taking "hard coded" names of variables that are used in various places and figuring out how to abstract them out into functions where the arguments (i.e. a list of variables) can be passed to the parent function and used within that function for various
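One common pattern (a sketch; the function names are mine): pass the variable names as a character vector and resolve them inside the function with [[ ]] or reformulate().

summarise_vars <- function(data, vars) {
  sapply(vars, function(v) mean(data[[v]], na.rm = TRUE))
}

fit_with_vars <- function(data, response, predictors) {
  lm(reformulate(predictors, response = response), data = data)
}

## usage with hypothetical column names
## summarise_vars(mydata, c("height", "weight"))
## fit_with_vars(mydata, "y", c("x1", "x2"))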
2012 Aug 25
2
Standard deviation from MANOVA??
Hi, I have problem getting the standard deviation from the manova output. I have used the manova function: myfit <- manova(cbind(y1, y2) ~ x1 + x2 + x3, data=mydata) . I tried to get the predicted values and their standard deviation by using: predict(myfit, type="response", se.fit=TRUE) But the problem is that I don't get the standard deviation values, I only
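As far as I can tell, predict() on a manova/"mlm" fit does not honour se.fit; one workaround, sketched with toy data, is to refit each response separately so lm's predict() can return the standard errors.

set.seed(1)
mydata <- data.frame(y1 = rnorm(30), y2 = rnorm(30),
                     x1 = rnorm(30), x2 = rnorm(30), x3 = rnorm(30))

fit1 <- lm(y1 ~ x1 + x2 + x3, data = mydata)
fit2 <- lm(y2 ~ x1 + x2 + x3, data = mydata)

p1 <- predict(fit1, se.fit = TRUE)   # $fit and $se.fit for y1
p2 <- predict(fit2, se.fit = TRUE)   # $fit and $se.fit for y2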
2008 Feb 24
1
what missed ----- CART
Hi all, Can anyone who is familiar with CART tell me what I missed in my tree code? library (MASS) myfit <- tree (y ~ x1 + x2 + x3 + x4 ) # tree.screens () # useless plot(myfit); text (myfit, all= TRUE, cex=0.5, pretty=0) # tile.tree (myfit, fgl$type) # useless # close.screen (all= TRUE) # useless My current tree plot resulting from the above code shows as:
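A guess at the missing pieces (hedged): tree() lives in the tree package, not MASS, and the fit needs a data argument so the formula can find its variables. A self-contained sketch with the fgl data that the commented lines mention:

library(MASS)   # for the fgl data
library(tree)   # tree(), plot.tree(), text.tree()

myfit <- tree(type ~ ., data = fgl)
plot(myfit)
text(myfit, all = TRUE, cex = 0.5, pretty = 0)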
2011 Jan 25
1
Predictions with 'missing' variables
Dear List, I think I'm going crazy here...can anyone explain why I get the same predictions in train and test data sets below when the second has a missing input? y <- rnorm(1000) x1 <- rnorm(1000) x2 <- rnorm(1000) train <- data.frame(y,x1,x2) test <- data.frame(x1) myfit <- glm(y ~ x1 + x2, data=train) summary(myfit) all(predict(myfit, test) == predict(myfit, train))
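What seems to be happening (a sketch to make it visible): when a variable is missing from newdata, model.frame() falls back to the environment of the model formula, i.e. the workspace where the original x2 still exists, so both calls see identical x1 and x2 values. Changing the workspace copy of x2 makes the predictions diverge.

set.seed(1)
y  <- rnorm(1000); x1 <- rnorm(1000); x2 <- rnorm(1000)
train <- data.frame(y, x1, x2)
test  <- data.frame(x1)

myfit <- glm(y ~ x1 + x2, data = train)
all(predict(myfit, test) == predict(myfit, train))   # TRUE, as reported

x2 <- rnorm(1000)                                    # perturb the workspace copy of x2
all(predict(myfit, test) == predict(myfit, train))   # now FALSE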
2006 Mar 05
1
duration analysis
Hi, I am trying to estimate the effects of covariates on the hazard function, rather than on the survival. I know this is actually the same thing. For example, using the survival package, and doing: > myfit <- survreg( Surv(time, event) ~ mymodel ) all I have to do to get the quantities of interest is > -myfit$coefficients/myfit$scale The standard errors are easily worked out, as
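A sketch with the built-in lung data (object names follow the post): for a Weibull fit, -coef/scale gives the coefficients on the proportional-hazards scale, and dividing the AFT standard errors by the scale gives a rough standard error that ignores the uncertainty in the scale estimate, so treat it as approximate.

library(survival)

myfit <- survreg(Surv(time, status) ~ age + sex, data = lung, dist = "weibull")

haz_coef <- -coef(myfit) / myfit$scale
se_rough <- sqrt(diag(vcov(myfit)))[names(coef(myfit))] / myfit$scale

cbind(haz_coef, se_rough)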
2007 Jan 21
1
for loop problem
Hello R users, A beginner's question which I could not find the answer to in earlier posts. My thought process: Here "z" is a 119 x 15 data matrix. Step 1: start at column one, bind every column with column 1 Step 2: use the new matrix, "test", in the fitCopula package Step 3: store each result in myfit, bind each result to "answer" Step 4: return "answer"
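A sketch of the loop structure only (the fitting step is left as a placeholder, since the excerpt does not show the fitCopula() call): collect the results in a list indexed by column rather than binding them onto "answer" as you go.

fit_all <- function(z) {
  answer <- vector("list", ncol(z) - 1)
  for (j in 2:ncol(z)) {
    test <- z[, c(1, j)]        # column 1 paired with column j
    answer[[j - 1]] <- test     # replace this line with the fitCopula() call on `test`
  }
  names(answer) <- colnames(z)[-1]
  answer
}

## usage on a 119 x 15 matrix like the one described
## myfit <- fit_all(z)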
2002 Jul 03
2
lda from MASS function
Hi all, I am using the lda function from the MASS library to measure the discriminance of different variables with respect to different grouping variables by using lda( RESULTVARS[, 1:750] , GROUPVAR , tol=0 ) where RESULTVARS contains some 750 different variables. Occasionally there is a variable within RESULTVARS that has the same values for all values of GROUPVAR, i.e. no variance, so
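One simple guard, sketched with toy data (the object names follow the post): drop constant columns before calling lda(), so a zero-variance variable cannot derail the fit.

library(MASS)

set.seed(1)
RESULTVARS <- data.frame(v1 = rnorm(60), v2 = rnorm(60), v3 = rep(1, 60))  # v3 is constant
GROUPVAR   <- gl(3, 20)

ok  <- vapply(RESULTVARS, function(col) length(unique(col)) > 1, logical(1))
fit <- lda(RESULTVARS[, ok], GROUPVAR)
fit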
2010 Sep 19
2
working with eval and environments
I'm trying to get the following section of code to work. I think the problem is being caused by the assignment of data to the lm function not evaluating to "train" in the parent environment, but I can't seem to figure out how to do this. fitmodel <- function(trial,data) { wrap.lm <- function(formula,data,...) { cat("in wrap lm",NROW(data),"\n");
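One way around this kind of trouble (a sketch; only the function names come from the excerpt, the bodies are mine): build the lm() call with do.call(), so the fitted object stores the actual data frame rather than the symbol `data`, which predict() or update() would otherwise try to look up in the wrong environment.

fitmodel <- function(train, test) {
  wrap.lm <- function(formula, data, ...) {
    cat("in wrap lm", NROW(data), "\n")
    do.call("lm", list(formula = formula, data = data, ...))
  }
  fit <- wrap.lm(y ~ x, data = train)
  predict(fit, newdata = test)
}

## usage with toy data
d <- data.frame(x = rnorm(50)); d$y <- 2 * d$x + rnorm(50)
fitmodel(train = d[1:40, ], test = d[41:50, ])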