similar to: Problem with anova.lmRob() "robust" package

Displaying 20 results from an estimated 200 matches similar to: "Problem with anova.lmRob() "robust" package"

2011 Jul 28
0
R: Re: Problem with anova.lmRob() "robust" package
I'm sorry, maybe the question was badly posed. Ista has described my problem well. Thanks Massimo >----Original message---- >From: izahn at psych.rochester.edu >Date: 28/07/2011 17.52 >To: "David Winsemius"<dwinsemius at comcast.net> >Cc: "m.fenati at libero.it"<m.fenati at libero.it>, <r-help at r-project.org> >Subject: Re: [R]
2009 Mar 12
1
zooreg and lmrob problem (bug?)
Hi all, and thanks for your time in advance. I can't figure out why summary.lmrob complains when lmrob is used on a zooreg object. If the zooreg object is converted to a vector before calling lmrob, no problems appear. Let me clarify this with an example: >library(robustbase) >library(zoo) >dad<-c(801.4625,527.2062,545.2250,608.2313,633.8875,575.9500,797.0500,706.4188,
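A common workaround is to strip the zoo/zooreg attributes with coredata() so that lmrob() and summary.lmrob() see plain numeric vectors. A minimal sketch, with made-up numbers standing in for the poster's series (the second series 'son' is hypothetical):

library(robustbase)
library(zoo)

# hypothetical regular series standing in for the poster's data
dad <- zooreg(c(801.4625, 527.2062, 545.2250, 608.2313, 633.8875, 575.9500), start = 2000, frequency = 1)
son <- zooreg(c(700.1, 650.3, 601.2, 640.8, 612.5, 655.0), start = 2000, frequency = 1)

# stripping the time-series attributes before fitting usually avoids
# the summary.lmrob complaint seen with zooreg input
fit <- lmrob(coredata(son) ~ coredata(dad))
summary(fit)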
2018 Mar 03
2
lmrob gives NA coefficients
Dear list members, I want to perform an MM-regression. This seems an easy task using the function lmrob(); however, this function gives me NA coefficients. My data generating process is as follows: rho <- 0.15 # low interdependency Sigma <- matrix(rho, d, d); diag(Sigma) <- 1 x.clean <- mvrnorm(n, rep(0,d), Sigma) beta <- c(1.0, 2.0, 3.0, 4.0) error <- rnorm(n = n,
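A self-contained reading of that data-generating process, for anyone who wants to try it: the values of n and d, the truncated rnorm() call, and the way y is built are assumptions filled in for illustration, not the poster's actual code.

library(MASS)        # mvrnorm
library(robustbase)  # lmrob

set.seed(1)
n <- 100; d <- 3                       # assumed; the post truncates before defining them
rho <- 0.15                            # low interdependency
Sigma <- matrix(rho, d, d); diag(Sigma) <- 1
x.clean <- mvrnorm(n, rep(0, d), Sigma)
beta <- c(1.0, 2.0, 3.0, 4.0)          # intercept plus d slopes
error <- rnorm(n = n, mean = 0, sd = 1)
y <- drop(beta[1] + x.clean %*% beta[-1] + error)

fit <- lmrob(y ~ x.clean)
coef(fit)                              # any NA coefficients would show up here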
2010 Dec 13
1
Wrong contrast matrix for nested factors in lm(), rlm(), and lmRob()
This message also reports wrong estimates produced by lmRob.fit.compute() for nested factors when using the correct contrast matrix. And in these respects, I have found that S-Plus behaves the same way as R. Using the three available contrast types (sum, treatment, helmert) with lm() or lm.fit(), but just contr.sum with rlm() and lmRob(), and small examples, I generated contrast matrices for
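For readers who want to see the coding in question, a small sketch (made-up data, not the original report) is to build the model matrix for a nested term directly and inspect its columns, then compare against what a robust fitter constructs:

# toy nested-factor design; the columns show which contrast coding is used
set.seed(1)
a <- factor(rep(c("A1", "A2"), each = 6))
b <- factor(rep(c("B1", "B2", "B3"), times = 4))
y <- rnorm(12)
X <- model.matrix(y ~ a / b, contrasts.arg = list(a = "contr.sum", b = "contr.sum"))
colnames(X)   # compare this coding with the one lmRob.fit.compute() receives
X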
2018 Mar 03
0
lmrob gives NA coefficients
> On Mar 3, 2018, at 3:04 PM, Christien Kerbert <christienkerbert at gmail.com> wrote: > > Dear list members, > > I want to perform an MM-regression. This seems an easy task using the > function lmrob(), however, this function provides me with NA coefficients. > My data generating process is as follows: > > rho <- 0.15 # low interdependency > Sigma <-
2018 Mar 04
2
lmrob gives NA coefficients
Thanks for your reply. I use mvrnorm from the *MASS* package and lmrob from the *robustbase* package. To further explain my data generating process, the idea is as follows. The explanatory variables are generated by a multivariate normal distribution where the covariance matrix of the variables is defined by Sigma in my code, with ones on the diagonal and rho = 0.15 off the diagonal. Then y
2018 Mar 04
0
lmrob gives NA coefficients
What is 'd'? What is 'n'? On Sun, Mar 4, 2018 at 12:14 PM, Christien Kerbert < christienkerbert at gmail.com> wrote: > Thanks for your reply. > > I use mvrnorm from the *MASS* package and lmrob from the *robustbase* > package. > > To further explain my data generating process, the idea is as follows. The > explanatory variables are generated by a
2009 Apr 08
1
predict "interval" for lmRob?
lm's "predict" function offers an "interval" parameter to choose between 'confidence' and 'prediction' bands. In the package "robust" and for "lmRob", there is also a "predict" but it lacks such a parameter, and the documented "type" parameter has only "response" offerred. Is there some way of obtaining
2018 Mar 04
1
lmrob gives NA coefficients
d is the number of observed variables (d = 3 in this example). n is the number of observations. 2018-03-04 11:30 GMT+01:00 Eric Berger <ericjberger at gmail.com>: > What is 'd'? What is 'n'? > > > On Sun, Mar 4, 2018 at 12:14 PM, Christien Kerbert < > christienkerbert at gmail.com> wrote: > >> Thanks for your reply. >> >> I use
2007 Nov 16
1
Question about lmRob
Hi, I am trying to fit an ANCOVA model using lmRob. The P-values of the variables in the model differ hugely between the summary() function and the anova() function (from >0.8 in the summary to <0.001 in the anova for the same variable). I understand that with an ANCOVA the order in which the variables are added to the model matters and that this influences the P-value, but can this make such
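The same pattern can be reproduced with plain lm() (a made-up illustration, not the original ANCOVA): anova() reports sequential (Type I) tests while summary() reports marginal t-tests, so with correlated predictors the two can disagree sharply and reordering the formula changes the anova() table:

set.seed(42)
x1 <- rnorm(50)
x2 <- x1 + rnorm(50, sd = 0.2)   # strongly correlated with x1
y  <- 1 + 2 * x2 + rnorm(50)
fit <- lm(y ~ x1 + x2)

summary(fit)$coefficients        # marginal tests, each term adjusted for the other
anova(fit)                       # sequential tests: x1 entered first soaks up the shared signal
anova(lm(y ~ x2 + x1))           # reordering the terms changes the picture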
2013 May 17
2
zigzag confidence interval in a plot
Dear All, When I plot the values and the linear regression line for one data set, it is fine. But for another one I see zigzags when I plot the confidence interval:
> cd
   Depth  CHAOsep12RNA
    9,94           804
   25,06   1476,833333
   40,04   1540,561404
   50,11   1575,166667
   52,46    349,222222
   54,92        1941,5
   57,29   1053,507042
   60,11        1535,1
   70,04   2244,963303
   79,97   1954,507042
  100,31   2679,140625
>
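A common cause of such a zigzag (a guess, sketched here with the depth data above, decimal commas converted to points) is drawing the interval with lines() in the data's original row order rather than sorted by the x variable:

dat <- data.frame(
  Depth = c(9.94, 25.06, 40.04, 50.11, 52.46, 54.92, 57.29, 60.11, 70.04, 79.97, 100.31),
  CHAOsep12RNA = c(804, 1476.833333, 1540.561404, 1575.166667, 349.222222, 1941.5,
                   1053.507042, 1535.1, 2244.963303, 1954.507042, 2679.140625)
)
fit <- lm(CHAOsep12RNA ~ Depth, data = dat)
ci  <- predict(fit, interval = "confidence")

o <- order(dat$Depth)                        # sort by x before drawing the band
plot(dat$Depth, dat$CHAOsep12RNA, xlab = "Depth", ylab = "CHAOsep12RNA")
abline(fit)
lines(dat$Depth[o], ci[o, "lwr"], lty = 2)
lines(dat$Depth[o], ci[o, "upr"], lty = 2)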
2011 Mar 16
0
cross validation? when rlm, lmrob or lmRob
Dear community, I have fitted a model using the commands above (rlm, lmrob or lmRob). I don't have new data to validate the models obtained. I was wondering whether something similar to CVlm exists for robust regression. In case there isn't, any suggestion for validation would be appreciated. Thanks,
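Lacking a ready-made CVlm analogue for robust fits, one option is plain k-fold cross-validation written by hand. A sketch with made-up data; lmrob from robustbase stands in for whichever robust fitter is used:

library(robustbase)
set.seed(1)
dat <- data.frame(x = rnorm(100))
dat$y <- 1 + 2 * dat$x + rnorm(100)

k <- 5
folds <- sample(rep(1:k, length.out = nrow(dat)))   # random fold assignment
cv_mse <- sapply(1:k, function(i) {
  fit  <- lmrob(y ~ x, data = dat[folds != i, ])
  pred <- predict(fit, newdata = dat[folds == i, ])
  mean((dat$y[folds == i] - pred)^2)                # held-out mean squared error
})
mean(cv_mse)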
2008 Jan 11
0
Behaviour of standard error estimates in lmRob and the like
I am looking at MM-estimates for some interlab comparison work. The usual situation in this particular context is a modest number of results from very expensive methods with abnormally well-characterised performance, so for once we have good "variance" estimates (which can differ substantially for good reason) from most labs. But there remains room for human error or unexpected chemistry
2013 Apr 03
0
Help with lmRob function
Hi, I am fairly new to R and have encountered an issue with the lmRob function that I have been unable to resolve. I am trying to run a robust regression using the lmRob function which runs successfully, but the results are rather strange. I'm not sure it's important, but my model has 3 dichotomous categorical variables and 2 continuous variables in it. When I look at a summary of my
2008 May 14
1
rlm and lmrob error messages
Hello all, I'm using R 2.7.0 (on Windows 2000) and I'm trying to run a robust regression on the following model structure: model = "Y ~ x1*x2 / (x3 + x4 + x5 + x6)" where x1 and x2 are both factors (either 1 or 0) and x3...x6 are numeric. The error I get when running rlm(as.formula(model), data=daymean) is: error in rlm.default(x, y, weights, method = method, wt.method =
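Failures inside rlm.default() with formulas like this often trace back to a rank-deficient design matrix from the interaction/nesting terms. A small check one might run first (made-up data carrying the post's variable names; 'daymean' here is simulated, not the original data):

set.seed(1)
daymean <- data.frame(
  x1 = factor(sample(0:1, 40, replace = TRUE)),
  x2 = factor(sample(0:1, 40, replace = TRUE)),
  x3 = rnorm(40), x4 = rnorm(40), x5 = rnorm(40), x6 = rnorm(40)
)
daymean$Y <- rnorm(40)

form <- Y ~ x1 * x2 / (x3 + x4 + x5 + x6)
X <- model.matrix(form, data = daymean)
c(rank = qr(X)$rank, columns = ncol(X))   # a gap here means aliased (collinear) columns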
2018 Mar 04
0
lmrob gives NA coefficients
Hard to help you if you don't provide a reproducible example. On Sun, Mar 4, 2018 at 1:05 PM, Christien Kerbert < christienkerbert at gmail.com> wrote: > d is the number of observed variables (d = 3 in this example). n is the > number of observations. > > 2018-03-04 11:30 GMT+01:00 Eric Berger <ericjberger at gmail.com>: > >> What is 'd'? What is
2013 May 08
1
How to calculate Highest Posterior Density (HPD) of coefficients in a simple regression (lm) in R?
Hi! I am trying to calculate HPD intervals for the coefficients of regression models fitted with lm or lmrob in R, pretty much in the same way that can be accomplished by combining the mcmcsamp and HPDinterval functions for multilevel models fitted with lmer. Can anyone point me in the right direction on which packages to use and how to implement this? Thanks for your time! R.
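One possible route (an assumption about approach, not the poster's eventual solution; data are made up) is to refit the same linear model in a Bayesian way with MCMCpack's MCMCregress() and take HPD intervals of the posterior draws with coda's HPDinterval():

library(MCMCpack)   # MCMCregress; loads coda for HPDinterval
set.seed(1)
d <- data.frame(x = rnorm(50))
d$y <- 1 + 2 * d$x + rnorm(50)

draws <- MCMCregress(y ~ x, data = d)   # posterior samples as an mcmc object
HPDinterval(draws, prob = 0.95)         # HPD intervals for intercept, slope, sigma2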
2013 Jan 21
1
lmomco package - Random number generation using Wakeby distribution
Dear R forum, From the given data, I have estimated the parameters of the Wakeby distribution using the lmomco package as
library(lmomco)
(amounts <- read.csv("input_S.csv")$amount)
# ___________________________________________________________
# Wakeby distribution - Parameter estimation
N   = length(amounts)
lmr = lmom.ub(amounts)
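For the random-number-generation part, a sketch of how the script might continue using lmomco's documented workflow; the poster's CSV is not available, so a simulated 'amounts' vector stands in below:

library(lmomco)
set.seed(1)
amounts <- rexp(200, rate = 1/100)   # stand-in for read.csv("input_S.csv")$amount

lmr  <- lmom.ub(amounts)             # unbiased sample L-moments
para <- parwak(lmr)                  # Wakeby parameters fitted from the L-moments
sims <- rlmomco(1000, para)          # random numbers drawn from the fitted Wakeby
summary(sims)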
2011 Mar 04
1
linear model - lm (Adjusted R-squared)?
Hi, Sorry for the naive question, but what exactly does the 'Adjusted R-squared' coefficient in the summary of a linear model adjust for? Sample code:
> x <- rnorm(15)
> y <- rnorm(15)
> lmr <- lm(y~x)
> summary(lmr)

Call:
lm(formula = y ~ x)

Residuals:
    Min      1Q  Median      3Q     Max
-1.7828 -0.7379 -0.4485  0.7563  2.1570

Coefficients:
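In short, it adjusts for the number of fitted predictors: adjusted R-squared = 1 - (1 - R^2) * (n - 1) / (n - p - 1) for n observations and p predictors, so adding a useless predictor can lower it even though raw R-squared never decreases. A quick check against R's own value (freshly simulated data, since the draws above are unknown):

set.seed(1)
x <- rnorm(15); y <- rnorm(15)
lmr <- lm(y ~ x)
s <- summary(lmr)

n <- length(y); p <- 1                          # one predictor besides the intercept
1 - (1 - s$r.squared) * (n - 1) / (n - p - 1)   # hand-computed adjustment
s$adj.r.squared                                 # should match the line above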
2018 Apr 06
1
Fast tau-estimator line does not appear on the plot
R-experts, I have fitted many different lines. The fast-tau estimator (yellow line) seems strange to me, because this yellow line is not at all in agreement with the other lines (reverse slope: the yellow line has a positive slope while the others have negative slopes). Is there something wrong in my R code? Is it because the Y variable is a single vector and should be a matrix? Here is the